int i = 12;
decltype(i) x4; // type is int
decltype((i)) x5 = i; // type is int&, so it must be initialized
We know from C++ fundamentals that i is already an lvalue, and that decltype(i) yields the type of the expression i, which is naturally int. Now, per the standard, (i) is also an lvalue, and decltype((i)) should likewise inspect the type of the expression (i), which is an lvalue and is the same as i, itself an lvalue. So why does the standard suddenly require the type T& here?
This is very confusing, because if the syntax were something like decltype((i)&), then I would agree that it means decltype of int&, and that would have been much clearer and more precise. I am a bit lost here as to why i and (i), both of which resolve to lvalues, result in the different types T and T&.
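A short illustrative sketch (my own example, not from the original post) of the practical consequence: because decltype((i)) is int&, the declared variable must be initialized and then aliases i, while decltype(i) produces an independent int.

#include <iostream>

int main() {
    int i = 12;

    decltype(i)   x4 = 0;  // int: an independent copy
    decltype((i)) x5 = i;  // int&: must be initialized, binds to i

    x5 = 99;                 // writes through the reference
    std::cout << i << '\n';  // prints 99: x5 and i are the same object
}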