This is a very simple question. I noticed that the following, when compiled in MSVS2012, produces the expected result of 0x3412 for val:
unsigned char test[] = { 0x12, 0x34, 0x56, 0x78 };
unsigned char* ch = test;
unsigned int val = *ch | (*(ch+1) << 8);
I would actually have expected the dereferenced char pointer *(ch+1) on the right to produce a char value of 0x34, which, shifted left by 8 bits, would produce 0x00 (so val would end up as 0x12 rather than 0x3412). It seems that by the time the shift is applied, the dereferenced value is already stored in a type wide enough to hold at least two bytes.
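For what it's worth, a quick sanity check along these lines (just a sketch; it uses decltype and static_assert to inspect the type of the shifted expression, so it needs the compiler's C++11 support) seems to confirm that the operand has already been widened to int by the time the shift happens:

#include <type_traits>

int main() {
    unsigned char test[] = { 0x12, 0x34, 0x56, 0x78 };
    unsigned char* ch = test;

    // decltype is an unevaluated context, so this only inspects the type of
    // the expression: the shifted operand comes out as a plain int, not an
    // unsigned char.
    static_assert(std::is_same<decltype(*(ch + 1) << 8), int>::value,
                  "shift operand has been promoted to int");

    unsigned int val = *ch | (*(ch + 1) << 8);   // 0x3412, not 0x12
    return 0;
}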
Is this specified somewhere in the C++ standard? How exactly does this implicit conversion happen?