I am surprised by C++'s behavior when applying bitwise NOT to an unsigned char. Take the binary value 01010101b, which is 0x55, or 85. Applying bitwise NOT to its eight-bit representation should yield 10101010b, which is 0xAA, or 170.
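(To make that expectation concrete, here is a small standalone sketch of my own; std::bitset is used only to print the bit patterns.)
#include <bitset>
#include <iostream>

int main() {
    unsigned char value    = 0x55u;  // 01010101b, 85
    unsigned char expected = 0xAAu;  // 10101010b, 170
    // Flipping all eight bits of 0x55 should give 0xAA.
    std::cout << std::bitset<8>(value) << " -> " << std::bitset<8>(expected) << std::endl;
    // prints: 01010101 -> 10101010
}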
However, I cannot reproduce the above in C++. The following simple assertion fails.
assert(static_cast<unsigned char>(0xAAu) == ~static_cast<unsigned char>(0x55u));
I printed the values of 0x55, 0xAA, and ~0x55 (as unsigned char) with the following code, and it reveals that the bitwise NOT does not do what I expect it to do.
std::cout << "--> 0x55: " << 0x55u << ", 0xAA: " << 0xAAu << ", ~0x55: "
<< static_cast<unsigned>(~static_cast<unsigned char>(0x55u)) << std::endl;
--> 0x55: 85, 0xAA: 170, ~0x55: 4294967210
The number printed for ~0x55 equals 11111111111111111111111110101010b, which is the 32-bit bitwise NOT of 0x55. So the ~ operator is operating on 32-bit integers even though I explicitly cast the input to an unsigned char. Why is that?
I applied another test to see what type the ~ operator returns, and it turns out to be int for an unsigned char input:
template <class T>
struct Print;
// inside main()
Print<decltype(~static_cast<unsigned char>(0x55u))> dummy;
This yields the following compiler error, which indicates that the result is of type int.
error: implicit instantiation of undefined template 'Print<int>'
Print<decltype(~static_cast<unsigned char>(0x55u))> dummy;
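For reference, the same result can be confirmed without relying on the deliberate compile error, using a static_assert (this assumes C++17 for std::is_same_v, and a platform where unsigned char promotes to int, which the error above already suggests):
#include <type_traits>

// The operand is promoted before ~ is applied, so the result type is int.
static_assert(std::is_same_v<decltype(~static_cast<unsigned char>(0x55u)), int>,
              "~ on unsigned char yields int");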
What am I doing wrong? Or, how do I get C++ to produce 0xAA from ~0x55?
Full code is here
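For what it's worth, casting or masking the result back down to eight bits does seem to restore 0xAA, but I do not know whether that is the intended approach:
// Casting the promoted result back to unsigned char truncates it to 0xAA ...
assert(static_cast<unsigned char>(0xAAu) == static_cast<unsigned char>(~static_cast<unsigned char>(0x55u)));
// ... as does masking off everything above the low eight bits.
assert(0xAAu == (~static_cast<unsigned char>(0x55u) & 0xFFu));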