Curious about whether type size is a factor when removing bits using the one's complement operator ~, I ended up with this surprising result:
uint64_t x = 0xccccccccccccccccUL;
uint8_t y = 10;
auto z = ~y; // Debugger calls z an 'int' with value of 0xfffffff5
x &= z;
// x is now the correct value: 0xccccccccccccccc4
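
For reference, here is a complete version of that snippet (a minimal sketch, assuming a typical platform where int is 32 bits; the printf casts and format strings are only there for display):

#include <cstdint>
#include <cstdio>

int main() {
    uint64_t x = 0xccccccccccccccccUL;
    uint8_t y = 10;
    auto z = ~y;                                   // debugger shows z as 'int' with value 0xfffffff5
    std::printf("sizeof(z) = %zu\n", sizeof z);    // expected to be 4 where int is 32 bits
    std::printf("z = %x\n", static_cast<unsigned>(z));
    x &= z;
    std::printf("x = %016llx\n", static_cast<unsigned long long>(x));  // ccccccccccccccc4
}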
How can operator & return a value greater than either of its operands?
The following example confuses me further, because it seems like the same logic but yields a different result:
uint64_t x = 0xccccccccccccccccUL;
x &= 0xfffffff5;
// Now x is 0x00000000ccccccc4
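
And the second case, made complete in the same way (again just a sketch; the literal 0xfffffff5 is the same bit pattern the debugger showed for z above):

#include <cstdint>
#include <cstdio>

int main() {
    uint64_t x = 0xccccccccccccccccUL;
    x &= 0xfffffff5;    // same bit pattern as z, but written as a hex literal
    std::printf("x = %016llx\n", static_cast<unsigned long long>(x));  // 00000000ccccccc4
}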
What explains this? Is it actually safe to use x &= ~y to remove bits regardless of type size / signedness?