In binary notation with two's complement, a looks like this:
111..1110110
Now a right shift is performed. For signed integers the C standard leaves the value of the filled bits implementation-defined; GCC promises "sane behavior", that is, an arithmetic right shift, where the sign bit (here 1) is copied into the vacated positions.
An arithmetic right shift by n bits divides the integer by 2^n, rounding toward negative infinity - no matter the sign. (Note that this differs from C's division operator, which rounds toward zero.)
Therefore the shift produces the new value:
111..1111011
which is -5. Flipping all bits and adding one yields 000...00101, which is 5.
A logical shift would have produced
011..1111011
which has the value 2147483643 for a 32-bit integer. Note how the result even depends on the width of the integer type you performed the operation on.