In JavaScript, if I do the bitwise NOT operation on the decimal integer 10:
~10
I expect it to compute bitwise NOT on the binary integer as follows:
~1010 = 0101
In other words, I expected the decimal integer 5. Instead, the operation gives me -11 (try it in your console):
~10 = -11
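To see what bits ~ is actually producing, I also dumped both values as unsigned 32-bit integers (the >>> 0 here is just my way of reinterpreting the result as unsigned so toString(2) shows the raw bit pattern):

console.log((10 >>> 0).toString(2));   // "1010"
console.log((~10 >>> 0).toString(2));  // "11111111111111111111111111110101"
console.log(~10);                      // -11

So the operator is clearly flipping far more bits than the four I had in mind.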
If I check this more explicitly by converting both ~10 and -11 to binary strings:
parseInt(~10,10).toString(2)
"-1011"
parseInt(-11,10).toString(2)
"-1011"
Consistent, but I don't understand why. Can anyone explain? I'm guessing it has something to do with the sign.
EDIT: I found this question after posting; it also helped me understand this phenomenon much better.
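For anyone who lands here later, this is the minimal sketch that made it click for me (it assumes the 32-bit two's-complement behaviour of JavaScript's bitwise operators):

// ~ flips all 32 bits of the signed integer, so ~x === -(x + 1)
console.log(~10 === -(10 + 1));  // true
console.log(~0, ~-1, ~255);      // -1 0 -256

// Masking to the low 4 bits recovers the "0101" (= 5) I originally expected
console.log(~10 & 0b1111);       // 5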