By the rules of mixed signed/unsigned comparisons, `a == b` is equivalent to `a == (unsigned) b`, i.e. the comparison is performed in the domain of the unsigned type.
The result of `~0` is an all-ones bit pattern. In a signed integer type this pattern represents `-1` on a 2's-complement platform. This means that you initialized your `b` with `-1` (as confirmed by your `printf`).
So, your comparison is effectively `(unsigned) -1 == (unsigned) -1`. No wonder it holds true.
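
A minimal sketch of the situation (assuming `a` is `unsigned int` and `b` is `int`, both initialized with `~0`, as in the question):

```c
#include <stdio.h>

int main(void)
{
    unsigned int a = ~0u;   /* all-ones bit pattern, i.e. UINT_MAX */
    int b = ~0;             /* all-ones pattern: -1 on a 2's-complement machine */

    /* Mixed comparison: b is converted to unsigned int, so this is
       effectively (unsigned)-1 == (unsigned)-1 */
    if (a == b)
        printf("equal\n");      /* printed on a 2's-complement platform */
    else
        printf("not equal\n");

    printf("a = %u, b = %d\n", a, b);
    return 0;
}
```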
But keep in mind that the equality is still implementation-dependent, since it relies on the properties of 2's-complement representation. As long as the C language officially supports alternative signed integer representations (sign-and-magnitude, 1's-complement), the result of this comparison will depend on which representation the platform uses.
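
For contrast, a small sketch of the part that is *not* representation-dependent: conversion of `-1` to an unsigned type is defined in terms of modular arithmetic, so it always yields the maximum value of that type. Only the step of producing `-1` via `~0` on a signed operand depends on 2's-complement.

```c
#include <limits.h>
#include <stdio.h>

int main(void)
{
    unsigned int a = ~0u;            /* unsigned operand: value is UINT_MAX */
    unsigned int c = (unsigned)-1;   /* guaranteed UINT_MAX on any representation */

    printf("%d %d\n", a == UINT_MAX, c == UINT_MAX);   /* prints "1 1" */
    return 0;
}
```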