I'm a C beginner, and I'm confused by the following example from The C Answer Book.
One way to find the size of `unsigned long long` on your system is to type:

```c
printf("%llu", (unsigned long long) ~0);
```
I have no idea why this works.
On my system, `int` is 32 bits and `long long` is 64 bits.
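(For reference, here is roughly how I checked those sizes; just a minimal sketch:)

```c
#include <stdio.h>

int main(void)
{
    /* sizeof reports sizes in bytes: 4 and 8 on my system */
    printf("sizeof(int)       = %zu\n", sizeof(int));
    printf("sizeof(long long) = %zu\n", sizeof(long long));
    return 0;
}
```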
What I expected was that, since `0` is a constant of type `int`, `~0` computes the bitwise complement of a 32-bit integer, which is then converted to an `unsigned long long` by the cast operator. This should give 2^32 - 1 as a result.
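To spell out the reading I had in mind, step by step (the variable names are mine, purely for illustration):

```c
#include <stdio.h>

int main(void)
{
    int x = ~0;  /* bitwise complement of a 32-bit int: all bits set, i.e. -1 */

    /* I assumed the cast happens afterwards, keeping the 32-bit pattern,
       so I expected the value 4294967295 (2^32 - 1) here */
    unsigned long long y = (unsigned long long) x;

    printf("%llu\n", y);
    return 0;
}
```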
Somehow, it looks like the `~` operator already knows that it should act on 64 bits?
Does the compiler interpret this instruction as `printf("%llu", ~(unsigned long long) 0);`? That doesn't sound right, since the cast and `~` have the same precedence.
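To make the question concrete, here are the two readings I am weighing, side by side (just a small test, with the parenthesization spelled out):

```c
#include <stdio.h>

int main(void)
{
    /* Reading 1: my parse -- complement a plain int first, then cast the result */
    printf("%llu\n", (unsigned long long) ~0);

    /* Reading 2: the parse I'm asking about -- cast 0 first, then complement
       a 64-bit unsigned long long */
    printf("%llu\n", ~(unsigned long long) 0);

    return 0;
}
```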