printf("%d\n", i);
invokes UB. i
is unsigned int
and you try to print it as signed int
. Writing 1 << 31
instead of 1U << 31
is undefined too.
Print it as:
printf("%u\n", i);
or
printf("%X\n", i);
About your updated question: it also invokes UB for the very same reasons. (If you use `1U` instead of `1`, the remaining issue is that an `int` is then initialized with `1U << 31`, which is out of range for a 32-bit `int`. When an unsigned type is initialized with an out-of-range value, modular arithmetic comes into play and the remainder is assigned; for a signed type, the result of such an out-of-range conversion is not well-defined and should not be relied on.)
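To illustrate the modular-arithmetic rule, here is a small sketch (the constant 4294967301 is my own illustrative choice, assuming a 32-bit `unsigned int`):

```c
#include <stdio.h>

int main(void)
{
    /* 4294967301 (2^32 + 5) is out of range for a 32-bit unsigned int.
       The initializer is reduced modulo 2^32, so u becomes 5 -- this is
       the "modular arithmetic" / remainder rule for unsigned types.     */
    unsigned int u = 4294967301ULL;

    printf("%u\n", u);   /* prints 5 on a platform with 32-bit unsigned int */

    /* By contrast, initializing a signed int with an out-of-range value
       (e.g. int s = 1U << 31;) has no such guaranteed wrap-around.       */
    return 0;
}
```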
Understanding the behavior on your platform
On your platform, `int` appears to be 4 bytes. When you write something like `1 << 31`, it produces the bit pattern `0x80000000` on your machine.
When you print that pattern as signed, you get its signed interpretation, which is -2^31 (a.k.a. `INT_MIN`) on a two's-complement system. When you print it as unsigned, you get the expected 2^31 as output.
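Here is a minimal sketch (assuming a 4-byte, two's-complement `int`, as on your platform) that shows both interpretations of that bit pattern without relying on a mismatched format specifier:

```c
#include <stdio.h>
#include <string.h>

int main(void)
{
    unsigned int bits = 0x80000000U;   /* the bit pattern produced by 1U << 31 */
    int as_signed;

    /* Reinterpret the same 4 bytes as a signed int without invoking UB. */
    memcpy(&as_signed, &bits, sizeof as_signed);

    printf("unsigned interpretation: %u\n", bits);        /*  2147483648 (2^31)    */
    printf("signed   interpretation: %d\n", as_signed);   /* -2147483648 (INT_MIN) */
    return 0;
}
```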
Learnings
1. Use `1U << 31` instead of `1 << 31`.
2. Always use the correct format specifiers in `printf`.
3. Pass the correct argument types to variadic functions.
4. Be careful when an implicit conversion (unsigned -> signed, wider type -> narrower type) takes place; if possible, avoid such conversions completely (see the sketch after this list).
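To make point 4 concrete, here is a small sketch (the variable names and values are mine, purely for illustration) of two implicit conversions that silently change a value:

```c
#include <stdio.h>

int main(void)
{
    /* unsigned -> signed comparison: -1 is converted to a huge unsigned
       value, so the "surprising" branch is the one that runs.           */
    unsigned int u = 10U;
    int          s = -1;
    if (s < u)
        printf("expected: -1 < 10\n");
    else
        printf("surprise: -1 is NOT less than 10 here\n");   /* this one prints */

    /* wider -> narrower: the long long value is silently truncated when
       stored in a 32-bit int (implementation-defined result).           */
    long long big    = 4294967296LL;   /* 2^32 */
    int       narrow = big;            /* likely becomes 0 with a 32-bit int */
    printf("big = %lld, narrow = %d\n", big, narrow);

    return 0;
}
```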