I am compiling this code with GNU's C++ compiler and cannot understand its behaviour:
#include <stdio.h>

int main()
{
    int num1 = 1000000000;
    long num2 = 1000000000;
    long long num3;              // left uninitialized
    //num3 = 100000000000;
    long long num4 = ~0;         // all bits set
    printf("%u %u %u", sizeof(num1), sizeof(num2), sizeof(num3));
    printf("%d %ld %lld %llu", num1, num2, num3, num4);
    return 0;
}
When I uncomment the commented line, the code does not compile and gives this error:
error: integer constant is too large for long type
But if the code is compiled as-is and executed, it prints values much larger than 100000000000.
Why?
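For comparison, the variant below is a sketch of what I believe should compile cleanly, with the constant carrying an explicit LL suffix so it is treated as a long long constant (I am assuming a typical GCC target where long is 4 bytes and long long is 8 bytes; the casts to unsigned are only there so %u matches the argument type):

#include <stdio.h>

int main()
{
    /* Assumption: long is 4 bytes and long long is 8 bytes on this platform. */
    printf("sizeof(long) = %u, sizeof(long long) = %u\n",
           (unsigned)sizeof(long), (unsigned)sizeof(long long));

    /* The LL suffix makes the constant a long long constant, so the value
       fits and this line compiles. */
    long long num3 = 100000000000LL;
    printf("num3 = %lld\n", num3);

    return 0;
}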