I have read that the signed int range is [−32767, +32767], but I can write, for example:
#include <stdio.h>

int main(void)
{
    int a = 70000;
    int b = 71000;
    int c = a + b;
    printf("%i\n", c);
    return 0;
}
and the output is 141000 (correct). Shouldn't the compiler or the debugger tell me "this operation is out of range" or something similar?
I suppose this has to do with me not knowing the basics of C programming, but none of the books I'm currently reading say anything about this "issue".
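For reference, the actual bounds of int on a given machine can be printed with <limits.h>; a minimal sketch:

#include <stdio.h>
#include <limits.h>

int main(void)
{
    /* INT_MIN and INT_MAX are the real limits of int on this platform */
    printf("int range: [%d, %d]\n", INT_MIN, INT_MAX);
    return 0;
}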
EDIT: 2147483647 seems to be the upper limit, thank you. If a sum exceeds that number, the result is negative, which is expected, BUT if it is a subtraction, for example 2147483649 − 2147483647 = 2, the result is still correct. I mean, why is the value 2147483649 correctly held for that subtraction (or at least it seems to be)?
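Here is a self-contained sketch of what I mean (the exact sizes printed presumably depend on the platform; I'm assuming a 32-bit int):

#include <stdio.h>

int main(void)
{
    int d = 2147483649 - 2147483647; /* subtraction between two constants */
    printf("%i\n", d);               /* prints 2 here */

    /* the larger constant apparently isn't an int at all: */
    printf("%zu vs %zu\n", sizeof 2147483649, sizeof(int));
    return 0;
}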