A C programming book that I'm reading (C Programming: A Modern Approach, 2nd edition) says that when an "overflow occurs during an operation on unsigned integers, though, the result is defined."
Here is a small code example:
#include <stdio.h>

int main(void)
{
    unsigned short int x = 65535; /* x holds the maximum value of an unsigned short */
    x += 1;                       /* overflow: the value wraps around to 0 (arithmetic modulo 65536) */
    printf("%u\n", x);            /* prints 0; adding 1 again would print 1 */
    return 0;
}
He then goes on to say that "for signed integers, the behaviors for these integers are not defined", meaning the program might print an incorrect result or even crash.
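For contrast, here is a minimal sketch of what I understand the signed case to look like. I'm using int and INT_MAX from <limits.h> rather than short, since assigning an out-of-range value back to a short is a separate, implementation-defined conversion rather than overflow:

#include <limits.h>
#include <stdio.h>

int main(void)
{
    int x = INT_MAX;   /* x holds the maximum value of a signed int */
    x += 1;            /* signed overflow: the C standard leaves this undefined */
    printf("%d\n", x); /* often prints INT_MIN in practice, but nothing is guaranteed */
    return 0;
}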
Why is this so?