When I multiply two unsigned chars in C like this:
unsigned char a = 200;
unsigned char b = 200;
unsigned char c = a * b;
Then I know the result will not fit into an unsigned char, and I get 40'000 modulo 256 (= 64) as a result. When I do this:
unsigned char a = 200;
unsigned char b = 200;
unsigned int c = (int)a * (int)b;
I will get the correct result 40'000. However, I do not know what happens with this:
unsigned char a = 200;
unsigned char b = 200;
unsigned int c = a * b;
Can I be sure the right thing happens, or is this compiler-dependent? Similarly, I do not know what happens with a subtraction:
unsigned char a = 1;
unsigned char b = 2;
int c = a - b;
If I made "c" an unsigned char, I would probably get 255 as a result. But what happens when I use an int, as above?