Consider the following C code:
#include <stdio.h>

int main() {
    signed char result = 129;
    printf("%d", result);
    return 0;
}
Assuming char is 8 bits, 129 does not fit in a signed char, so the value wraps around (129 - 256 = -127) and printf prints -127. I'm clear with this part.
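If it helps, here is a minimal sketch of how I picture that wrap, assuming an 8-bit two's-complement char (the variable bits is just for illustration):

#include <stdio.h>

int main() {
    /* 129 has bit pattern 0x81; read as an 8-bit two's-complement
       value, that pattern means 0x81 - 0x100 = -127 */
    unsigned char bits = 129;
    printf("%d -> %d\n", bits, (signed char)bits);  /* prints: 129 -> -127 */
    return 0;
}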
But when I try the following:
#include <stdio.h>

int main() {
    signed char a = 100, b = 3;
    signed char c = 4;
    signed char result = a * b / c;
    printf("%d", result);
    return 0;
}
Then, as per the conversion rules as I understand them, since a and b are signed char, a*b should also be a signed char. Hence 300 should overflow, so only the least significant 8 bits should be kept, which makes it 44, and dividing that by 4 should print 11, not 75. I'm confused here.
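To check my reasoning, I forced the truncation I expected by storing a*b into a signed char before dividing. This is just a sketch of my mental model (the intermediate variable product is mine, and I'm assuming two's-complement wrap on the narrowing store):

#include <stdio.h>

int main() {
    signed char a = 100, b = 3, c = 4;
    signed char product = a * b;     /* I expect 300 to wrap to 300 - 256 = 44 */
    signed char result = product / c;
    printf("%d", result);            /* prints 11, the answer I expected */
    return 0;
}

This version really does print 11, so why does the one-expression version print 75?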