We know that a signed char can hold values only from -128 to 127. But when we run the program below, no overflow seems to happen, even though the value stored in l exceeds the range of a signed char.
#include <stdio.h>

int main()
{
    char i = 60;
    char j = 30;
    char k = 10;
    char l = (i * j) / k;
    printf("%d ", l);
    return 0;
}
The output for l is 180, which is out of range for char l, but I am not getting any error.
In another scenario, I take the same program but, instead of the arithmetic expression, simply write l = 180 and try to print it. Then I get a different answer.
#include <stdio.h>

int main()
{
    char i = 60;
    char j = 30;
    char k = 10;
    char l = 180;
    printf("%d ", l);
    return 0;
}
The answer I get in the 2nd case is -76. Can anyone explain why? Even though I am effectively executing the same thing, I am getting different results.
EDIT:
This is the classic case where intermediate computations are carried out in int, not char. So in the 1st program the arithmetic is done entirely in int, while in the 2nd program I am explicitly assigning the constant 180 to a char.