When I write the line printf("%f\n", 5 / 2); (lines 18 and 21 of my source file, it appears twice in the code below) I don't get 2.500000 but 0.000000 and 65.000000, and I don't understand why.
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    int a = 65;
    char c = (char)a;
    int m = 3.0 / 2;

    printf("%f\n", 5 / 2);
    printf("%c\n", c);              // output: A
    printf("%f\n", (float)a);       // output: 65.000000
    printf("%f\n", 5 / 2);
    printf("%f\n", 5.0 / 2);        // output: 2.500000
    printf("%f\n", 5 / 2.0);        // output: 2.500000
    printf("%f\n", (float)5 / 2);   // output: 2.500000
    printf("%f\n", 5 / (float)2);   // output: 2.500000
    printf("%f\n", (float)(5 / 2)); // output: 2.000000 - we cast only after the division, and its result was 2
    printf("%f\n", 5.0 / 2);        // output: 2.500000
    printf("%d\n", m);              // output: 1

    system("PAUSE");
    return 0;
}
The output is:
0.000000
A
65.000000
65.000000
2.500000
2.500000
2.500000
2.500000
2.000000
2.500000
1
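
In case the rest of the program is a distraction, here is a minimal sketch that isolates the call I don't understand (the comments only restate what I expected and the output I reported above):

#include <stdio.h>

int main(void)
{
    printf("%f\n", 5 / 2);    // I expected 2.500000; in the full program above this
                              // prints 0.000000 the first time and 65.000000 the second
    printf("%f\n", 5.0 / 2);  // prints 2.500000, as I expected
    return 0;
}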