Below is my code to convert temperatures from the Fahrenheit scale to the Celsius scale:
#include <stdio.h>

#define CONVERT_TO_CELSIUS(far) (5.0/9.0)*(far-32.0)

int main()
{
    int f;
    float c;
    int l = 0, u = 200, s = 20;

    f = l;
    while (f <= u)
    {
        c = CONVERT_TO_CELSIUS(f);
        printf("%3.0f\t%6.1f\n", f, c);   /* f is an int, but it is printed here with %f */
        f = f + s;
    }
    return 1;
}
Output seen:
-18 32.00
-7 32.00
4 32.00
16 32.00
27 32.00
38 32.00
49 32.00
60 32.00
71 32.00
82 32.00
93 32.00
Output expected:
0 -17.78
20 -6.67
40 4.44
60 15.56
80 26.67
100 37.78
120 48.89
140 60.00
160 71.11
180 82.22
200 93.33
I am seeing that when I print the variable f with the %f format specifier the output is wrong, but when I keep it as %d the output is as expected. How does the code work here?
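For reference, here is a minimal standalone sketch of what I mean by keeping %d (the names f and c mirror my program above, and the single value 100 is just an example); this is the variant that prints the values I expect:

#include <stdio.h>

int main(void)
{
    /* same conversion as above, just for a single value */
    int f = 100;
    float c = (5.0/9.0)*(f-32.0);

    /* %d matches the int f, %6.1f matches the float c */
    printf("%3d\t%6.1f\n", f, c);

    return 0;
}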