I am trying to perform division using floats in C. To demonstrate what I mean, you can run the following code:
```c
#include <stdio.h>

#define TEST 20.0 / 65536.0

int main(void) {
    float a = 11.147;
    printf("%f\n", a);

    float b = 20.0 / 65536.0;    /* divisor written out directly */
    printf("%f\n", b);

    float c = a / b;
    printf("%f\n", c);

    int d = (int) c;
    printf("%d\n", d);

    float e = a / (float) TEST;  /* same formula, but divisor from the macro */
    printf("%f\n", e);

    printf("%f\n", TEST);
    return 0;
}
```
The code above gave the following results (I have marked the two values in question):

```
11.147000
0.000305
36526.492188   <- c
36526
0.000009       <- e
0.000305
```
The two marked values should be the same, because they are the results of the same formula. The only difference is that for `e` I use `#define` to define the divisor, which gives an incorrect value. I am clueless as to why this happens. Can somebody please explain why I get these results? Thank you.
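In case it is relevant: my understanding is that `#define` performs plain textual substitution, so I assume the line using `TEST` is preprocessed into the expression in the sketch below. This is only my guess at what the compiler sees after substitution, not verified preprocessor output:

```c
#include <stdio.h>

int main(void) {
    float a = 11.147;
    /* My assumption: TEST is replaced by the raw tokens 20.0 / 65536.0,
     * so a / (float) TEST becomes the expression below. */
    float e = a / (float) 20.0 / 65536.0;
    printf("%f\n", e);  /* prints 0.000009, matching my output */
    return 0;
}
```

Running this sketch does reproduce the 0.000009 value, so the substitution seems to be involved, but I still do not see why it evaluates differently from dividing by `b`.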