I'm trying to divide an integer by a floating-point constant and store the result in a float, using this simple code:
float result = 0;
int randomInt = rand() % 401 + 100;  // random integer between 100 and 500
result = randomInt / 10.00;          // e.g. 323 / 10.00
printf("result is: %f\n", result);
The output of printf is sometimes off by 0.00001 or so:
32.999998   // 323 / 10.00
45.000001   // 450 / 10.00
29.099998   // 290 / 10.00
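To see more of what's actually stored, I printed the same quotient with extra digits as both a float and a double (a minimal sketch; the digits in the comments assume IEEE 754 single and double precision):

#include <stdio.h>

int main(void) {
    float  f = 323 / 10.00;  // quotient computed in double, then truncated to float
    double d = 323 / 10.00;  // same quotient kept in double precision

    printf("float : %.10f\n", f);  // 32.2999992371 with IEEE 754 floats
    printf("double: %.10f\n", d);  // 32.3000000000
    return 0;
}

The double shows the value I expect while the float does not, so the difference seems to appear at float precision.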
Why does this happen?