From what I understand, the famous
(0.1 + 0.2) !== 0.3
gotcha is not actually JavaScript's fault. That's just the way IEEE 754 arithmetic works. A similar result shows up in Python, which also follows the IEEE 754 rules.
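For reference, here is a minimal C sketch (assuming IEEE 754 binary64 doubles and a typical x86-64 toolchain; results could differ on an exotic platform) that prints the values actually stored for the decimal literals. None of them is exactly representable in binary, which is where the discrepancy comes from:

#include <stdio.h>

int main(void) {
    /* Print the nearest binary64 value to each decimal literal. */
    printf("0.1       -> %.17f\n", 0.1);        /* 0.10000000000000001 */
    printf("0.2       -> %.17f\n", 0.2);        /* 0.20000000000000001 */
    printf("0.3       -> %.17f\n", 0.3);        /* 0.29999999999999999 */
    printf("0.1 + 0.2 -> %.17f\n", 0.1 + 0.2);  /* 0.30000000000000004 */
    return 0;
}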
Then how come this particular example works as expected in C, at least sometimes? If I do a direct comparison
printf("%d\n", (0.1+0.2) == 0.3);
I get the (un?)expected output 0, but if I put the values into variables or print them out, I get properly rounded answers.
Is the C implementation of IEEE 754 doing something extra, or is it something else entirely that I am missing?
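To make the "sometimes" concrete, here is a small sketch (again assuming ordinary binary64 doubles on a typical x86-64 compiler) contrasting the direct comparison against the 0.3 literal with what printf shows at its default precision, which is where the "properly rounded" output comes from:

#include <stdio.h>

int main(void) {
    double sum = 0.1 + 0.2;

    /* Direct comparison against the 0.3 literal: typically prints 0. */
    printf("%d\n", (0.1 + 0.2) == 0.3);

    /* Default %f precision is 6 digits, so the value looks "properly rounded". */
    printf("%f\n", sum);      /* 0.300000 */

    /* Asking for more digits reveals the value that is actually stored. */
    printf("%.17f\n", sum);   /* 0.30000000000000004 */
    return 0;
}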
Update
The code sample I originally posted was broken due to a typo; try this one instead: Fixed C Runnable Example
But the original question still remains.
#include <stdio.h>

int main(void) {
    double d1, d2, d3;
    d1 = 0.1; d2 = 0.2; d3 = d1 + d2;
    printf("%d\n", ((((double)0.1) + ((double)0.2)) == ((double)d3)));
    printf("%.17f\n", d1 + d2);
    printf("%d\n", ((d1 + d2) == d3));
    return 0;
}
The output is
1
0.30000000000000004
1
The rephrased question now is:
Why (and when, and how) does the C compiler take the liberty of saying that
0.3 == 0.30000000000000004
Given all of these facts, isn't it the C implementation that is broken, rather than JavaScript's?