Possible Duplicate:
Is JavaScript's Math broken?
Strange result with floating point addition
Here are some simple JavaScript calculations, run in Safari 5.0.6 (the same thing happens in Firefox):
var i = 0.1;
i = i + 0.01; // = 0.11
i = i + 0.01; // = 0.12
i = i + 0.01; // = 0.13
i = i + 0.01; // = 0.14
i = i + 0.01; // expected == 0.15
console.log(i); // == 0.15000000000000002
Where does this imprecision come from?
Sure, I can work around it with i.toPrecision() or other methods, but does it have to be like that? Is this a floating-point rounding error?
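For example, rounding for display like this gives me the value I expect (just a sketch; the 12 significant digits passed to toPrecision are an arbitrary choice):
var i = 0.15000000000000002;            // the result from above
console.log(Number(i.toPrecision(12))); // 0.15
console.log(i.toFixed(2));              // "0.15" (as a string)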
The same thing happens in this example:
var i = 0.14 + 0.01; // expected == 0.15
console.log(i); // == 0.15000000000000002
What is happening between 0.14 and 0.15?
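If I print more digits (a quick check on my side; toFixed(20) is an arbitrary choice), it looks like neither 0.14 nor 0.01, nor even 0.15 itself, is stored exactly:
console.log((0.14).toFixed(20)); // 0.14000000000000001332
console.log((0.01).toFixed(20)); // 0.01000000000000000021
console.log((0.15).toFixed(20)); // 0.14999999999999999445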
var i = 0.1400001 + 0.01; // expected == 0.1500001
console.log(i); // == 0.1500001 ok!
and
var i = 0.14000001 + 0.01; // expected == 0.15000001 !!
console.log(i); // == 0.15000001000000002
What do I have to do differently to get the correct results?