I am trying to multiply a decimal value by 10 in JavaScript:
console.log(1000.56 * 10);
console.log(100.56 * 10);
It is printing:
10005.599999999999
1005.6
Why?
As explained in this question, JavaScript’s default behavior when converting a floating-point number to decimal (to a string, for display, printing, or other purposes) is to use just enough digits to uniquely identify the floating-point value.
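For illustration, toPrecision (a standard Number method) can be asked for more significant digits than the default conversion chooses, which exposes the stored values behind the short output. The trailing digits in the comments are what a typical IEEE 754 double produces and are shown only as an illustration:

// Default conversion: just enough digits to pick out the stored double.
console.log(1000.56 * 10);                   // 10005.599999999999
// Asking for 20 significant digits shows neither value is stored exactly.
console.log((1000.56).toPrecision(20));      // e.g. 1000.5599999999999454
console.log((1000.56 * 10).toPrecision(20)); // e.g. 10005.599999999998545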
In 1000.56 * 10, two rounding errors occur. The first occurs when 1000.56 is converted to floating point. The second occurs when the multiplication is performed. Two errors also occur in 100.56 * 10. However, the errors happen to partially cancel (they are in opposite directions, as an effectively random consequence of where representable values happen to lie), and the result is close enough to 1005.6 that JavaScript’s algorithm for formatting used “1005.6” for the result.
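The same inspection, sketched for the second case (again, the trailing digits are illustrative of a typical IEEE 754 double):

console.log((100.56).toPrecision(20));      // e.g. 100.56000000000000227
console.log((100.56 * 10).toPrecision(20)); // e.g. 1005.6000000000000227
// The product is close enough to 1005.6 that the shortest decimal
// uniquely identifying it is simply "1005.6".
console.log(100.56 * 10);                   // 1005.6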
In 1000.56 * 10, the result was not the representable floating-point number closest to 10005.6, so JavaScript used additional digits to distinguish it.
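One way to see this directly is to compare each product with the value its decimal literal denotes, since a literal always gives the representable number closest to what you wrote:

// 1000.56 * 10 yields a neighboring double, not the one closest to 10005.6,
// so extra digits are needed to tell the two apart.
console.log(1000.56 * 10 === 10005.6); // false
// 100.56 * 10 lands exactly on the double closest to 1005.6.
console.log(100.56 * 10 === 1005.6);   // true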