In the same way 1/3 cannot be exactly represented in decimal, 0.1 cannot be exactly represented in binary, and JavaScript numbers are binary floating point values. In JavaScript, 0.2 + 0.1 returns 0.30000000000000004. Try it in the browser console.
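This is easy to confirm in any JavaScript environment (a browser console or Node.js both work):

```javascript
// 0.1 and 0.2 are each rounded to the nearest binary value,
// so their sum is not exactly 0.3
console.log(0.2 + 0.1);         // 0.30000000000000004
console.log(0.2 + 0.1 === 0.3); // false
```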
In effect, 53 bits are available to store the mantissa in a JavaScript 64-bit floating point value. The decimal value 0.1 in binary, rounded to a precision of 53 bits, is 0.00011001100110011001100110011001100110011001100110011010, which when converted back to decimal is exactly 0.1000000000000000055511151231257827021181583404541015625. We can show this using toFixed in Firefox (other browsers limit the argument to 20): (0.1).toFixed(55) returns 0.1000000000000000055511151231257827021181583404541015625.
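As a sketch, assuming an engine that accepts a toFixed argument larger than 20 (current engines generally allow up to 100), the exact stored value can be printed directly:

```javascript
// The double nearest to 0.1 terminates after 55 decimal digits,
// so toFixed(55) shows it exactly (assumes the engine allows
// arguments above 20, as modern engines do)
console.log((0.1).toFixed(55));
// 0.1000000000000000055511151231257827021181583404541015625
```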
In the same way, the decimal value 0.2 in binary, rounded to a precision of 53 bits and then converted back to decimal, is exactly 0.200000000000000011102230246251565404236316680908203125. If we add the two binary representations of 0.1 and 0.2, round to 53 bits, and then convert back to decimal, we get exactly 0.3000000000000000444089209850062616169452667236328125.
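The same technique shows the exact stored values behind this addition (again assuming an engine that allows large toFixed arguments):

```javascript
// Exact value stored for 0.2 (terminates after 54 decimal digits)
console.log((0.2).toFixed(54));
// 0.200000000000000011102230246251565404236316680908203125

// Exact value of the computed sum (terminates after 52 decimal digits)
console.log((0.1 + 0.2).toFixed(52));
// 0.3000000000000000444089209850062616169452667236328125
```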
So the result of 0.1 + 0.2 in JavaScript is not 0.3 but, to 17 decimal places, is 0.30000000000000004. In fact, 0.3 can't be exactly represented in binary either. It is actually stored as the binary equivalent of the decimal value 0.299999999999999988897769753748434595763683319091796875, which is why in JavaScript 0.2 + 0.1 == 0.3 returns false.
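Because exact equality fails like this, computed floats are usually compared against a small tolerance instead. A minimal sketch, using Number.EPSILON as the tolerance (nearlyEqual is a hypothetical helper, not a built-in):

```javascript
// Compare two floats within a tolerance instead of with ===
function nearlyEqual(a, b, tolerance = Number.EPSILON) {
  return Math.abs(a - b) <= tolerance;
}

console.log(0.2 + 0.1 === 0.3);           // false
console.log(nearlyEqual(0.2 + 0.1, 0.3)); // true
```

A fixed tolerance like this only suits values near 1; for larger magnitudes the tolerance is usually scaled by the operands.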
Decimal to binary
A decimal number can only be represented exactly in binary if 2 is the only prime factor of the denominator of the number when it is expressed as a fraction in lowest terms. For example, 0.1 is 1/10, and 10 has the prime factors 2 and 5, so it has no exact representation. 0.5 is 1/2, and the only prime factor of 2 is 2, so it can be represented exactly.
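This test can be sketched as a small helper (isExactInBinary and gcd are hypothetical names, and the check ignores the separate 53-bit precision limit):

```javascript
// Greatest common divisor, used to put the fraction in lowest terms
function gcd(a, b) {
  return b === 0 ? a : gcd(b, a % b);
}

// A fraction has an exact (finite) binary representation only if,
// in lowest terms, its denominator is a power of 2
function isExactInBinary(numerator, denominator) {
  let d = denominator / gcd(numerator, denominator);
  while (d % 2 === 0) d /= 2; // strip every factor of 2
  return d === 1;             // anything left means another prime factor
}

console.log(isExactInBinary(1, 10)); // false: 10 = 2 * 5
console.log(isExactInBinary(1, 2));  // true:  2 is a power of 2
```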