Given a decimal number 0.2. EX:

var theNumber = 0.2;
I ASSUME it would be stored in memory (in the IEEE 754 double-precision 64-bit floating-point format) as:
0-01111111100-1001100110011001100110011001100110011001100110011001
That binary number has been rounded to fit into 64 bits.
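One way to check that assumption is to read back the raw bits the engine actually stores. A minimal sketch using a DataView (the variable names and the printed format are mine):

var buffer = new ArrayBuffer(8);
var view = new DataView(buffer);
view.setFloat64(0, 0.2); // big-endian by default, so the sign byte comes first

var bits = "";
for (var i = 0; i < 8; i++) {
  bits += view.getUint8(i).toString(2).padStart(8, "0");
}
// Print as sign-exponent-fraction (1 + 11 + 52 bits)
console.log(bits.slice(0, 1) + "-" + bits.slice(1, 12) + "-" + bits.slice(12));

Running this prints the exact 64-bit pattern the engine holds for 0.2, which can be compared against the pattern assumed above.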
If we take that value and convert it back to decimal, we get
0.19999999999999998
(exactly 0.1999999999999999833466546306226518936455249786376953125),
which is not exactly 0.2.
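The stored value can also be inspected in decimal by asking for more digits than the default conversion shows. A small sketch (toPrecision is standard; the digit count 21 is an arbitrary choice):

var theNumber = 0.2;
console.log(theNumber);                 // the short default form
console.log(theNumber.toPrecision(21)); // the same stored value, with many more digits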
My question is: when we ask for the decimal value of theNumber (EX: alert(theNumber)), how does the JavaScript runtime know that theNumber was originally 0.2?
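For reference, the behaviour in question can be reproduced without a browser dialog, since alert applies the same default number-to-string conversion as String():

var theNumber = 0.2;
console.log(String(theNumber));    // "0.2"
console.log(theNumber.toString()); // "0.2"
console.log(theNumber === 0.2);    // true, it still compares equal to the literal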