I am aware of the rounding issues that programming languages have when trying to represent a base 10 number in base 2. But there have been cases that baffled me when I was experimenting with the Node.js console. I know for a fact that base 2 cannot exactly represent `0.1` from base 10. But when I type `0.1` in the console, it returns `0.1`. At first I thought it was just echoing a representation of the number back to the console, but when I experimented further, `0.1 + 0.1` returns `0.2`. So here are my questions:
- How does the console know to round it to exactly `0.2`?
- Why is it that I only experience rounding errors when typing something like `0.2 + 0.1`, which prints out `0.30000000000000004`?
- What does Node store when we say something like `x = 0.1`?
UPDATE: I think I wasn't clear with my question, so I want to give a better example. `0.1` in base 10 is `0.000110011001100110011...` in base 2. This will get cut off at some point, so let's just say that only `0.000110011001100110011` was stored. This is actually `0.099999904632568359375` in base 10. Why doesn't the console print out `0.099999904632568359375` when we type in `0.1`? How is it able to get back that number past the rounding errors? I hope the source of my confusion is clearer now.