
I am aware of the rounding issues programming languages run into when representing base-10 numbers in base 2. But some cases baffled me when I was experimenting with the Node.js console. I know for a fact that 0.1 in base 10 cannot be represented exactly in base 2. But when I type 0.1 in the console, it prints back 0.1. At first I thought it just returns a representation of the number back to the console. But when experimenting further, 0.1 + 0.1 returns 0.2. So here are my questions:

  1. How does the console know to round it exactly to 0.2?
  2. Why do I only experience rounding errors with something like 0.2 + 0.1, which prints out 0.30000000000000004?
  3. What does Node store when we write something like x = 0.1?

UPDATE: I don't think I was clear with my question, so here is a better example. 0.1 in base 10 is 0.000110011001100110011... in base 2. This expansion gets cut off at some point; let's say only 0.000110011001100110011 is stored. That is actually 0.099999904632568359375 in base 10. Why doesn't the console print out 0.099999904632568359375 when we type in 0.1? How is it able to get back that number past the rounding errors? I hope the source of my confusion is clearer now.
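The behavior described above can be reproduced directly in a Node.js REPL; asking `toPrecision` for more significant digits exposes the stored value that the default formatting hides (a quick sketch):

```javascript
// Default Number-to-string formatting hides the underlying rounding error:
console.log(String(0.1));       // "0.1"
console.log(String(0.1 + 0.1)); // "0.2"
console.log(String(0.2 + 0.1)); // "0.30000000000000004"

// Requesting more significant digits reveals the value actually stored:
console.log((0.1).toPrecision(21)); // "0.100000000000000005551"
```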

alaboudi
  • It's not just with Node.js; browsers also give the same output :) – Ashvin777 Apr 14 '18 at 17:08
  • @Ashvin777 Yes, I know, but I want to know how numbers that theoretically cannot be represented in base 2 are, in practice, displayed accurately in base 10 – alaboudi Apr 14 '18 at 17:10
  • Maybe you can take a look at this - https://stackoverflow.com/questions/3439040/why-does-adding-two-decimals-in-javascript-produce-a-wrong-result – Ashvin777 Apr 14 '18 at 17:13
  • @Ashvin777 I realized I wasn't clear with my question, so I have updated it with an example. Hope this is clearer – alaboudi Apr 14 '18 at 17:28

1 Answer


node.js is not showing you the exact values. JavaScript's default conversion of a Number to a string uses just enough decimal digits to distinguish the Number's value from neighboring representable values. I do not know which method node.js uses, but simply using JavaScript's default would explain it.
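One way to see the "just enough digits" rule in action is to step to the adjacent representable double and compare how the two values print. `nextUp` below is a hypothetical helper (not a built-in) that uses the IEEE 754 bit pattern:

```javascript
// nextUp: the smallest double strictly greater than x.
// Valid for positive, finite x: incrementing the 64-bit pattern
// by one yields the next representable double.
function nextUp(x) {
  const view = new DataView(new ArrayBuffer(8));
  view.setFloat64(0, x);
  view.setBigUint64(0, view.getBigUint64(0) + 1n);
  return view.getFloat64(0);
}

// "0.1" is the shortest decimal that singles out this double...
console.log(String(0.1));         // "0.1"
// ...while its upper neighbor needs 17 digits to be distinguished:
console.log(String(nextUp(0.1))); // "0.10000000000000002"
```

Since no shorter decimal string rounds to the neighboring double, JavaScript has to print all 17 digits for it, whereas "0.1" already uniquely identifies the double nearest to one tenth.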

Eric Postpischil
  • Thank you for responding. Your answer is in line with my question! I just had a question to further clarify what you've mentioned in your linked post. You said that JS increases precision until the value is distinguishable from its neighbor. Why did JS round up to 0.1 from 0.099999046..? How does that last rounding step work? – alaboudi Apr 14 '18 at 22:56
  • Does it round from the cut-off point? So after rounding up to 0.1, it still happens to be closer to the actual binary value than its neighbors – alaboudi Apr 14 '18 at 23:00
  • @alaboudi: The software figures out what the shortest decimal sequence that satisfies the requirements is. The actual method it uses to do this is complicated. What it amounts to is that when printing the Number that `.1` becomes (0.1000000000000000055511151231257827021181583404541015625, or, in hexadecimal floating-point format, 0x1.999999999999ap-4), that number is printed as “.1”. – Eric Postpischil Apr 15 '18 at 00:23
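The exact value quoted in the comment above can be verified with BigInt arithmetic on the raw bit pattern. `exactFraction` is an illustrative helper (the name is an assumption), valid for normal doubles strictly between 0 and 1:

```javascript
// Print the exact base-10 value of a double x in (0, 1) by decoding
// its IEEE 754 bits: x = mantissa * 2^exp with exp < 0, and
// m * 2^exp = (m * 5^-exp) / 10^-exp, which has a finite decimal form.
function exactFraction(x) {
  const view = new DataView(new ArrayBuffer(8));
  view.setFloat64(0, x);
  const bits = view.getBigUint64(0);
  const expBits = Number((bits >> 52n) & 0x7ffn);
  const mantissa = (1n << 52n) | (bits & 0xfffffffffffffn); // implicit leading 1
  const exp = expBits - 1075; // unbiased exponent minus 52 mantissa bits
  // Numerator over 10^-exp, left-padded so the decimal point lands after "0."
  const digits = (mantissa * 5n ** BigInt(-exp)).toString().padStart(-exp, "0");
  return ("0." + digits).replace(/0+$/, ""); // strip trailing zeros
}

console.log(exactFraction(0.1));
// "0.1000000000000000055511151231257827021181583404541015625"
```

Feeding that 55-digit string back through `Number` recovers the same double, which is exactly why the much shorter "0.1" is sufficient as output.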