
Possible Duplicate:
Is JavaScript's Math broken?

I came across this rounding issue:

When I do this:

 .2  + .1 results in 0.30000000000000004
 .7  + .1 results in 0.7999999999999999
1.1  + .1 results in 1.2000000000000002

and so on...

Can anyone explain (in detail) why? It's probably some binary rounding issue, but I'd really like to know what actually happens...

VDP

1 Answer


In a nutshell, because .2 isn't actually .2; what gets stored is the closest representable double-precision number, which is

0.200000000000000011102230246251565404236316680908203125.

Similarly, .1 is really

0.1000000000000000055511151231257827021181583404541015625
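
You can inspect these stored values directly from JavaScript. This is a minimal sketch; it assumes an engine where toFixed accepts more than 20 fraction digits (the current spec allows up to 100, older ES5 engines capped it at 20):

    // Print the exact doubles that the literals .1 and .2 are stored as.
    // toFixed rounds the exact binary value to the requested number of
    // decimal places, so enough digits reveal the full stored value.
    console.log((0.1).toFixed(55));
    // 0.1000000000000000055511151231257827021181583404541015625
    console.log((0.2).toFixed(55));
    // 0.2000000000000000111022302462515654042363166809082031250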

When you add those together, the result is rounded again to the nearest representable number, which is

0.3000000000000000444089209850062616169452667236328125

Finally, when you print it out, that number is rounded to the shortest decimal string that still identifies that exact double (17 significant digits in this case), giving the result you observe.

Your other examples follow the same pattern.
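
To see the whole chain in JavaScript (again a minimal sketch, assuming toFixed accepts up to 100 fraction digits as in current engines):

    const sum = 0.1 + 0.2;

    // The exact double produced by the addition:
    console.log(sum.toFixed(55));
    // 0.3000000000000000444089209850062616169452667236328125000

    // The default conversion prints only as many digits as are needed
    // to pick out this particular double from its neighbours:
    console.log(String(sum));   // 0.30000000000000004

    // It can't be shortened to "0.3", because the literal 0.3 rounds
    // to a different nearest double:
    console.log((0.3).toFixed(55));
    // 0.2999999999999999888977697537484345957636833190917968750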

Stephen Canon