I just read on MDN that one of the quirks of JS's handling of numbers, due to everything being "double-precision 64-bit format IEEE 754 values", is that when you do something like .2 + .1 you get 0.30000000000000004 (that's what the article says, but I get 0.29999999999999993 in Firefox). Therefore:
(.2 + .1) * 10 == 3 evaluates to false.
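For reference, this is what I see when I try it in a console (the exact digits are what my machine prints; they may differ from what the article shows):

    // trying the MDN example in a browser console
    console.log(.2 + .1);              // 0.30000000000000004 for me
    console.log((.2 + .1) * 10);       // 3.0000000000000004
    console.log((.2 + .1) * 10 == 3);  // false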
This seems like it would be very problematic. So what can be done to avoid bugs due to the imprecise decimal calculations in JS?
I've noticed that if you do 1.2 + 1.1 you get the right answer. So should you just avoid any kind of math that involves values less than 1? That seems very impractical. Are there any other dangers to doing math in JS?
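Here's what I mean, from the same console session:

    // the same digits shifted above 1 happen to round back to the expected value
    console.log(1.2 + 1.1);            // 2.3
    console.log(1.2 + 1.1 === 2.3);    // true, at least on my machine
    console.log(.2 + .1 === .3);       // false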
Edit:
I understand that many decimal fractions can't be represented exactly in binary, but the way most other languages I've encountered deal with the error (like the way JS handles the numbers greater than 1 above) seems more intuitive, so I'm not used to this, which is why I want to see how other programmers deal with these calculations.
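For example, the two workarounds I've seen suggested most often are comparing with a relative tolerance instead of ==, and doing money-style arithmetic in scaled integers; here's a minimal sketch of both (nearlyEqual and addCents are just names I made up, though Number.EPSILON is standard ES2015):

    // sketch 1: compare with a relative tolerance instead of ==
    function nearlyEqual(a, b) {
      // true when a and b differ by at most about one representable step at their scale
      return Math.abs(a - b) <= Number.EPSILON * Math.max(Math.abs(a), Math.abs(b));
    }
    console.log(nearlyEqual((.2 + .1) * 10, 3)); // true

    // sketch 2: for money, do the arithmetic in whole cents, which is exact
    function addCents(aDollars, bDollars) {
      return (Math.round(aDollars * 100) + Math.round(bDollars * 100)) / 100;
    }
    console.log(addCents(.2, .1)); // 0.3

Is this the kind of thing people actually do in practice, or is there a more standard approach?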