I'm trying to come up with a way to avoid decimal errors when doing calculations in my application. I've seen some solutions but still have questions. My understanding is that many decimal numbers can't be stored exactly as floating point in JavaScript, so results pick up non-zero digits in their trailing decimal places where there should be zeros. Naughty digits, or digits that aren't noughty :) anyway...

e.g. 0.1 + 0.2 = 0.30000000000000004 instead of 0.3
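
To be concrete, this is what I see in the console:

```js
0.1 + 0.2              // 0.30000000000000004
0.1 + 0.2 === 0.3      // false
(0.1 + 0.2).toFixed(2) // "0.30"
```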
First question: would a solution be to not allow numbers to have more than x decimal digits, since JavaScript can't represent them exactly anyway? In other words, limit the precision.
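
By "limit the precision" I mean something like this rough helper (`roundTo` is just a name I made up, not from any library):

```js
// Round a result to a fixed number of decimal places so the
// representation noise past that point gets thrown away.
function roundTo(value, decimals = 10) {
  const factor = 10 ** decimals;
  return Math.round(value * factor) / factor;
}

roundTo(0.1 + 0.2);    // 0.3
roundTo(0.1 + 0.2, 2); // 0.3
```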
Next question: I have to evaluate expression strings like this: '2 + 2 + 3 * 0.1 + 0.2 * 10^5'
How can I make sure it gets an accurate result? I was just going to use Mathjs, but by default it won't avoid the floating point errors, so the calculations would still be wrong, wouldn't they? There are solutions that round away the trailing digits, but how would I apply that here? Would I have to write my own expression evaluator that rounds the result of every pair of numbers it combines?
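
From what I can tell, Mathjs can be configured to parse numbers as BigNumbers instead of built-in floats, something roughly like this (I haven't verified this covers my case):

```js
import { create, all } from 'mathjs';

// Ask mathjs to parse numeric literals as BigNumbers rather than
// 64-bit floats, so decimal values like 0.1 are stored exactly.
const math = create(all, { number: 'BigNumber', precision: 64 });

math.evaluate('0.1 + 0.2').toString();                    // "0.3"
math.evaluate('2 + 2 + 3 * 0.1 + 0.2 * 10^5').toString(); // "20004.3"
```

Is that the right approach, or do I still need rounding on top of it?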
How do I decide how many decimal digits are acceptable? I'd guess it depends on the type of application; would 10 decimal digits be enough for most calculations, e.g. finance? How many digits do calculators use internally when doing their calculations? Is it just the amount you can see on the screen?
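
For example, even accumulated error from a simple loop disappears once rounded to 10 places, which makes me think 10 might be plenty:

```js
// Summing 0.1 ten times accumulates error in the far decimal places.
let sum = 0;
for (let i = 0; i < 10; i++) sum += 0.1;

sum;             // 0.9999999999999999
sum.toFixed(10); // "1.0000000000"
```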
Last question: why doesn't JavaScript limit the number of decimal digits itself, if it's going to produce digits that aren't supposed to be there? It seems like 1 + 1 = 2... sorry, no it doesn't. It's confusing.