I find myself needing to do some work with pretty small decimals, formatted in a particular way, and JavaScript is doing weird things to them that I don't understand.
Each number enters the function formatted like this: 0.0000000000. That's ten decimal places. Most of the time, that will mean trailing zeros, like this: 0.0011000000.
Now, I wouldn't think this would matter, because before doing any other work with them I turn them into integers, through the simple expedient of multiplying them by exactly 10000000000.
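For reference, the conversion step is basically this (toScaledInt is just a placeholder name for this sketch, and I'm treating the value as already being a number by the time it reaches me):

function toScaledInt(smallDecimal) {
    // smallDecimal is a ten-decimal-place value like 0.0011000000
    return smallDecimal * 10000000000; // should now be a whole number... I thought
}

var good = toScaledInt(0.0011000000); // 11000000, as expected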
But when I do that, I sometimes get results that look like this:
var bigNum = 0.0000050000 * 10000000000; //Returns a value of 50000.00000000001
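And just to rule out the formatting itself, the same thing happens in an illustrative check where the value starts out as one of those formatted strings:

var fromString = parseFloat("0.0000050000") * 10000000000; // also 50000.00000000001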
What the heck is going on here? Where is that extra 1 at the end coming from?