I need to increment a number at its last decimal place, regardless of how many decimal places it has. The only custom function involved is decimalPlaces(), which looks like this:
function decimalPlaces(num) {
  // Capture the fractional digits and, if present, an exponent (e.g. "1.5e-7").
  var match = ('' + num).match(/(?:\.(\d+))?(?:[eE]([+-]?\d+))?$/);
  if (!match) { return 0; }
  // Number of fractional digits, adjusted by the exponent when there is one.
  return Math.max(0, (match[1] ? match[1].length : 0) - (match[2] ? +match[2] : 0));
}
So, for instance, in the Node REPL:
a = 2.11
> 2.11
places = decimalPlaces(a)
> 2
additive = Math.pow(10,places*-1);
> 0.01
a + additive
> 2.1199999999999997
It works for about 90% of cases, but not this one. I know there is a way to use exponents to sidestep the problem (multiply the number by 10^places, add 1, then divide back down, as sketched below), but I'm just curious: where is this weird floating-point problem coming from in what looks like some really simple math?
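For reference, here is a minimal sketch of that scaled-integer workaround. The helper name incrementLastPlace is mine, and it assumes the decimalPlaces() function above is in scope:

function incrementLastPlace(num) {
  var places = decimalPlaces(num);
  var scale = Math.pow(10, places);
  // Work on a (near-)integer value; Math.round guards against cases where
  // the scaled value is not exactly an integer (e.g. 1.005 * 100 === 100.49999999999999).
  return (Math.round(num * scale) + 1) / scale;
}

incrementLastPlace(2.11)
> 2.12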