I was bored, so I started fiddling around in the console and stumbled onto this:
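It went something like this (the values here are illustrative rather than my exact session; any decimal fraction that isn't exactly representable in binary floating point shows the same effect):

0.3 - 0.2        // 0.09999999999999998
0.3 - 0.2 + 0.2  // 0.3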
That raises a few questions for me:
- How inaccurate is JavaScript's arithmetic? Has this been quantified, i.e. is there a known error bound that can be taken into account? (See the sketch after this list.)
- Is there a way to fix this, i.e. to do math in JavaScript with complete accuracy (within the limitations of its data type)?
- Should the changed number after the second operation be interpreted as 'changing back to the original number' or 'changing again, because of the inaccuracy'?
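From what I've gathered so far, the closest thing to a usable error bound is Number.EPSILON (2^-52, the gap between 1 and the next representable double). A tolerance-based comparison built on it, just a sketch of mine rather than any standard API, would look something like:

// Sketch of a tolerance-based equality check, not a built-in.
// Number.EPSILON is 2^-52, the spacing between 1 and the next double.
function nearlyEqual(a, b, ulps) {
    var tolerance = (ulps || 4) * Number.EPSILON;
    var scale = Math.max(Math.abs(a), Math.abs(b), 1);
    return Math.abs(a - b) <= tolerance * scale;
}

nearlyEqual(0.1 + 0.2, 0.3);       // true, even though 0.1 + 0.2 === 0.3 is false
nearlyEqual(0.3 - 0.2 + 0.2, 0.3); // true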
I'm not sure whether this should be a separate question, but I was actually trying to round numbers to a certain number of digits after the decimal point. I've researched it a bit and found two methods:
> Method A
function roundNumber(number, digits) {
    // Scale up, round to the nearest integer, scale back down.
    // Math.round is used here; the commonly posted Math.floor variant
    // truncates instead of rounding to the nearest value.
    var multiple = Math.pow(10, digits);
    return Math.round(number * multiple) / multiple;
}
> Method B
function roundNumber(number, digits) {
    // toFixed rounds and returns a string; Number() converts it back.
    return Number(number.toFixed(digits));
}
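For what it's worth, both behave the same on the inputs I tried (assuming the Math.round version of Method A above):

roundNumber(2.34567, 2); // 2.35 with either method
// Both inherit the underlying float error: 1.005 is actually stored
// as 1.00499999999999989..., so neither method rounds it up to 1.01.
roundNumber(1.005, 2);   // 1 with either method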
Intuitively I like Method B more (it looks more efficient), but I don't know what's going on behind the scenes, so I can't really judge. Does anyone have an idea on that, or a way to benchmark it? And why is there no native round_to_this_many_decimals function? (One that returns a number, not a string.)
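As for benchmarking, a crude sketch along these lines is what I had in mind (the iteration count and inputs are arbitrary, and JIT warm-up can skew micro-benchmarks like this, so the timings are only rough indications):

// Crude micro-benchmark: time a million calls of each method.
function benchmark(label, fn) {
    var start = Date.now();
    var sink = 0; // accumulate results so calls aren't optimised away
    for (var i = 0; i < 1e6; i++) {
        sink += fn(i * 0.12345, 2);
    }
    console.log(label + ": " + (Date.now() - start) + " ms", sink);
}

benchmark("Method A", function (n, d) {
    var multiple = Math.pow(10, d);
    return Math.round(n * multiple) / multiple;
});
benchmark("Method B", function (n, d) {
    return Number(n.toFixed(d));
});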