timedLoop(10000, function() {
  health = health - 0.1;
  updatecounters();
  console.log(health);
});
health is initially set to 83.3, and console.log prints the following values:
83.2
83.10000000000001
83.00000000000001
82.90000000000002
82.80000000000003
82.70000000000003
82.60000000000004
82.50000000000004
82.40000000000005
It's supposed to decrease by 0.1 every 10 seconds, but there seems to be a floating-point error. I know I can fix this by simply rounding the variable to 2 decimal places, but why does it happen in the first place?
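For reference, this is a sketch of the rounding workaround I mean (the helper name is made up; it just snaps the value back to one decimal place after each subtraction):

```javascript
// Illustrative fix: subtract, then round to the nearest tenth.
function decrementHealth(health) {
  return Math.round((health - 0.1) * 10) / 10;
}

let health = 83.3;
for (let i = 0; i < 3; i++) {
  health = decrementHealth(health);
  console.log(health); // 83.2, 83.1, 83
}
```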
I think this is just sloppy number handling. If a number has a terminating decimal expansion (fewer than 32 digits) or is rational, the program should be able to store the exact value instead of producing artifacts like these. I fully understand why multiplication and division can introduce floating-point error, but addition and subtraction shouldn't cause bugs like this.
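For what it's worth, inspecting the stored values in a JS console suggests neither literal is held exactly: a 64-bit double keeps the nearest binary fraction, and 1/10 has no finite base-2 expansion, so even a "terminating" decimal like 0.1 gets rounded on the way in. A quick check:

```javascript
// toPrecision exposes more of the value the double actually stores.
console.log((0.1).toPrecision(25));  // slightly ABOVE 0.1
console.log((83.3).toPrecision(20)); // slightly BELOW 83.3
// Binary fractions terminate only when the denominator is a power of
// two, so 1/10 repeats forever in base 2 and must be rounded.
```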