I just noticed the strangest problem with simple addition in JavaScript.
Here is the simple calculation I am doing:
-1.000 + (1.001) should give 0.001, but instead I am getting 0.0009999999999998899. I can't understand why this is happening, and there doesn't seem to be any way to get the right answer. I've checked with multiple calculators and they all give me 0.001, but JavaScript gives me this crazy number.
What is going on? The number is very close, but it is still wrong. I've tried other decimal values and I get similarly strange results. How can JavaScript not do math properly?
Here is the simple alert box I used:
alert(-1.000 + (1.001));
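To show it isn't just this one expression, here are a couple of other simple sums along the same lines (0.1 + 0.2 is just a common case I tried in the console, not part of my original code):

alert(0.1 + 0.2);         // expected 0.3, shows 0.30000000000000004
alert(-1.000 + (1.001));  // expected 0.001, shows 0.0009999999999998899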