While working on measurement.js, the joys of TDD helped me stumble upon a behaviour of JavaScript engines that is really strange (at least as it occurs to me).
Whether entered into the console or run inside a script, this is what happens:
-1 + .15 --> -0.85 ✓
-1 + 1 --> 0 ✓
-1 + 1 + .15 --> 0.15 ✓
-1 + 1.15 --> 0.1499999999999999 ?!?
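Printing more digits shows the values the engines are actually working with; all engines print exactly the same digits, so this does not look like a vendor-specific bug (toPrecision output is spec-defined, so the commented results should be reproducible anywhere):

// Neither 1.15 nor 0.15 is exactly representable in binary64, so the
// engine silently substitutes the nearest representable double:
(1.15).toPrecision(20);      // "1.1499999999999999112"
(0.15).toPrecision(20);      // "0.14999999999999999445"
(-1 + 1.15).toPrecision(20); // "0.14999999999999991118"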
This was tested and reproduced exactly on the following browsers / OSes:
- FF 24.0 (Debian 3.10)
- Chrome 30.0.1599.114 (Debian 3.10)
- Chrome 30.0.1599.101m (Win7SP1)
- Internet Explorer 10.0.9200.16721 (Win7SP1)
As this is consistent across different vendors, I assume there must be a specific reason for it, so:
- What is the reason for this?
- What is the best practice to circumvent this behaviour, since it poses a problem for exact calculations in JS? (The only idea I have myself is the integer-scaling sketch below.)
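To make the second question concrete: the one workaround I can come up with is to scale everything to integers (where binary64 arithmetic is exact up to 2^53), add, and scale back. The helper name addExact and the fixed number of decimals are purely for illustration:

// Hypothetical helper: do the addition on scaled integers, where
// binary floating point is exact, then scale back down.
function addExact(a, b, decimals) {
    var factor = Math.pow(10, decimals);
    // Math.round removes the representation error introduced by a * factor
    return (Math.round(a * factor) + Math.round(b * factor)) / factor;
}

addExact(-1, 1.15, 2); // 0.15

Is something like this the way to go, or is there an established pattern or library for it?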
Update:
The best layperson-comprehensible explanation, including answers and workarounds for multiple programming languages, that I have found so far is at
http://floating-point-gui.de/ (thanks @RocketHazmat)
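For instance, the guide's advice on comparisons boils down to checking against an error margin instead of ===. A minimal sketch of that idea (much simpler than the guide's own nearlyEqual; the tolerance 1e-9 is an arbitrary choice for illustration, pick one that suits your data):

// Compare with a tolerance instead of exact equality
function nearlyEqual(a, b, epsilon) {
    return Math.abs(a - b) < epsilon;
}

(-1 + 1.15) === 0.15;               // false
nearlyEqual(-1 + 1.15, 0.15, 1e-9); // true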