
Hi, I'm trying to subtract two decimal numbers and it keeps returning a weird number.

var x = 0.00085022
var y = 0.00085050
var answer = x - y
alert(answer)

This is the number it's returning: -2.8000000000007186e-7

iMacroPro
  • Possible duplicate of [Format number to always show 2 decimal places](http://stackoverflow.com/questions/6134039/format-number-to-always-show-2-decimal-places) – Tommy Lee Aug 03 '16 at 01:49
  • Please have a look at here [How to deal with floating point number precision in JavaScript?](http://stackoverflow.com/questions/1458633/how-to-deal-with-floating-point-number-precision-in-javascript) – choz Aug 03 '16 at 01:54
  • http://www.2ality.com/2012/03/displaying-numbers.html – Sam Axe Aug 03 '16 at 01:55

2 Answers


JavaScript stores all numbers as 64-bit floating point, which holds at most about 17 significant digits, so decimal fractions like yours cannot always be represented exactly and arithmetic on them is not always 100% accurate: http://www.w3schools.com/js/js_numbers.asp

Try this:

var x = 0.00085022 * 100000000;
var y = 0.00085050 * 100000000;
var answer = (x - y) / 100000000;
alert(answer);
niklassc
    Keep in mind that instead of dividing by `100000000` you can multiply by `0.000000001`, which is a less expensive calculation. But as always, for the sake of code clarity, this improvement should not be a concern until performance becomes a problem. – Tiago Marinho Aug 03 '16 at 02:04
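If the goal is just a clean result rather than exact integer arithmetic, rounding the raw difference also works; a minimal sketch using the built-in `toFixed` (the 8-decimal precision here is an assumption based on the inputs in the question):

```javascript
var x = 0.00085022;
var y = 0.00085050;

// Round the raw difference to 8 decimal places, matching the
// precision of the inputs, then convert the string back to a number.
var answer = Number((x - y).toFixed(8));

console.log(answer); // -2.8e-7, i.e. -0.00000028
```

Note that `toFixed` returns a string, which is often what you want for display anyway; the `Number(...)` wrapper is only needed if you plan to keep calculating with the result.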

The result you are getting is not actually wrong: -2.8000000000007186e-7 is exponential notation for roughly -0.00000028. The long tail of extra digits is floating point rounding error, because neither input can be stored exactly in binary.

If we scale both values up to whole numbers first, the subtraction is exact:

var x = 85022
var y = 85050
var answer = x - y
alert(answer); // = -28
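The scaled result can then be divided back down to recover the decimal answer exactly; a small sketch (the 1e8 scale factor is chosen to match the 8 decimal places of the inputs, and `Math.round` guards against the multiplication itself picking up floating point error):

```javascript
var scale = 100000000; // 1e8: enough to turn both inputs into integers

// Math.round protects against e.g. 0.00085022 * 1e8 coming out
// as 85021.99999999999 instead of 85022.
var x = Math.round(0.00085022 * scale); // 85022
var y = Math.round(0.00085050 * scale); // 85050

var answer = (x - y) / scale;
console.log(answer); // -2.8e-7
```

Integer subtraction is always exact in JavaScript (up to Number.MAX_SAFE_INTEGER), and dividing by a power of ten at the very end introduces at most one rounding step instead of one per operand.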
Leroy Thompson