
Here is what my friend noticed while working on a project. He doesn't have a Stack Overflow account, so I am asking on his behalf.

var a = 1.75/3;

which gives

a = 0.5833333333333334

Now when I add 1 to variable a I get this:

1.5833333333333335

Notice the difference in the last digit.

Similarly, when I do the following:

    0.5833333333333336+1

I get

    1.5833333333333335

Now, replacing the 6 with a 7:

    0.5833333333333337+1

gives me

1.5833333333333337

I can't understand what is going on here. Can anyone please explain this?
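For reference, all the observations above can be reproduced in one snippet (the `Number.EPSILON` line is my addition, to show the size of the rounding step at 1, and is not part of the original examples):

```javascript
// JavaScript numbers are IEEE 754 doubles with 52 fraction bits, so the
// gap between adjacent representable values ("ulp") doubles whenever the
// exponent grows. The grid near 0.58 is twice as fine as the grid near
// 1.58, so adding 1 forces a re-round and the last digit can shift.

const a = 1.75 / 3;       // nearest double to 0.58333...
console.log(a);           // 0.5833333333333334
console.log(a + 1);       // 1.5833333333333335 (re-rounded on a coarser grid)

console.log(Number.EPSILON); // 2.220446049250313e-16, the ulp just above 1

// Because the grid above 1 is coarser, two distinct inputs can land on
// the same sum after rounding:
console.log(0.5833333333333336 + 1); // 1.5833333333333335
console.log(0.5833333333333337 + 1); // 1.5833333333333337
```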

me_digvijay
  • This should help http://floating-point-gui.de/ – elclanrs May 10 '13 at 08:40
  • It's an issue in Javascript with floating point numbers. http://stackoverflow.com/questions/1458633/elegant-workaround-for-javascript-floating-point-number-problem – Scott May 10 '13 at 08:40
  • It's a common question here for JS. [Try searching for it](http://stackoverflow.com/search?q=%5BJavaScript%5D+decimal+arithmetic). – Joseph May 10 '13 at 08:41
  • Check this http://stackoverflow.com/questions/2480699/understanding-floating-point-variables – 999k May 10 '13 at 08:41
  • Check out [Is JavaScript's floating point math broken?](http://stackoverflow.com/questions/588004/is-javascripts-floating-point-math-broken). – devnull May 10 '13 at 08:57
