Here is something my friend noticed while working on a project. He doesn't have a Stack Overflow account, so I'm asking on his behalf.
var a = 1.75/3;
which gives
a = 0.5833333333333334
Now when I add 1 to a, I get this:
1.5833333333333335
Notice the difference in the last digit: the 4 became a 5.
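If it helps, here is a minimal reproduction you can paste into Node or any browser console. The toPrecision(20) calls are only there to print more digits than the default formatting (which shows the shortest decimal that maps back to the same number); on my machine they print the values in the comments:

var a = 1.75 / 3;
console.log(a);                       // 0.5833333333333334
console.log(a + 1);                   // 1.5833333333333335
// Extra digits expose the values actually stored as 64-bit doubles:
console.log(a.toPrecision(20));       // 0.58333333333333337034
console.log((a + 1).toPrecision(20)); // 1.5833333333333334814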
Similarly, when I evaluate
0.5833333333333336 + 1
I get
1.5833333333333335
Now replacing the 6 with a 7, i.e. evaluating
0.5833333333333337 + 1
gives me
1.5833333333333337
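Here is the same comparison as console statements; the equality check at the end is just to confirm that the two literals really do parse to different numbers:

console.log(0.5833333333333336 + 1);                    // 1.5833333333333335
console.log(0.5833333333333337 + 1);                    // 1.5833333333333337
console.log(0.5833333333333336 === 0.5833333333333337); // false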
I can't understand what is going on here. Can anyone please explain this?