Possible Duplicate:
Is JavaScript's Math broken?
I came across this rounding issue:
When I do this:
0.2 + 0.1 results in 0.30000000000000004
0.7 + 0.1 results in 0.7999999999999999
1.1 + 0.1 results in 1.2000000000000002
and so on...
Can anyone explain (in detail) why this happens? I assume it's some binary rounding issue, but I'd really like to know what's actually going on.
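One way to see it for yourself: decimal fractions like 0.1 and 0.2 have no exact binary (IEEE 754 double) representation, so the nearest representable doubles are stored instead, and `toPrecision` exposes their true decimal values (a quick sketch, run in any JavaScript engine):

```javascript
// The doubles actually stored for 0.1 and 0.2 are slightly above
// the decimal values they were written as:
console.log((0.1).toPrecision(20)); // "0.10000000000000000555"
console.log((0.2).toPrecision(20)); // "0.20000000000000001110"

// Adding the two stored values, then rounding the sum to the nearest
// double, lands on a value just above 0.3:
console.log(0.1 + 0.2);             // 0.30000000000000004

// The error direction depends on how each operand was rounded,
// which is why 0.7 + 0.1 comes out slightly *below* 0.8:
console.log(0.7 + 0.1);             // 0.7999999999999999
```

So the surprising digits are not a bug in the addition itself; the inputs were already inexact before `+` ever ran.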