I declared two variables and compared them like this:
var x = 9999999999999999;
var y = 10000000000000000;
x == y //returns true
What is the reason for this behavior?
Check this out:
"Javascript doesn't have integers, only 64-bit floats - and you've ran out of floating-point precision."
Why is 9999999999999999 converted to 10000000000000000 in JavaScript?
Thanks to user Kos.