
JavaScript's Number type is known to represent all integers exactly only up to 2^53 (9007199254740992, or 0x20000000000000), as discussed here. I still don't understand why Number(0x20000000000000) + 1 produces 9007199254740992 (still 0x20000000000000), but Number(0x20000000000000) + 2 produces 9007199254740994 (0x20000000000002). Can someone please explain?
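
Here is a minimal snippet reproducing what I see (in any modern browser or Node.js console):

```js
const n = Number(0x20000000000000); // 2^53 = 9007199254740992

console.log(n + 1); // 9007199254740992 -> still 0x20000000000000
console.log(n + 2); // 9007199254740994 -> 0x20000000000002
```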


1 Answer


Quoted from this Wikipedia article:

Between 2^52=4,503,599,627,370,496 and 2^53=9,007,199,254,740,992 the representable numbers are exactly the integers. For the next range, from 2^53 to 2^54, everything is multiplied by 2, so the representable numbers are the even ones.
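
In other words, 2^53 + 1 has no exact double representation: it lies exactly halfway between the two representable neighbours 2^53 and 2^53 + 2, and IEEE 754's default round-half-to-even rule resolves the tie toward 2^53, whose significand is even. 2^53 + 2, by contrast, is itself representable, so it comes out unchanged. A quick check you can run in any modern JavaScript engine:

```js
const limit = 2 ** 53; // 9007199254740992 (0x20000000000000)

console.log(limit + 1 === limit); // true: 2^53 + 1 rounds back down to 2^53
console.log(limit + 2);           // 9007199254740994: exactly representable
console.log(limit + 3);           // 9007199254740996: the tie rounds up this time,
                                  // because 2^53 + 4 has the even significand

// The "safe" integer boundary sits one below the limit:
console.log(Number.MAX_SAFE_INTEGER === limit - 1); // true
```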
