JavaScript is known to have 2^53 (9007199254740992, or 0x20000000000000) as the largest integer value for its Number type, as discussed here. I still don't understand why Number(0x20000000000000)+1 produces 0x20000000000000, but Number(0x20000000000000)+2 produces 0x20000000000002 (9007199254740994). Can someone please explain?
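For reference, the behaviour being asked about can be reproduced in any JavaScript console (a minimal sketch; the hex literal 0x20000000000000 is just 2^53 written out):

```js
var limit = Number(0x20000000000000);   // 9007199254740992, i.e. 2^53

console.log(limit + 1);                 // 9007199254740992 -- result is unchanged
console.log(limit + 2);                 // 9007199254740994 -- result is as expected
console.log(limit + 1 === limit);       // true
```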
- Read this [Article](http://en.wikipedia.org/wiki/Double_precision_floating-point_format) – Moritz Roessler Sep 17 '13 at 08:56
- Between 2^52 = 4,503,599,627,370,496 and 2^53 = 9,007,199,254,740,992 the representable numbers are exactly the integers. For the next range, from 2^53 to 2^54, everything is multiplied by 2, so the representable numbers are the even ones. – Moritz Roessler Sep 17 '13 at 08:59
- Thanks, that makes perfect sense. I'd accept it as an answer if you post it. – noseratio Sep 17 '13 at 09:05
- You're welcome =) I posted the quote as an answer. – Moritz Roessler Sep 17 '13 at 09:18
1 Answer
Quoted from the [Wikipedia article on the double-precision floating-point format](http://en.wikipedia.org/wiki/Double_precision_floating-point_format):

> Between 2^52 = 4,503,599,627,370,496 and 2^53 = 9,007,199,254,740,992 the representable numbers are exactly the integers. For the next range, from 2^53 to 2^54, everything is multiplied by 2, so the representable numbers are the even ones.

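To make the quoted spacing concrete, here is a small sketch (assuming any standard JavaScript engine; Math.pow(2, 53) is simply 2^53 written out) that probes both ranges:

```js
// Between 2^52 and 2^53 the gap between adjacent doubles is exactly 1,
// so every integer in that range is representable.
var x = Math.pow(2, 52);                 // 4503599627370496
console.log(x + 1 === x);                // false -- adding 1 is still exact here

// From 2^53 to 2^54 the gap doubles to 2, so only even integers are representable;
// odd results are rounded to the nearest even neighbour (round-half-to-even).
var y = Math.pow(2, 53);                 // 9007199254740992
console.log(y + 1 === y);                // true  -- 2^53 + 1 rounds back down to 2^53
console.log(y + 2);                      // 9007199254740994 -- even, exactly representable
console.log(y + 3);                      // 9007199254740996 -- rounds up to the even neighbour
```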
Moritz Roessler