In JavaScript, numbers are represented internally as IEEE 754 double-precision floats, whose 53-bit significand means only integers up to 2^53 can be represented exactly. There's a `Number.MAX_SAFE_INTEGER` constant that reflects this; it's equal to `Math.pow(2, 53) - 1`.
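A quick console check (the printed values are what any spec-compliant engine should produce):

```js
// 2^53 - 1 is the largest integer n such that n and n + 1
// are both exactly representable as 64-bit doubles.
console.log(Number.MAX_SAFE_INTEGER);                         // 9007199254740991
console.log(Number.MAX_SAFE_INTEGER === Math.pow(2, 53) - 1); // true
```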
However, if I type `Number.MAX_SAFE_INTEGER + 20` in the JavaScript console, it spits out what looks like a correct integer value.
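Here's that experiment; the comments note what a spec-compliant engine should print, and that the result is actually subtly off:

```js
console.log(Number.MAX_SAFE_INTEGER + 20); // 9007199254741012

// The mathematically exact sum is 9007199254741011, which is odd.
// Above 2^53, consecutive doubles are 2 apart, so only even integers
// are exactly representable and the sum gets rounded. The same rounding
// makes adjacent expressions collapse to the same value:
console.log(Number.MAX_SAFE_INTEGER + 1 === Number.MAX_SAFE_INTEGER + 2); // true
```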
How does JavaScript represent numbers greater than `Number.MAX_SAFE_INTEGER` internally?