Found this here:
How does this even work? What is happening here? Why does the number change in the first line?
JavaScript uses the double-precision floating-point format specified in IEEE 754 and can only safely represent integers between -(2^53 - 1) and 2^53 - 1.
The number 111111111111111111 (18 digits) is above that range.
Reference: Number.MAX_SAFE_INTEGER
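Here is a minimal sketch you can paste into any browser or Node.js console to see both of these facts, using the literal from the question:

    // 2^53 - 1, the largest integer JavaScript can represent exactly
    console.log(Number.MAX_SAFE_INTEGER);                   // 9007199254740991

    // The 18-digit literal is outside that range, so it is silently rounded
    // to the nearest representable double, which is 111111111111111104
    console.log(Number.isSafeInteger(111111111111111111));  // false
    console.log(111111111111111111 === 111111111111111104); // true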
As mentioned above, JavaScript uses the double-precision 64-bit floating-point format for its numbers: 52 bits are reserved for the mantissa (the significant digits), 11 bits for the exponent and 1 bit for the plus/minus sign.
The whole deal with the numbers is beautifully explained in this video. Essentially, the exponent works like a pointer that slides the binary point along the 52 mantissa bits. Naturally, you need more of those bits to express a larger number such as your 111111111111111111.
Converted to binary, your number is stored as
sign - 0
exponent - 10000110111
mantissa - 1000101010111110111101111000010001100000011100011100
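If you want to check this breakdown yourself, here is an illustrative helper (the name bitsOf is my own) that writes the number into an 8-byte buffer with a DataView and slices out the three fields:

    // Return the sign, exponent and mantissa bits of a double
    function bitsOf(x) {
      const view = new DataView(new ArrayBuffer(8));
      view.setFloat64(0, x); // big-endian by default, so the sign bit comes first
      let bits = "";
      for (let i = 0; i < 8; i++) {
        bits += view.getUint8(i).toString(2).padStart(8, "0");
      }
      return { sign: bits[0], exponent: bits.slice(1, 12), mantissa: bits.slice(12) };
    }

    console.log(bitsOf(111111111111111111));
    // { sign: '0',
    //   exponent: '10000110111',
    //   mantissa: '1000101010111110111101111000010001100000011100011100' }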
The more mantissa bits the high-order part of the value takes up, the fewer are left for its low-order digits.
Eventually, simple calculations such as an increment by 1 become inaccurate, because there are no bits left on the far right to hold the change; the smallest possible increment depends on where that pointer puts the binary point.
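As a short sketch of that effect, again with the number from the question:

    // Above 2^53 the gap between adjacent doubles is larger than 1,
    // so adding 1 can change nothing at all
    const big = 111111111111111111;   // stored as 111111111111111104
    console.log(big + 1 === big);     // true - the +1 is lost in rounding

    // Around this particular number the gap (ULP) is 16
    console.log(big + 16 === big);    // false - a step of 16 is representable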