
In the console of any web page, I run:

console.log(11111111111111111);

The result is: 11111111111111112

Really confused, I tried adding more 1s to see what happens (a runnable version of this experiment follows the list):

  • 111111111111111111 -> 111111111111111100
  • 1111111111111111111 -> 1111111111111111200
  • 11111111111111111111 -> 11111111111111110000
  • 111111111111111111111 -> 111111111111111110000
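
For reference, here is the same experiment as one snippet, with each value also checked against Number.isSafeInteger (a standard built-in; every one of these literals is above Number.MAX_SAFE_INTEGER, 9007199254740991):

    // Each literal prints as its nearest representable double,
    // and none of them is a "safe" (exactly representable) integer.
    [
      11111111111111111,
      111111111111111111,
      1111111111111111111,
      11111111111111111111,
      111111111111111111111,
    ].forEach(n => console.log(n, Number.isSafeInteger(n)));
    // e.g. first line printed: 11111111111111112 false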

Am I going crazy? What causes this to happen?

Is this intended? Is this a bug in JavaScript?


Note: I am pretty sure it is not an integer overflow or anything like that, because adding one more 1 gives 1.1111111111111111e+21, which is correct.
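
A quick check of where the notation switches (standard behavior: the default number-to-string conversion only uses exponential form from 1e21 upward):

    // Below 1e21 the default conversion stays positional:
    console.log(111111111111111111111);  // 111111111111111110000
    // At 1e21 and above it switches to exponential notation:
    console.log(1111111111111111111111); // 1.1111111111111111e+21
    console.log((1e21).toString());      // "1e+21"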


This question differs from this one because I am not asking about the limit of numbers that can be represented accurately; I am asking, rather:

Why does 11111111111111111 -> 11111111111111112, rather than 11111111111111111 -> 1.1111111111111111e+16 (scientific notation)?
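
One way to see that the literal itself already maps to a different double (a minimal check, runnable in any modern console; the BigInt line assumes an ES2020+ engine):

    // Both literals round to the same IEEE 754 double, so they compare equal:
    console.log(11111111111111111 === 11111111111111112); // true
    // A BigInt literal is arbitrary precision and keeps the exact digits:
    console.log(11111111111111111n); // 11111111111111111n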

  • It's because the number is too big to represent accurately, hence the discrepancy. – Andrew Li Apr 11 '17 at 22:51
  • @AndrewLi I understand it loses accuracy, but why does it do that at all? Why show an incorrect number instead of the correct one in another notation? – Amit Apr 11 '17 at 22:57
