I had a bug in my code where I was comparing strings instead of numbers.

I was doing "100" < "5" and it was returning true.

Why does JavaScript think that "100" is less than "5"?
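
Here is the behavior side by side:

console.log("100" < "5"); // true  (both operands are strings, so the comparison is lexicographic)
console.log(100 < 5);     // false (both operands are numbers)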

Kevin Amiranoff

  • Because 1 comes before 5. It's the same reason "Aaron" comes before "Betelgeuse": lexicographical sorting. – Ingo Bürk Nov 24 '19 at 12:26

1 Answer

When you use < with strings, the code points at each index of the two strings are compared, left to right, until a difference is found. The code point for 1 is 49 and the code point for 5 is 53, so '100' < '5' is true because 49 < 53; the comparison is settled by the first characters, and the rest of each string is never examined.

console.log(
  '1'.charCodeAt(), // 49
  '5'.charCodeAt()  // 53
);

Similarly, 'A' < 'a' because the code point for A (65) is smaller than the code point for a (97).
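
As for fixing the original bug (a common approach, not something specific to this answer): convert the strings to numbers before comparing, for example with Number or the unary + operator.

// Coerce to numbers first so < compares numerically, not lexicographically
console.log(Number('100') < Number('5')); // false
console.log(+'100' < +'5');               // false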

CertainPerformance
  • By "code point" you mean "ascii code" I guess? – Andy Nov 24 '19 at 13:06
  • 1
    https://en.wikipedia.org/wiki/Code_point The ASCII range is relatively small, I'm pretty sure the logic applies to *any* Javascript string (which can have characters anywhere in the unicode range IIRC) – CertainPerformance Nov 24 '19 at 13:09
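
A quick check (added for illustration, not from the thread) confirms that comment: the same ordering applies to non-ASCII characters, since JavaScript compares strings by their UTF-16 code unit values, which match code points for characters like these.

console.log('Z' < 'é');                          // true: 90 < 233
console.log('Z'.charCodeAt(), 'é'.charCodeAt()); // 90 233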