I had a bug in my code where I was comparing strings instead of numbers.
I was doing "100" < "5" and it was returning true.
Why does javascript think that "100" is less than "5"?
When you use < with strings, they are compared lexicographically: the code points at each index are compared, starting from the first character, until one differs. The code point for '1' is 49 and the code point for '5' is 53, so '100' < '5' is true because 49 < 53.
console.log(
  '1'.charCodeAt(0), // 49
  '5'.charCodeAt(0)  // 53
);
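As a quick sanity check (a minimal sketch; Number() is just one of several ways to convert a string to a number), you can see the difference between comparing the strings directly and comparing their numeric values:

// String comparison is lexicographic, so this is true:
console.log('100' < '5');                 // true
// Converting to numbers first compares numeric values:
console.log(Number('100') < Number('5')); // false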
Similarly, 'A' < 'a' because the code point for A (65) is smaller than the code point for a (97).
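You can verify those code points the same way:

console.log(
  'A'.charCodeAt(0), // 65
  'a'.charCodeAt(0)  // 97
);
console.log('A' < 'a'); // true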