Why does `'Mystery!' <= 'Z'` equal `true`, but `'the' <= 'Z'` equal `false`, while both `'Mystery!' >= 'A'` and `'the' >= 'A'` equal `true`? How does such comparison work?

UkoM
- Because `'B' < 'a'`. Characters are usually (on most systems) in this order: `..., A, B, ..., Z, ..., a, b, c, ...` – ibrahim mahrir Apr 21 '17 at 19:47
- It depends on the characters' codes. In fact, what gets compared are the codes of the symbols. – curveball Apr 21 '17 at 19:47
- Compare the lowercased version of both operands. – ibrahim mahrir Apr 21 '17 at 19:50
- @kindUser Just closing it without even reading the duplicate candidate. How typical. – ibrahim mahrir Apr 21 '17 at 19:55
- @ibrahimmahrir The "on most systems" part is misleading. JavaScript always uses the UTF-16 encoding of the Unicode character set for strings. What might vary by source, user and time is the [locale](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Intl) but the binary comparison operators always use the UTF-16 lexicographic ordering. – Tom Blodget Apr 22 '17 at 20:20
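To illustrate the comments above, here is a minimal sketch (the helper `lessOrEqualIgnoreCase` is made up for this example, not a built-in):

```js
// Relational operators compare strings by UTF-16 code units,
// left to right, so every upper-case letter sorts before
// every lower-case letter.
console.log('B' < 'a');         // true
console.log('B'.charCodeAt(0)); // 66
console.log('a'.charCodeAt(0)); // 97

// Following the "compare the lowercased versions" suggestion:
function lessOrEqualIgnoreCase(a, b) {
  return a.toLowerCase() <= b.toLowerCase();
}
console.log(lessOrEqualIgnoreCase('the', 'Z')); // true
```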
2 Answers
It's comparing the UTF-16 code units of the string values. Try the same comparisons with the `charCodeAt` method to understand what's happening here:

`'y'.charCodeAt() <= 'Z'.charCodeAt()` // 121 <= 90, i.e. false
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String/charCodeAt
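Applied to the strings from the question, a sketch of what the engine effectively does (values shown in the comments):

```js
console.log('Mystery!'.charCodeAt(0)); // 77  ('M')
console.log('the'.charCodeAt(0));      // 116 ('t')
console.log('Z'.charCodeAt(0));        // 90
console.log('A'.charCodeAt(0));        // 65

// Strings compare code unit by code unit, left to right,
// so each result here is decided at the very first character:
console.log('Mystery!' <= 'Z'); // true:  77 <= 90
console.log('the' <= 'Z');      // false: 116 > 90
console.log('the' >= 'A');      // true:  116 >= 65
```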

parallaxis
Upper-case letters come before lower-case letters.

'M' = ASCII value 77, 'Z' = 90, and 77 < 90, so `'Mystery!' <= 'Z'` is true.

't' = ASCII value 116, 'Z' = 90, and 116 is not < 90, so `'the' <= 'Z'` is false.

See more here: www.asciitable.com
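A quick sketch to verify those values in the console (strictly speaking these are UTF-16 code units, which happen to match ASCII for these characters; see the comment below):

```js
for (const ch of ['M', 'Z', 't', 'A']) {
  console.log(ch, ch.charCodeAt(0)); // M 77, Z 90, t 116, A 65
}
```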

Zac
- JavaScript doesn't use ASCII for built-in text datatypes. I'm not aware of any language that does. – Tom Blodget Apr 22 '17 at 18:05
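Following up on the locale discussion in the comments under the question: if a human-friendly ordering is wanted instead of raw code-unit order, `localeCompare` is the built-in, locale-aware alternative (a sketch; exact results depend on the runtime's locale data):

```js
// Code-unit order: all upper-case letters sort before lower-case ones.
console.log('the' <= 'Z');             // false
// Locale-aware order: 't' sorts before 'z' as a human would expect.
console.log('the'.localeCompare('Z')); // negative number, i.e. 'the' sorts first
```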