Why does `console.log(10 < 14 < 50)` print `true` but `console.log(10 > 7 > 2)` print `false`?
- Because `10 < 14 < 50` reduces to `1 < 50`, while `10 > 7 > 2` reduces to `1 > 2` – Reyno Nov 30 '21 at 15:11
- [Operator Precedence](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Operators/Operator_Precedence#table) is a good resource to be familiar with – David784 Nov 30 '21 at 15:14
- Alternative duplicate: [Why does (0 < 5 < 3) return true?](https://stackoverflow.com/questions/4089284/why-does-0-5-3-return-true?noredirect=1&lq=1) – Ivar Nov 30 '21 at 15:19
2 Answers
It works this way:

First, JavaScript evaluates the left side, `10 < 14`, which returns `true`. When a boolean is compared with a number it is coerced to a number (`true` becomes `1`), so the expression becomes `1 < 50`, which is also `true`.

On the other hand, `10 > 7` is `true`, i.e. `1`, but then `1 > 2` is `false`.

If you try `console.log(10 < 14 == 1)` you will see that it also prints `true`, which confirms the coercion.
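The coercion described above can be checked step by step in the console; a minimal sketch:

```javascript
// The chained comparison is really two binary comparisons, left to right.
console.log(10 < 14);          // true
console.log(Number(true));     // 1 — booleans coerce to numbers in comparisons
console.log((10 < 14) < 50);   // true, because 1 < 50
console.log(10 < 14 < 50);     // same thing: true

console.log(10 > 7);           // true
console.log((10 > 7) > 2);     // false, because 1 > 2
console.log(10 > 7 > 2);       // same thing: false
```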

This is not how you check an interval in JS.

Your code is interpreted from left to right according to operator precedence. The comparison operators are binary operators: each takes one left-hand expression and one right-hand expression.
10 < 14 < 50
// becomes
true < 50
// becomes
1 < 50
// and
10 > 7 > 2
// becomes
true > 2
// becomes
1 > 2
You should do:
console.log(10 < 14 && 14 < 50)
// and
console.log(10 > 7 && 7 > 2)
