This is due to the slightly odd way JavaScript handles equality checks: you can check value with implicit type conversion using `==`, or check both type and value with `===`.
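A few checks in any JavaScript console make the difference concrete:

```js
// Loose equality (==) coerces the operands before comparing
console.log(1 == "1");    // true: the string "1" is converted to the number 1
console.log(0 == false);  // true: false is converted to the number 0

// Strict equality (===) checks type first, then value, with no coercion
console.log(1 === "1");   // false: number vs string
console.log(0 === false); // false: number vs boolean
```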
This also means that JavaScript has a list of 'falsish' values: `""`, `0`, `undefined` and `null`. The first two compare equal to `false` with `==`, while `undefined` and `null` only compare loosely equal to each other.
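A quick sketch of how these compare under `==`:

```js
console.log("" == false);        // true: both sides convert to the number 0
console.log(0 == false);         // true
console.log(null == false);      // false: null is never coerced to a number by ==
console.log(undefined == false); // false
console.log(null == undefined);  // true: a special case in the spec
```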
Now: "" == "0"
is comparing two strings that are obviously different.
The messy question is why does "0" == false
?
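Both results are easy to reproduce:

```js
console.log("" == "0");    // false: the comparison that looks obvious
console.log("0" == false); // true:  the comparison that needs explaining
```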
This is to do with the JavaScript spec for `==`:
> If the two operands are not of the same type, JavaScript converts the operands then applies strict comparison. If either operand is a number or a boolean, the operands are converted to numbers if possible; else if either operand is a string, the other operand is converted to a string if possible. If both operands are objects, then JavaScript compares internal references which are equal when operands refer to the same object in memory.
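Applying that rule to `"0" == false` step by step (a sketch of the coercion chain):

```js
// 1. false is a boolean, so it is converted to a number:  "0" == 0
// 2. "0" is now a string compared against a number,
//    so it is converted to a number as well:                0 == 0
// 3. Strict comparison of two equal numbers gives true.
console.log("0" == false);  // true
console.log(Number(false)); // 0
console.log(Number("0"));   // 0
```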
When you compare `"0"` to a boolean, it is first converted to a number, `0`, which is 'falsish'. When you compare `"0"` to a string, the strings are compared.
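That second path is why the string-to-string comparison fails; no numeric conversion happens when both operands are already strings:

```js
console.log("" == "0");  // false: two strings, compared as strings
console.log("0" == "0"); // true:  identical strings
console.log("" == 0);    // true:  mixed types, so "" converts to the number 0
```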
If you convert directly (without using `==`), then `"0"` evaluates to true:

```js
if ("0") console.log('Now "0" is true!');
```