"Because it's easier"
I'm pretty sure the behavior for double equals was designed to lower the bar for people learning the language. In nearly every case, there's no good reason to use double equals over triple except to save a bit of typing.
Triple is faster and safer. It doesn't do any "magic". Magic sucks. Magic is less predictable.
That said, how double equals works is well known, and isn't doing that much magic. It's simply casting to the nearest primitive type, and you might consider one use case for it to be comparing an input string with a number value.
e.g., quantityInput.value == 0
An obvious issue with this would be cases where quantityInput.value is empty or not a number. But this issue would arise with triple equals, too. So, ignoring those cases, the same thing using triple equals would be:
Number(quantityInput.value) === 0
or +(quantityInput.value) === 0
The intentions of the triple equals examples are clearer. We're taking this thing (quantityInput.value), turning it into a number, and comparing it to another number. Of course it's more verbose, but in the last example, we're only up by 4 characters.
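To make the equivalence concrete, here's a quick sketch (the string values stand in for an input element's .value, which is always a string):

```javascript
// Simulate an <input> element's .value, which is always a string.
var value = '0';

// Double equals coerces the string to a number before comparing.
console.log(value == 0);          // true

// Triple equals with an explicit conversion says the same thing, out loud.
console.log(Number(value) === 0); // true
console.log(+value === 0);        // true

// The edge cases behave identically either way: '' coerces to 0,
// and a non-numeric string becomes NaN, which equals nothing.
console.log('' == 0);             // true
console.log(Number('') === 0);    // true
console.log('abc' == 0);          // false (NaN == 0)
console.log(Number('abc') === 0); // false (NaN === 0)
```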
Magic is bad, mkay?
Double equals becomes less predictable when you start comparing things that might not be primitives.
Take this example:
var a = 'aString';
var b = {
  toString: function() {
    return 'aString';
  }
};
a == b; // true
You may have been expecting that, for the double equals, the compare would invoke b's toString to compare against a's string value, and you were right.
But what about this?:
var a = 'aString';
var b = {
  valueOf: function() {
    return 'aDifferentString';
  },
  toString: function() {
    return 'aString';
  }
};
a == b; // 'aString' == 'aDifferentString' // false
valueOf trampled toString. Double equals doesn't care what types you're comparing. If it sees a primitive being compared with an object, it'll try valueOf first, and toString second.
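There's one more wrinkle worth noting: valueOf only wins if it returns a primitive. If it returns an object, double equals falls through to toString anyway. A minimal sketch:

```javascript
var b = {
  valueOf: function() {
    // Not a primitive, so double equals discards this result...
    return {};
  },
  toString: function() {
    // ...and falls back to toString instead.
    return 'aString';
  }
};

console.log('aString' == b); // true, via toString
```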
What about when a is the number 0, and b.toString returns a '0' string?:
var a = 0;
var b = {
  toString: function() {
    return '0';
  }
};
a == b; // 0 == '0' // 0 === 0 // true
The double equals used b.toString, which provided a String, and then cast that String to a Number.
Now what if a and b are both objects that return the same thing in valueOf and toString?
var a = {
  valueOf: function() {
    return 'aValue';
  },
  toString: function() {
    return 'aString';
  }
};
var b = {
  valueOf: function() {
    return 'aValue';
  },
  toString: function() {
    return 'aString';
  }
};
a == b; // false
Were you expecting that? If not, this should serve as a good example as to why magic is bad. You never know how powerful the magic is going to be.
In this case, because neither a nor b was a primitive, it skipped the valueOf and toString calls, and simply compared pointers.
EDIT: I'll add one "useful" case where a former co-worker violated his own "always use triple equals" rule from the project code spec that he drafted.
He'd check for undefined using potentiallyUndefinedThing == null. His argument was that it should get special treatment because double equals equates null to undefined, so it kills two birds with one stone (undefined and null checks). In reality, in nearly every case where he used it, we all knew that the value would either never be null or never be undefined. Just because you see a tweet suggesting a use of == doesn't make the magic kosher. It's still just laziness.
It didn't bother me too much when I reviewed his code, but when he sent back my explicit === undefined checks to be replaced with == null checks, I cringed.
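For what it's worth, the == null trick does do exactly what he claimed and nothing more: null and undefined are double-equal only to each other, and no other falsy value sneaks in. A quick sketch:

```javascript
// null and undefined are == only to each other.
console.log(undefined == null); // true
console.log(null == undefined); // true

// No other falsy value matches.
console.log(0 == null);         // false
console.log('' == null);        // false
console.log(false == null);     // false
console.log(NaN == null);       // false

// So x == null covers both, while x === undefined misses null.
var x = null;
console.log(x == null);         // true
console.log(x === undefined);   // false
```

That's exactly why it only earns its keep when the value genuinely could be either; when you already know it can only be one of them, the explicit === check says what you mean.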