For example:
var one = ['H', 'i'];
var two = ['H', 'i'];
(one == two)                   // returns false
(one.join('') == two.join('')) // returns true
Why is that?
There is a difference in how equality is defined for strings and arrays: strings are considered equal if their contents are identical, but arrays are considered equal only if they are the very same array object; otherwise they compare as different, even if their contents match.
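A quick sketch of that distinction (the variable names are just for illustration):

var a = ['H', 'i'];
var b = a;          // b refers to the very same array object as a
var c = ['H', 'i']; // a different array object with identical contents
(a == b)            // true - same object
(a == c)            // false - different objects, despite equal contents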
There are several reasons why it works this way; here are two:
1) You usually don't want array comparison to walk through the entire array, because the array could be huge and the comparison would take a long time, so the default behavior shouldn't be that expensive.
2) You can alter an array's contents while it is still 'the same' array, whereas JavaScript strings are immutable, so any changed string is a new, distinct value (see the sketch after this list).
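Here is what that second point looks like in practice (again, the names are just for illustration):

var arr = ['H', 'i'];
var alias = arr;    // a second reference to the same array
arr.push('!');      // mutate the array in place
(arr == alias)      // still true - it is the same array, new contents and all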
When comparing objects, JavaScript checks whether they are the very same object (reference equality), not merely objects with the same contents.
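The same rule applies to plain objects, not just arrays:

({} == {})          // false - two distinct empty objects
var o = {};
(o == o)            // true - literally the same object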
I find Underscore's _.isEqual method useful here, but if you want to see how it is done library-free, just glance at Underscore's source, which is very easy to read.
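If all you need is array comparison without a library, a minimal recursive sketch could look like the following. This is not Underscore's actual implementation, and it ignores edge cases like NaN, cyclic references, and non-array objects inside the arrays:

function arraysEqual(a, b) {
  if (a === b) return true;                               // same array object
  if (!Array.isArray(a) || !Array.isArray(b)) return false;
  if (a.length !== b.length) return false;
  for (var i = 0; i < a.length; i++) {
    var x = a[i], y = b[i];
    if (Array.isArray(x) && Array.isArray(y)) {
      if (!arraysEqual(x, y)) return false;               // recurse into nested arrays
    } else if (x !== y) {
      return false;                                       // compare other values directly
    }
  }
  return true;
}

arraysEqual(['H', 'i'], ['H', 'i'])  // true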