To make it short: NEVER use for in on arrays; it's extremely slow and prone to failure.
Explanation
Although the Array in JavaScript is an Object, there's no good reason to use the for in loop to iterate over it. In fact, there are a number of very good reasons against using for in on an Array.
While it may seem like a good choice at first to trade some speed for the readability of the for in construct, this has major performance implications.
The for in loop does in fact iterate over the indexes of an Array, but it also traverses the prototype chain. So one already has to use hasOwnProperty to filter out unwanted inherited properties, and even then, any additional properties defined directly on the array itself will still make it through that filter.
Array.prototype.bar = 1; // poisoning the Array.prototype, NEVER do this

var foo = [1, 2, 3];
for(var i in foo) {
    console.log(i);
}
The above code results in the "indexes" 0, 1, 2 and bar being printed out.
Using hasOwnProperty for filtering
Array.prototype.bar = 1; // poisoning the Array.prototype, NEVER do this

var foo = [1, 2, 3];
foo.blub = 2;

for(var i in foo) {
    if (foo.hasOwnProperty(i)) {
        console.log(i);
    }
}
The above code now prints the "indexes" 0, 1, 2 and blub. You cannot filter out blub in any meaningful way unless you also validate that the key is a valid array index, i.e. a non-negative integer, as sketched below.
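One possible workaround is to add an index check on top of hasOwnProperty. The sketch below assumes that only keys made up entirely of digits count as array indexes; the regular expression is an illustration, not part of the original example.

Array.prototype.bar = 1; // poisoning the Array.prototype, NEVER do this

var foo = [1, 2, 3];
foo.blub = 2;

for(var i in foo) {
    // only accept own properties whose key is a string of digits,
    // i.e. something that looks like an array index
    if (foo.hasOwnProperty(i) && /^\d+$/.test(i)) {
        console.log(i); // prints 0, 1 and 2
    }
}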
Performance
Now, combining the already slow, prototype-traversing nature of for in with the use of hasOwnProperty results in a performance degradation of up to a factor of 20.
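A rough way to check this yourself is a micro-benchmark along the lines of the sketch below; the array size and the console.time labels are arbitrary choices, and the absolute numbers will vary from engine to engine.

// build a large array to make the difference visible
var arr = [];
for(var n = 0; n < 1000000; n++) {
    arr[n] = n;
}

console.time('for in with hasOwnProperty');
var sum1 = 0;
for(var key in arr) {
    if (arr.hasOwnProperty(key)) {
        sum1 += arr[key];
    }
}
console.timeEnd('for in with hasOwnProperty');

console.time('classic for loop');
var sum2 = 0;
for(var i = 0, l = arr.length; i < l; i++) {
    sum2 += arr[i];
}
console.timeEnd('classic for loop');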
So if you want to iterate over an Array in JavaScript, always use the classic for loop construct.
var list = [1, 2, 3, 4, 5, ...... 100000000];
for(var i = 0, l = list.length; i < l; i++) {
    console.log(list[i]);
}
As you can see, there's one extra catch in the above example: the caching of the length via l = list.length.
While the length property is defined on the array itself, there is still an overhead for doing the lookup on each iteration. And while recent JavaScript engines may apply optimizations in this case, one can never be sure whether those optimizations are actually in place, nor whether they reach the speed of the above caching. In fact, leaving out the caching may result in a performance degradation of up to a factor of 2 (and even more in older engines).