Edit: After spending several hours on this and working with @pst, it turns out the issue was completely different.
Inside the code you can see I used the timestamp shortcut +new Date(). It returns a numeric timestamp, just like the standard new Date().getTime().
However, +new Date() performs very, very badly when the result is used in mathematical operations (+, -, /). Although typeof reports 'number' for the start variable, something about that value makes the arithmetic bloody slow. With the standard getTime() method there is no performance penalty on the timing subtractions.
Take a look at this jsPerf detailing the problem: http://jsperf.com/new-date-timing.
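For reference, here is a minimal sketch of the two timing idioms being compared (variable names are just illustrative):

// Shortcut: unary plus coerces the Date object to its millisecond timestamp
var startShortcut = +new Date();
// ... work being timed ...
var elapsedShortcut = +new Date() - startShortcut;

// Standard: getTime() returns the same millisecond value explicitly
var startStandard = new Date().getTime();
// ... work being timed ...
var elapsedStandard = new Date().getTime() - startStandard;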
Regarding @pst's very detailed answer and the effort I made to replicate the linked question, treat his answer as the canonical answer to that question.
I am going to change the title of this question to accurately reflect @pst's answer and my original intent, but will leave the original title and question for future reference.
New Question
Does iterating over JavaScript arrays benefit from branch prediction when looping over sorted versus unsorted random data?
See @pst's answer below.
Original Title and Question below
Title: Array iterations taking 2x as long on the same data
I was looking at this question earlier, Why is it faster to process a sorted array than an unsorted array?, and wanted to try setting up the same test in JavaScript.
This led me to something unexpected. In the tests linked in the fiddle below, simply iterating over the same array of randomly generated numbers with the same code results in vastly different response times.
I've tested this in Chrome, Safari, Firefox, and Node.js.
In Chrome and Node, the first iteration is faster than the second; in Safari and Firefox, the first iteration is slower than the second.
Here's the fiddle: http://jsfiddle.net/9QbWB/6/
In the linked fiddle, I've disabled sorting (thinking that was originally the issue, but it wasn't). Sorting the data made the looping even longer.
I've gone through the code pretty thoroughly to make sure I've removed anything that could be affecting the results. I feel like the scientists who announced FTL neutrinos: I can't find a problem in my experiment, yet the data is unexpected.
In the code below, a few things (the initial setTimeout, the jQuery DOM output, etc.) are only there to display the results visually on jsFiddle. The core test function is the same.
//javascript test of https://stackoverflow.com/questions/11227809/why-is-processing-a-sorted-array-faster-than-an-unsorted-array
setTimeout(function(){
    var unsorted_list = $('#unsorted'),
        sorted_list = $('#sorted');

    // Build an array of 32768 random integers in the range [0, 255]
    var length = 32768,
        data = [];
    for(var i=0; i<length; i++){
        data[i] = Math.floor( Math.random()*256 );
    }

    // Time 1000 passes over the array, branching on data[c] >= 128
    var test = function(){
        var sum = 0,
            start = +new Date();
        for(var i=0; i<1000; i++){
            for(var c=0; c<length; c++){
                if( data[c] >= 128 ){
                    //sum += data[c];
                }
            }
        }
        return +new Date() - start;
    };

    //Unsorted
    var results = 0;
    for( var i=0; i<10; i++){
        var x = test();
        console.log(x);
        unsorted_list.append('<div>'+x+'</div>');
        results += x;
    }
    unsorted_list.append('<div>Final:'+(results/10)+'</div>');
    console.log( 'Unsorted: ', results/10 );

    //Sort array
    //data.sort();

    //Sorted
    var results = 0;
    for( var i=0; i<10; i++){
        var x = test();
        console.log(x);
        sorted_list.append('<div>'+x+'</div>');
        results += x;
    }
    sorted_list.append('<div>Final:'+(results/10)+'</div>');
    console.log( 'Sorted: ', results/10 );
}, 5000);
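For comparison, here is a minimal sketch of the same test function timed with the standard getTime() calls, as described in the Edit above. The loop body is unchanged; data and length are assumed to be in scope as in the fiddle, and the testGetTime name is just illustrative.

var testGetTime = function(){
    var sum = 0,
        start = new Date().getTime();     // standard timestamp instead of +new Date()
    for(var i=0; i<1000; i++){
        for(var c=0; c<length; c++){
            if( data[c] >= 128 ){
                //sum += data[c];
            }
        }
    }
    return new Date().getTime() - start;  // no penalty on this subtraction
};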