I'm trying to implement a (seemingly simple) absolute-deviation sorting algorithm in JavaScript. The absolute deviation of an element is the absolute value of the difference between that element and the average of all elements. For example, given the elements 1, 4, 5 and 9, the average is (1 + 4 + 5 + 9) / 4 = 4.75, so the absolute deviation of each element works out as follows:
- absDev(1) = |1 - 4.75| = 3.75
- absDev(4) = |4 - 4.75| = 0.75
- absDev(5) = |5 - 4.75| = 0.25
- absDev(9) = |9 - 4.75| = 4.25
Sorting the elements by ascending absolute deviation should therefore give the sequence 5, 4, 1, 9. So far so good; however, my current JavaScript implementation gives different results in different browsers.
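As a quick sanity check (this is not my actual code, just a throwaway version that precomputes the average once, up front, to verify the expected order):

var elements = [1, 4, 5, 9];
var avg = elements.reduce(function (a, b) { return a + b; }, 0) / elements.length; // 4.75
var expected = elements.slice().sort(function (x, y) {
    return Math.abs(x - avg) - Math.abs(y - avg);
});
console.log(expected); // [5, 4, 1, 9]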
Here it is: http://jsfiddle.net/WVvuu/
- In Firefox and Safari, I'm getting the expected result 5, 4, 1, 9
- In Chrome and Opera, I'm getting 4, 5, 1, 9
- In IE 10, I'm getting 1, 4, 5, 9 (i.e., the array comes back in its original order)
I suspect there's some very simple mistake in my code, but I can't seem to find it. I'd like to understand what's wrong with it and why I get a different result when I change browsers. I'd appreciate it if someone could kindly explain what I'm missing. Again, this is the code:
var array = [1, 4, 5, 9];

// Absolute deviation of x from the average of the global array
function absDev(x) {
    return Math.abs(x - average(array));
}

// Arithmetic mean of the given array
function average(array) {
    var sum = array.reduce(function (previousValue, currentValue) {
        return previousValue + currentValue;
    }, 0);
    return sum / array.length;
}

// Sort in place by ascending absolute deviation
array.sort(function (x, y) {
    return absDev(x) - absDev(y);
});

alert("Sorted array: " + array);