I came across a question about apply when using it with, e.g., Math.max. For example, let's say I have an array:
var arr = [1, 2, 3, 4, 5];
var biggest = Math.max.apply(Math, arr);
console.log(biggest); // outputs 5, which is correct
But whatever value I pass as the first argument, I always get the same result:
var biggest = Math.max.apply(this, arr);
var biggest = Math.max.apply(null, arr);
var biggest = Math.max.apply("", arr);
var biggest = Math.max.apply(window, arr);
...
console.log(biggest); // all of the above output 5. Why?
The only assumption I can make is that when Math.max is called through apply, the function context (this) simply doesn't matter in this situation?
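To sanity-check that assumption, I put together a small sketch (describeMax is just a name I made up for illustration) contrasting Math.max, which seems to ignore this entirely, with a function that does read this:

var arr = [1, 2, 3, 4, 5];

// Math.max appears not to read `this`, so any first argument to apply works:
console.log(Math.max.apply(null, arr)); // 5

// By contrast, a function that uses `this` gives different results per context:
function describeMax() {
    // `this.label` comes from whatever object apply binds as `this`;
    // `arguments` is array-like, so it can be passed back into Math.max via apply
    return this.label + ": " + Math.max.apply(null, arguments);
}
console.log(describeMax.apply({ label: "scores" }, arr)); // "scores: 5"
console.log(describeMax.apply({ label: "ages" }, arr));   // "ages: 5"

Is that the right way to think about it?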