There is more than one Stack Overflow question about how to find the min or max of an array of values in JavaScript. This is not that question.

I want to know why passing strange things to .apply() as the this argument still works. Despite a good blog post from Aaron Crane on how the Math object's API became what it is, there's still something left unanswered.

Each of the following code snippets works. My question is: how? What exactly is happening with the assignment to this that makes each of these work?
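My working guess (which may be wrong, hence the question) is that the first argument to .apply() only matters if the called function actually reads this. Here is a minimal sketch of what I mean, using a made-up ignoresThis function rather than Math.min itself:

// Hypothetical stand-in for Math.min that never references this,
// so whatever .apply() binds as this has nothing to affect.
function ignoresThis() {
  var lowest = Infinity;
  for (var i = 0; i < arguments.length; i++) {
    if (arguments[i] < lowest) {
      lowest = arguments[i];
    }
  }
  return lowest;
}

var values = [45, 46, 47];
alert(ignoresThis.apply(Math, values));      // 45
alert(ignoresThis.apply(null, values));      // 45
alert(ignoresThis.apply(undefined, values)); // 45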
The Standard Construction
var values = [45, 46, 47];
var min = Math.min.apply(Math, values);
alert(min); // 45
A Weirder Construction, But Scope can be Tricky...
var values = [45, 46, 47];
var min = Math.min.apply(this, values);
alert(min); // 45
Slightly Weirder
var values = [45, 46, 47];
var min = Math.min.apply(global, values);
alert(min); // 45
Weirder Still, But Maybe Okay b/c Browsers
var values = [45, 46, 47];
var min = Math.min.apply(window, values);
alert(min); // 45
Very Weird
var values = [45, 46, 47];
var min = Math.min.apply(null, values);
alert(min); // 45
Truly Weird
var values = [45, 46, 47];
var min = Math.min.apply(undefined, values);
alert(min); // 45
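For contrast, here is the kind of case I expected, where the first argument to .apply() clearly does matter (usesThis is my own made-up example, nothing from the Math API):

// A function that reads a property off this, so the binding matters.
function usesThis(offset) {
  return this.base + offset;
}

var context = { base: 40 };
alert(usesThis.apply(context, [5])); // 45
alert(usesThis.apply(null, [5]));    // NaN in non-strict mode: this falls back to
                                     // the global object, which has no base property

So what is it about Math.min that makes the this binding irrelevant in every snippet above?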