
I was reading the following abbreviated code excerpt from this post that gets the 'minimum' date in a list of dates:

var dates = [];
dates.push(new Date("2011/06/25"));
dates.push(new Date("2011/06/26"));
dates.push(new Date("2011/06/27"));
var minDate = new Date(Math.min.apply(null, dates));

Is someone able to explain why we need to use .apply here?

I understand that .apply is used to execute a function with a supplied this value, but I don't understand why the code needs to call min with this = null, or why the code stops working when you replace Math.min.apply(null, dates) with Math.min(dates).
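For reference, here is a small sketch of the difference I'm seeing (the dates array is the same as above):

```javascript
var dates = [new Date("2011/06/25"), new Date("2011/06/26"), new Date("2011/06/27")];

// Passing the array directly gives Math.min a single argument.
// It coerces that argument to a number, and an array of Dates
// converts to NaN.
var direct = Math.min(dates); // NaN

// apply spreads the array into separate arguments; each Date is
// coerced to its millisecond timestamp, so min works as expected.
var viaApply = Math.min.apply(null, dates);

console.log(direct);                          // NaN
console.log(viaApply === dates[0].getTime()); // true
```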

Grant

2 Answers


It's because the Function.prototype.apply() method takes two parameters: the first is the this value to use for the call, and the second is an array of arguments to pass to the called function.

Your example does not do anything special with the first parameter but it does with the second.

Math.min() accepts any number of arguments. Your example passes the array of dates via apply, which spreads the items of the array into individual arguments. Nowadays you could do the same thing with spread syntax:

var minDate = new Date(Math.min(...dates));
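To illustrate that the two forms are equivalent (using the same dates array built in the question):

```javascript
var dates = [new Date("2011/06/25"), new Date("2011/06/26"), new Date("2011/06/27")];

// Both calls receive three separate Date arguments, each coerced
// to its millisecond timestamp before being compared.
var viaApply  = Math.min.apply(null, dates);
var viaSpread = Math.min(...dates);

console.log(viaApply === viaSpread); // true

// Math.min returns a number, so wrap it back into a Date if needed.
var minDate = new Date(viaSpread);
console.log(minDate.getTime() === dates[0].getTime()); // true
```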
Emiel Zuurbier

Math.min requires its values to be passed as individual arguments. Since the dates have been pushed into an array, passing the array gives Math.min a single argument, which it treats as one value (coerced to NaN) rather than as the dates themselves.

Using apply will pass the elements of the array as arguments, it's effectively the same as using spread syntax:

Math.min(...dates);

Since the first argument passed to apply is the value to use for this, and Math.min doesn't care what its this is, null is passed (you could pass any value).
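A quick sketch showing that the this value really is irrelevant here:

```javascript
var dates = [new Date("2011/06/25"), new Date("2011/06/26")];

// Math.min never reads its this value, so any first argument to
// apply produces the same result.
var a = Math.min.apply(null, dates);
var b = Math.min.apply(undefined, dates);
var c = Math.min.apply(Math, dates);

console.log(a === b && b === c); // true
```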

This technique was common before spread syntax was introduced in ECMAScript 2015 (6th edition).

RobG