I have some JavaScript that pulls dates out of two fields on my web page:
var StartDate = document.getElementById('StartDate');
var EndDate = document.getElementById('EndDate');
Once I have these two values, the following snippet of code performs the date subtraction:
var day = 1000 * 60 * 60 * 24;   // milliseconds in one day
var d1 = new Date(StartDate.value);
var d2 = new Date(EndDate.value);
var difference = Math.ceil((d2.getTime() - d1.getTime()) / day);
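To collect the raw numbers used in the examples below, I essentially just added some logging right after that snippet (these lines assume the d1, d2, and day variables from above):
console.log(d1.getTime());                               // milliseconds since the epoch for the start date
console.log(d2.getTime());                               // milliseconds since the epoch for the end date
console.log((d2.getTime() - d1.getTime()) / day);        // elapsed days, before Math.ceil rounds up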
Here is where the problem comes in. Say my two dates are:
StartDate = 2013-05-01
EndDate = 2013-06-30
Using a calculator we get:
1372564800000 - 1367380800000 = 5184000000
5184000000 / 86400000 = 60 days
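For what it's worth, those two millisecond values are exactly what getTime() reports on my machine when I hard-code the strings (the exact values will presumably vary with the timezone of the machine running this):
var d1 = new Date('2013-05-01');
var d2 = new Date('2013-06-30');
console.log(d1.getTime());                               // 1367380800000 for me
console.log(d2.getTime());                               // 1372564800000 for me
console.log((d2.getTime() - d1.getTime()) / 86400000);   // 60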
However, next let's use the following dates:
StartDate = 2013-10-01
EndDate = 2013-11-30
Again, using a calculator we get:
1385787600000 - 1380600000000 = 5187600000
5187600000 / 86400000 = 60.04166666666667 days
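The same check for the second range is where the fraction shows up (and Math.ceil in my snippet then rounds the result up to 61 rather than 60):
var d1 = new Date('2013-10-01');
var d2 = new Date('2013-11-30');
console.log(d1.getTime());                               // 1380600000000 for me
console.log(d2.getTime());                               // 1385787600000 for me
console.log((d2.getTime() - d1.getTime()) / 86400000);   // 60.04166666666667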
I'm just not sure how this is possible, since the two date ranges are structured identically: both start on the first day of a 31-day month, and both end on the last day of the following 30-day month. When I put these date ranges into an MS Excel workbook I get the correct number of days:
=(EndCell-StartCell)
And I again get 60 days for both date ranges.
This seems to happen only when I cross into November of 2013. It doesn't happen when I cross into November of 2014, and I cannot find any other time when it happens. I know 2013 is gone, but my application will deal heavily with 2013 dates. Does anybody know why or how this is happening? And does anybody know of a better way to do date subtraction in JavaScript that will not cause this issue?