So I'm playing around with JavaScript's Date object, and I ran into something I think is a little strange.
I'm trying to figure out how many days there are between two given dates, and for that I use the formula below:
var oneDay = 24*60*60*1000;
var diffDays = Math.round(Math.abs((firstDate.getTime() - secondDate.getTime())/(oneDay)));
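In other words, oneDay is 86,400,000 ms, so I take the millisecond difference between the two timestamps, divide, and round. A stripped-down version of what I'm doing (the dates here are just ones I picked to sanity-check the formula):

var oneDay = 24*60*60*1000; // 86,400,000 ms in a day
var firstDate = new Date("2017-01-01");
var secondDate = new Date("2017-01-03");
var diffDays = Math.round(Math.abs((firstDate.getTime() - secondDate.getTime())/oneDay));
console.log(diffDays); // 2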
If you take 2017-05-28 & 2017-05-30, it returns 2 days - as it should:
var oneDay = 24*60*60*1000;
var firstDate = new Date(2017, 05, 28);
var secondDate = new Date(2017, 05, 30);
var diffDays = Math.round(Math.abs((firstDate.getTime() - secondDate.getTime())/(oneDay)));
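console.log(diffDays); // logs 2 for me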
If you take 2017-05-30 & 2017-06-01, it returns 1 day - it's supposed to be 2 days:
var oneDay = 24*60*60*1000;
var firstDate = new Date(2017, 05, 30);
var secondDate = new Date(2017, 06, 01);
var diffDays = Math.round(Math.abs((firstDate.getTime() - secondDate.getTime())/(oneDay)));
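console.log(diffDays); // logs 1 for me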
If you take 2017-11-29 & 2017-12-01, it returns 3 days - it's supposed to be 2 days:
var oneDay = 24*60*60*1000;
var firstDate = new Date(2017, 11, 29);
var secondDate = new Date(2017, 12, 01);
var diffDays = Math.round(Math.abs((firstDate.getTime() - secondDate.getTime())/(oneDay)));
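console.log(diffDays); // logs 3 for me

Why are the last two results off by a day from what I expect?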