
I want to show all the dates between two dates, but I can't get it to work. Please help. This is my code:

var Day = 24 * 60 * 60 * 1000; // hours * minutes * seconds * milliseconds
var firstDate = new Date("2017-04-10");
var secondDate = new Date("2017-04-15");

var diffDays = (secondDate.getTime() - firstDate.getTime()) / Day;
alert(diffDays);

for (var i = 0; i <= diffDays; i++) {
    var d = new Date(firstDate + i);
    alert(d);
}
T K
  • Please look at your browser's console: it will display details of the syntax error and reference error in your code. Fix those and the code will run, though you shouldn't have a variable called `Date` given there is already a `Date` function. Also, in the code shown you don't actually do anything with the resulting dates - where should they be displayed? – nnnnnn Apr 10 '17 at 03:20
  • I edited my code, but it shows only the first date and doesn't increment it. – T K Apr 10 '17 at 03:29
  • When in doubt about what's happening, take baby steps. If you assign `firstDate + i` to a variable and inspect it before using the result to create a new Date, the solution to your problem should be trivial. – migueldiab Apr 10 '17 at 04:07

2 Answers


Your `i` variable is the number of the day to display. Your `firstDate` variable is a `Date`.

This line:

var d = new Date(firstDate + i);

adds these together and tries to create a new Date from the result. Because they are different types (a Date and a Number), type coercion comes into play (i.e. both are converted to strings and concatenated together).

Try this:

var DAY_IN_MS = 24 * 60 * 60 * 1000; // hours*minutes*seconds*milliseconds
var theDate = new Date("2017-04-10");
var i = 5;

// Date + Number = Type Coercion!  Both objects will be converted to strings and concatenated together.
alert(theDate + i);

// The Date constructor doesn't care, and will work because it can get a date from the first part of the string.
alert(new Date(theDate + i));

// If you use `getTime()` you can get the numerical value of the Date, which you can use for arithmetic.
alert(theDate.getTime());

// By adding `i`, however, you are only adding a few milliseconds to the first date!!
alert(new Date(theDate.getTime() + i));

// Number + Number of days times milliseconds per day = success!
alert(new Date(theDate.getTime() + (i * DAY_IN_MS)));

I think you mean:

var d = new Date(firstDate.getTime() + (i * Day));
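
Applied to the loop from the question, that fix looks like this (a sketch reusing the question's own variables and dates; the expected output is 2017-04-10 through 2017-04-15):

var Day = 24 * 60 * 60 * 1000; // milliseconds in one day
var firstDate = new Date("2017-04-10");
var secondDate = new Date("2017-04-15");
var diffDays = (secondDate.getTime() - firstDate.getTime()) / Day;

for (var i = 0; i <= diffDays; i++) {
    // getTime() returns a number of milliseconds, so this addition is numeric, not string concatenation
    var d = new Date(firstDate.getTime() + (i * Day));
    console.log(d);
}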
Alex McMillan
  • I attached the above code but it's not working. It shows only the first date. – T K Apr 10 '17 at 03:38
  • Sorry I missed something. Try this. – Alex McMillan Apr 10 '17 at 03:43
  • Good explanation, however adding one to a date should be done [*thusly*](http://stackoverflow.com/questions/9989382/add-1-to-current-date) since not all days are 24hrs long where daylight saving is observed. ;-) – RobG Apr 10 '17 at 04:09
  • @RobG thanks - this is intended as a simple solution to OPs broken code, not a lecture on date-based arithmetic ;) Heaven forbid anybody mentions [timezones](https://www.youtube.com/watch?v=-5wpm-gesOY)... – Alex McMillan Apr 10 '17 at 23:00
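
For reference, a minimal sketch of the calendar-based increment RobG links to (the `addDays` helper name is ours, not from the linked answer); `setDate()` rolls the calendar day forward, so it stays correct across daylight-saving transitions where a day is not exactly 24 hours long:

function addDays(date, days) {
    var copy = new Date(date.getTime()); // don't mutate the caller's Date
    copy.setDate(copy.getDate() + days); // setDate handles month/year rollover and DST
    return copy;
}

var start = new Date("2017-04-10");
for (var i = 0; i <= 5; i++) {
    console.log(addDays(start, i)); // 2017-04-10 through 2017-04-15
}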

Short answer: add the right number of milliseconds to `firstDate.getTime()`:

for (var i = 0; i <= diffDays; i++) {
    console.log(firstDate);
    firstDate = new Date(firstDate.getTime() + Day);
}

Long answer: there are several things wrong in your code:

  1. In `var d = new Date(firstDate + i);` you are appending the value of `i` (the loop counter) to the string representation of a Date object.

  2. You then parse this string into a new Date object, but JavaScript only recognizes the date part and ignores the appended number, so every iteration produces the same date.

You should instead get the milliseconds of `firstDate` with `getTime()`, as you already did for `diffDays`, and add `i * Day` (also consider renaming `Day` to something clearer, e.g. a constant `DAY_IN_MS`), as shown in the sketch below.
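
A sketch combining the short and long answers, with the suggested rename (`DAY_IN_MS` is just the constant name proposed above) and without mutating `firstDate`:

var DAY_IN_MS = 24 * 60 * 60 * 1000;
var firstDate = new Date("2017-04-10");
var secondDate = new Date("2017-04-15");
var diffDays = (secondDate.getTime() - firstDate.getTime()) / DAY_IN_MS;

for (var i = 0; i <= diffDays; i++) {
    // work in milliseconds, then build a fresh Date for each day
    console.log(new Date(firstDate.getTime() + i * DAY_IN_MS));
}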

migueldiab