When I create the following date variable:
var d = new Date('2018-01-01')
and run
d.getDate()
I get 31. When I run
d.getMonth()
I get 11.
Why do I get 31 and 11 rather than 1 and 1?
If your local timezone is behind UTC (a negative offset, e.g. most of the Americas), this is a timezone issue.
JavaScript dates are stored as milliseconds since the Unix epoch (midnight, January 1, 1970 UTC). Date-only ISO 8601 strings such as '2018-01-01' are parsed as midnight UTC, but getDate() and getMonth() return values in your local timezone, so the offset is applied when you read the date back.
Example: 2018-01-01T00:00:00Z minus a five-hour offset (UTC-5) is 2017-12-31T19:00:00 local time, which is why getDate() reports the 31st.
You can reproduce this by switching your system timezone to one with a negative UTC offset, creating a date with the new Date('string')
method, then reading the date back.
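You can see both halves of the behavior with the UTC getters, which ignore the local offset. A minimal sketch:

```javascript
// A date-only ISO string is parsed as midnight UTC...
var d = new Date('2018-01-01');

// ...so the UTC getters always see January 1st:
var utcDate = d.getUTCDate();   // 1
var utcMonth = d.getUTCMonth(); // 0 (January)

// The local getters apply your timezone offset; in UTC-5,
// for example, this logs 31 and 11 (December 31st).
console.log(d.getDate(), d.getMonth());
```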
You can avoid this by using the new Date(year, monthIndex[, day[, hours[, minutes[, seconds[, milliseconds]]]]])
constructor instead, which interprets its arguments in local time.
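Because this constructor works in local time, the local getters round-trip cleanly in every timezone:

```javascript
// Arguments are interpreted in *local* time.
// Note the month index: 0 means January.
var d = new Date(2018, 0, 1);

console.log(d.getFullYear()); // 2018
console.log(d.getMonth());    // 0 (January)
console.log(d.getDate());     // 1
```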
One more thing to be aware of: months are zero-indexed, running from 0 (January) up to 11 (December). So the 11 you saw means December, and even without the timezone shift, getMonth() for a January date returns 0, not 1.
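When you need a human-readable month, map the index through an array (or add 1), as in this sketch:

```javascript
var monthNames = ['January', 'February', 'March', 'April',
                  'May', 'June', 'July', 'August',
                  'September', 'October', 'November', 'December'];

var jan = new Date(2018, 0, 15); // a January date, local time

var index = jan.getMonth();      // 0
var human = index + 1;           // 1
var name = monthNames[index];    // "January"
console.log(index, human, name);
```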