In Chrome, when the string doesn't contain year, why does JavaScript default to 2001?
t = new Date('Monday, Jan 11')
Thu Jan 11 2001 00:00:00 GMT+0000 (Greenwich Mean Time)
Edit: fwiw, this behavior is also seen in React Native
I found this a fun question, so I went digging into the source code of the V8 engine, the JavaScript engine used by Chromium and therefore Chrome (the browser this phenomenon shows up in), to reverse engineer this strange behavior.
I found this snippet in dateparser.h:
if (!is_iso_date_) {
  if (Between(year, 0, 49))
    year += 2000;
  else if (Between(year, 50, 99))
    year += 1900;
}
Assuming that a date without a year is not ISO compliant, 2000 or 1900 is added to the OP's undefined year. This suggests there is a cutoff at 50, which turns out to be true:
// Tested in Chrome:
console.log(new Date('1 1 49').getFullYear()); // 2049
console.log(new Date('1 1 50').getFullYear()); // 1950
Higher up in the file, year is defined like this:
int year = 0; // Default year is 0 (=> 2000) for KJS compatibility.
KJS was a very early JavaScript engine that was adopted by WebKit, which in turn was forked into Chromium, so this default value seems to have been chosen for compatibility with ancient ancestor engines. Following this logic, though, the year should default to 2000, not 2001. It turns out that might have been the intention.
In the same document, this snippet occurs:
// Day and month defaults to 1.
while (index_ < kSize) {
  comp_[index_++] = 1;
}
comp_ holds the default values for [day, month, year]. Judging by the comment, kSize should be 3, so only day and month would be set to 1 if not specified otherwise.
However, kSize is 4, so the loop iterates over the year part as well. year is then (if not specified) set to 1, overwriting the int year = 0 statement from before. The aforementioned conditional kicks in, year += 2000 happens, and you end up with the odd default value of 2001.
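Putting it together, you can reproduce the default in Chrome's console (this matches the output in the question; other engines may behave differently):
// Tested in Chrome: a string without a year ends up in 2001
console.log(new Date('Jan 11').getFullYear());          // 2001
console.log(new Date('Monday, Jan 11').getFullYear());  // 2001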
I suspect this is a bug, but I can't be sure, as the intended behavior of parsing a nonexistent date is hard to define. You're probably better off specifying your own default year anyway. And there ends this little speculative journey through the faraway source code of the V8 engine.
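If you want a predictable default, one option is a small helper along these lines (a minimal sketch: parseWithDefaultYear is a made-up name, the "four-digit number means a year is present" check is a naive assumption about your input strings, and parsing of non-ISO strings remains implementation-specific):
// Hypothetical helper: append an explicit year when the string doesn't contain one,
// instead of relying on the engine's implementation-specific default.
function parseWithDefaultYear(str, defaultYear = new Date().getFullYear()) {
  const hasYear = /\b\d{4}\b/.test(str); // naive: no 4-digit number => no year
  return new Date(hasYear ? str : str + ' ' + defaultYear);
}

console.log(parseWithDefaultYear('Monday, Jan 11').getFullYear()); // the current year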
The Date constructor, if passed a string, uses Date.parse(). Excerpt from the documentation:
The ECMAScript specification states: If the String does not conform to the standard format the function may fall back to any implementation-specific heuristics or implementation-specific parsing algorithm. Unrecognizable strings or dates containing illegal element values in ISO formatted strings shall cause Date.parse() to return NaN.
However, invalid values in date strings not recognized as simplified ISO format as defined by ECMA-262 may or may not result in NaN, depending on the browser and values provided, e.g.:
// Non-ISO string with invalid date values
new Date('23/25/2014');
will be treated as a local date of 25 November, 2015 in Firefox 30 and an invalid date in Safari 7.
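One way to check what a given engine did, regardless of browser, is to test for the NaN time value that an invalid Date carries (a small sketch; the parsed value in the comment is just one engine's behavior, per the excerpt above):
// An invalid Date has a NaN time value, which is also what Date.parse() returns.
const d = new Date('23/25/2014');
if (Number.isNaN(d.getTime())) {
  console.log('Invalid Date in this engine');
} else {
  console.log('Parsed as: ' + d.toString()); // e.g. Nov 25 2015 in some engines
}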
So the answer is a combination of two things: the browser you executed it in decided that, and since this case isn't covered by the standard, a browser may return just about anything.
This is done by your browser; you are probably using Chrome. With other browsers you can also get Invalid Date.