16

In C#, `DateTime` has a minimum value, `DateTime.MinValue`, which is 1/1/0001 12:00:00 AM. This is handy in validation scenarios.

I tried producing this value in JavaScript but it's not working:

var minDate = new Date("1/1/0001 12:00:00");

If I call `toISOString()` on it, I get "2001-01-01T18:00:00.000Z".

My API backend in C# is expecting a date value and I don't want to arbitrarily create my own version of DateTime.MinValue in JavaScript. Is there a way for me to produce this value in JavaScript?

Sam
  • 26,817
  • 58
  • 206
  • 383
  • 1
    `new Date(0)` returns 1/1/1970 12:00:00 in UTC time, which is a pretty acceptable standard for a "minimum time". – Patrick Roberts Mar 26 '18 at 05:06
  • 1
    Yes but that's what I meant by my arbitrary value. C# will treat it like Jan 1st, 1970, not `DateTime.MinValue`. I then need to add logic in my backend API to treat this value as `DateTime.MinValue`. I guess, there's no way to get the C# value in JavaScript? – Sam Mar 26 '18 at 05:08
  • JavaScript doesn't have a `default()` operator, so there's not always going to be a one-to-one equivalency between languages. I'm not sure of the details for your validation script, but just asserting that the input value is a valid date within a valid range should be sufficient – Patrick Roberts Mar 26 '18 at 05:11
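For reference, the epoch fallback Patrick mentions looks like this. It is only a sketch of that suggestion, not an equivalent of `DateTime.MinValue` — as Sam notes, C# will see it as January 1st, 1970:

```javascript
// The Unix epoch: a conventional "minimum" baseline in JavaScript,
// but NOT equivalent to C#'s DateTime.MinValue.
var epoch = new Date(0);
console.log(epoch.toISOString()); // "1970-01-01T00:00:00.000Z"
```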

4 Answers

15

When a string passed to the `Date` constructor is not in the extended ISO 8601 format, parsing is entirely implementation-dependent. If it is in ISO format, parsing is well-defined and it will work exactly the way you want:

var myDate = new Date('0001-01-01T00:00:00Z');
console.log( myDate.getUTCFullYear() );
Paul
  • 139,544
  • 27
  • 275
  • 264
  • I tried that but when I run `console.log(myDate.getYear());` with that value, I get `-1899`. I need to check and see if my backend accepts it as `0001-01-01` – Sam Mar 26 '18 at 05:11
  • 1
    @Sam That's an unrelated issue. `getYear` is [nonstandard and doesn't do what you expect](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Date/getYear). You should use `myDate.getFullYear()` instead. – Paul Mar 26 '18 at 05:14
  • Also `new Date(-621355968e5)` or just `new Date('0001-01-01')` in most browsers – Slai Mar 26 '18 at 05:17
  • 1
    @Sam You may also want `getUTCFullYear()` if you want to make sure you get `1` and not `0` in timezones that are behind UTC. – Paul Mar 26 '18 at 05:20
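Putting this answer and its comments together, a minimal round-trip sketch — the ISO string is what you would send to the C# backend, which should accept it as `0001-01-01` (whether it maps exactly to `DateTime.MinValue` depends on how the backend parses it):

```javascript
// Build the C# DateTime.MinValue moment from a well-defined ISO 8601 string.
var minDate = new Date('0001-01-01T00:00:00Z');

// Use the UTC accessors so timezones behind UTC don't shift the year to 0.
console.log(minDate.getUTCFullYear()); // 1
console.log(minDate.toISOString());    // "0001-01-01T00:00:00.000Z"
```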
2

'1/1/0001' is not the minimum date in JavaScript. The minimum timestamp a `Date` can hold is -8640000000000000 milliseconds, so the minimum date is

new Date(-8640000000000000)

which is Tuesday, April 20th, 271,821 BCE

See Minimum and maximum date for more details.

pandazyque
  • 21
  • 1
-1

From the MDN `Date` docs:

year: Integer value representing the year. Values from 0 to 99 map to the years 1900 to 1999.

Here, it's talking about the first parameter to the `new Date(...)` constructor.

var date = new Date(98, 0); // Thu Jan 01 1998 00:00:00 GMT-0800 (PST)

If you need year 0001 for some reason, use `setFullYear(...)`:

date.setFullYear(1);
console.log(date); // Mon Jan 01 0001 00:00:00 GMT-0800 (PST)
Eric
  • 2,635
  • 6
  • 26
  • 66
from https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Date — tl;dr: this is unstable. `new Date(98, 0)` might give 2098 or might give 1998 (especially in the future). From the docs: "`new Date()` exhibits legacy undesirable, inconsistent behavior with two-digit year values; specifically, when a `new Date()` call is given a two-digit year value, that year value does not get treated as a literal year and used as-is but instead gets interpreted as a relative offset — in some cases as an offset from the year 1900, but in other cases, as an offset from the year 2000." – MintWelsh Aug 17 '21 at 03:50
-1
Use the following code:

var minDate = new Date("1/1/0001 12:00:00");
console.log(minDate.toLocaleString());
Amit Singh
  • 77
  • 1
  • 8