
Consider this...

var date = new Date(1901, 1, 1);

The result is February 1st, 1901. But why is it February and not January?
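For comparison, a minimal sketch using the same constructor: passing 0 as the month argument produces January, while the day-of-month argument stays 1-based:

var jan = new Date(1901, 0, 1); // month 0 = January
jan.getMonth();                 // 0
jan.getDate();                  // 1 (day of month is 1-based)
jan.getFullYear();              // 1901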

Matthew Layton
  • Based on the accepted answer, this is just straight up annoying! - I can express half my date in actual time, and the rest zero-indexed...what gives? – Matthew Layton Nov 09 '16 at 12:32
  • The javascript Date object was modelled on the Java Date, hence *getYear* had a two digit year even though the Y2K issue was looming so *getFullYear* was added. Zero indexed months are actually quite handy. – RobG Nov 10 '16 at 01:07
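A minimal sketch of the getYear/getFullYear difference mentioned in the comment above; the sample date is only illustrative:

var d = new Date(2016, 10, 9); // 9 November 2016 (month 10 = November)
d.getFullYear();               // 2016
d.getYear();                   // 116 (years since 1900; deprecated, kept for legacy code)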

2 Answers


That's because it is an old convention from the beginning of computing: you always start counting at 0. But still, I also find this silly!

djxyz

Because the JavaScript engine was written this way (zero-based):

0 = January

1 = February

2 = March

3 = April

And so on...

JavaScript is written this way. There is nothing more to it. That's it.
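As a rough sketch of where the zero-based month turns out handy (as RobG notes above): getMonth() indexes straight into an array of month names without a -1. The monthNames array here is just an illustration:

var monthNames = ['January', 'February', 'March', 'April', 'May', 'June',
                  'July', 'August', 'September', 'October', 'November', 'December'];

var date = new Date(1901, 1, 1);
console.log(monthNames[date.getMonth()]); // "February"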

Jim Fahad
  • "Because it is" does not explain the why. – str Nov 09 '16 at 12:31
  • @str it kinda does, though, but just to be clear, because arrays are zero-indexed, something underlying will use an array, and the developers of the era were too damn lazy to subtract 1. – Matthew Layton Nov 09 '16 at 12:34
  • @str JavaScript is written in C, and the developers there wrote dates that way (zero-based). If the developers had made it start from 100 instead of zero, we would have to do that as well. It's a very old practice to start indexes from `zero`. No hidden science here. – Jim Fahad Nov 09 '16 at 12:41
  • @JimFahad JavaScript is not written in C. It has several implementations in C/C++, though. No hidden science? Then why are months zero-indexed but days are not? And don't say it was influenced by other languages. While this is true, it still does not explain the why. – str Nov 09 '16 at 13:11