If you'll let me relax the overall technical accuracy a bit (while hopefully not saying anything blatantly wrong), I think the JavaScript Date object is easier to understand if you think of it as a variable that stores one absolute, specific point in time (a Unix timestamp, if you like). This internal value has no time zone information attached because none is needed (it's an absolute value, after all), and it's all JavaScript needs... until it has to interact with the outside world, i.e., parse or generate a human-readable date.
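You can see that internal value directly; here's a minimal sketch (the instant is arbitrary, picked just for illustration):

```js
// A Date wraps a single number: milliseconds elapsed since 1970-01-01T00:00:00Z.
const d = new Date('2024-03-15T12:00:00Z');
console.log(d.getTime()); // 1710504000000 — the internal value, no time zone attached
console.log(Date.now());  // the same kind of number, for "right now"
```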
Human-readable dates are always local, so they need a time zone. The approach taken by JavaScript's designers is to use exactly two:
- UTC
- Whatever time zone is configured as the default on the computer where the JavaScript code runs
And the funny thing is that which one gets used depends on the method involved. If you call .getFullYear() you get local time; if you call .getUTCFullYear() you get UTC. Not too bad, is it? Well, you also have .toDateString(), .toGMTString(), .toISOString()... Could you tell which time zone each uses without checking the docs? Or even after checking the docs? And, hey, the Date constructor can actually use either, depending on the string and the browser!
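Here's a quick sketch of that inconsistency, assuming the code runs on a machine configured for Central European Time (that assumption only affects the commented output):

```js
// 2023-12-31T23:30:00Z — half an hour before New Year in UTC, but already 2024 in CET (UTC+1).
const d = new Date(Date.UTC(2023, 11, 31, 23, 30));

console.log(d.getFullYear());    // 2024 — local time
console.log(d.getUTCFullYear()); // 2023 — UTC
console.log(d.toISOString());    // "2023-12-31T23:30:00.000Z" — always UTC
console.log(d.toDateString());   // "Mon Jan 01 2024" — local time

// Parsing has its own quirks: per the current spec, a date-only string is
// treated as UTC, while a date-time string without an offset is treated as local.
new Date('2024-01-01');          // midnight, UTC
new Date('2024-01-01T00:00');    // midnight, local time
```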
That said, it's easier to see that your two code samples point to very specific moments in time. The magic happens when you call .getDate(), which is expected to use local time. Your computer is apparently located somewhere in Europe. As in most local legislations, daylight saving time is in effect for several months of the year, and the way it's technically implemented is by switching between two related but different time zones: CET (Central European Time) and CEST (Central European Summer Time). The JavaScript engine is smart enough to realise that the local time zone changes, and it picks the correct one for the absolute moment in time stored in the object.
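A small sketch of that switch, again assuming a Central European machine (the exact zone, e.g. Europe/Berlin, is my assumption; only the commented output depends on it):

```js
const winter = new Date('2024-01-15T12:00:00Z');
const summer = new Date('2024-07-15T12:00:00Z');

// getTimezoneOffset() returns UTC minus local time, in minutes.
console.log(winter.getTimezoneOffset()); // -60  → UTC+1, CET
console.log(summer.getTimezoneOffset()); // -120 → UTC+2, CEST

// Same code, same machine: the engine picks the offset that was in force
// at each stored instant, not the one in force right now.
```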
If you're curious, this is possible because the browser ships a database of up-to-date time zone transitions (typically the IANA tz database), a valuable resource shared by many programs.
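If you want to see which zone from that database your engine has picked up as the default, the Intl API exposes its IANA name (the zone shown in the comment is just an example):

```js
// Reports the default time zone as an IANA identifier.
console.log(Intl.DateTimeFormat().resolvedOptions().timeZone); // e.g. "Europe/Madrid"
```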