
I was trying to understand if it's safe to execute (new Date()).getTime() in different timezones.

From reading this question, it seems that it actually is timezone-independent (assuming the clock on the machine where the script is executed is set correctly).

However, since it's the number of milliseconds since 1/1/1970, if we execute the same script from an adjacent time zone, why doesn't it differ by 3600*1000 (the number of milliseconds in one hour)?
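
To make the question concrete, this is roughly what I'm running in the console (the output obviously depends on the machine's clock and configured timezone):

```javascript
const now = new Date();

console.log(now.getTime());           // milliseconds since 1970-01-01T00:00:00 UTC
console.log(now.toISOString());       // the same instant rendered in UTC
console.log(now.toString());          // the same instant rendered in the local timezone
console.log(now.getTimezoneOffset()); // local offset from UTC, in minutes
```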

justHelloWorld
  • Try `(new Date('1970-01-01')).getTime()` from any computer. The result should be `0`. Even if the date displays differently when rendered in your computer's timezone, internally it is still the same instant (see the snippet below). – 31piy Jul 23 '18 at 08:29
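
A quick sketch of what the comment suggests (the toString() rendering varies with the local timezone, but getTime() does not):

```javascript
// Date-only ISO strings are parsed as UTC midnight, i.e. exactly the Unix epoch here.
console.log(new Date('1970-01-01').getTime());  // 0 on any machine with a correctly set clock
console.log(new Date('1970-01-01').toString()); // local rendering; may even show 31 Dec 1969
```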

2 Answers


You're referring to UTC. The value is always based on GMT time, so no matter where you execute the code from, it is consistent.
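
For example (a minimal sketch you can paste into any console; the exact number depends on when you run it):

```javascript
// The value itself is timezone-free; only its renderings differ.
const ms = new Date().getTime(); // same number whichever timezone the machine is set to
const d = new Date(ms);

console.log(ms);              // e.g. 1532334549000
console.log(d.toUTCString()); // that instant rendered in UTC/GMT
console.log(d.toString());    // the same instant rendered in the local timezone
```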

Liftoff
  • UNIX != UTC != GMT. They are very different things. – mortware Jul 23 '18 at 09:10
  • UNIX time is simply a starting point for a time represented by a large number (milliseconds since 1970). You should only consider this as a time format, or a way of manipulating the time. UTC is the consistent time that should be used across time boundaries. This is what `getTime()` gives you. GMT is a British timezone that happens to be the same as UTC, but you shouldn't use it unless you're working with that particular timezone. It only represents a geographical timezone for 6 months of the year. – mortware Jul 23 '18 at 09:20

Because it uses UTC. From MDN:

getTime() always uses UTC for time representation. For example, a client browser in one timezone, getTime() will be the same as a client browser in any other timezone.
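
As a rough illustration (assuming an environment where toLocaleString supports the timeZone option, e.g. a modern browser or a Node build with full ICU), the same getTime() value can be rendered for viewers in different timezones while the number itself never changes:

```javascript
// One getTime() value, three wall-clock renderings.
const ms = new Date().getTime();
const d = new Date(ms);

console.log(d.toLocaleString('en-GB', { timeZone: 'UTC' }));
console.log(d.toLocaleString('en-GB', { timeZone: 'America/New_York' }));
console.log(d.toLocaleString('en-GB', { timeZone: 'Asia/Tokyo' }));
// Different strings, same underlying instant; ms is identical everywhere.
```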

barbsan