519

Why is 1 January 1970 00:00:00 considered the epoch time?

cHao
rahul

5 Answers

479

Early versions of Unix measured system time in 1/60 s intervals. This meant that a 32-bit unsigned integer could only represent a span of time less than 829 days. For this reason, the time represented by the number 0 (called the epoch) had to be set in the very recent past. As this was in the early 1970s, the epoch was set to 1971-01-01.

Later, the system time was changed to increment every second, which increased the span of time that could be represented by a 32-bit unsigned integer to around 136 years. As it was no longer so important to squeeze every second out of the counter, the epoch was rounded down to the nearest decade, thus becoming 1970-01-01. One must assume that this was considered a bit neater than 1971-01-01.

Note that a 32-bit signed integer using 1970-01-01 as its epoch can represent dates up to 2038-01-19, on which date it will wrap around to 1901-12-13.
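A short, purely illustrative C sketch of the arithmetic above (the roughly 828-day and 136-year spans, and the 2038 wrap of a signed 32-bit counter); it assumes a platform whose `gmtime()` accepts dates outside the 1970-2038 range and prints a fallback message if it does not:

```c
#include <stdio.h>
#include <stdint.h>
#include <time.h>

/* Print a time_t as a UTC date, guarding against platforms whose
   gmtime() rejects values outside the range it supports. */
static void print_utc(const char *label, time_t t)
{
    struct tm *tm = gmtime(&t);
    char buf[32];

    if (tm && strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S", tm))
        printf("%s: %s UTC\n", label, buf);
    else
        printf("%s: out of range for this platform's gmtime()\n", label);
}

int main(void)
{
    /* Span of an unsigned 32-bit counter ticking 60 times per second. */
    printf("60 Hz counter spans about %.1f days\n",
           4294967296.0 / 60.0 / 86400.0);           /* ~828.5 days  */

    /* Span of an unsigned 32-bit counter ticking once per second. */
    printf("1 Hz counter spans about %.0f years\n",
           4294967296.0 / 86400.0 / 365.25);         /* ~136 years   */

    /* A signed 32-bit seconds counter runs out at INT32_MAX and wraps
       to INT32_MIN one tick later. */
    print_utc("INT32_MAX", (time_t)INT32_MAX);       /* 2038-01-19 03:14:07 */
    print_utc("INT32_MIN", (time_t)INT32_MIN);       /* 1901-12-13 20:45:52 */
    return 0;
}
```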

ilkkachu
Matt Howells
  • 38
    Does 1/60 have anything to do with the frequency of the American power net? – xtofl Jul 07 '09 at 09:23
  • 68
    It's the frequency of one of the oscillators on the system boards used at the time. It wasn't necessary for the oscillator to be 60Hz since it ran on DC, but it was probably cheap to use whatever was most common at the time, and TVs were being mass-produced then... – Matt Howells Jul 07 '09 at 10:32
  • 19
    Actually, at the time, it was very common for computer clocks as well as RTCs to be synchronised with the US mains waveform because it was (is?) very reliable. It was multiplied to get the processor clock, and divided to get seconds for the RTC. – Alexios Jun 17 '12 at 21:03
  • 6
    @MattHowells - why don't they upgrade the epoch to 2000? – Jedi Knight Mar 17 '13 at 09:10
  • 14
    @mafioso: Right, I'll set a reminder on my laptop for 2038-... 1901-12-13. –  Apr 01 '14 at 03:12
  • @JediKnight - Newer languages, such as [Fantom](http://fantom.org/) do! From their [DateTime](http://fantom.org/doc/sys/DateTime.html) class: "Fantom time is normalized as nanosecond ticks since 1 Jan 2000 UTC" – Steve Eynon Apr 29 '14 at 23:14
  • 20
    @JediKnight This is speculation based on my own experiences as a developer: changing a standard takes time, and if your change doesn't take hold then you end up with [competing standards](http://xkcd.com/927/). The real solution to the epoch problem is 64-bit integers, not moving the epoch forward in time. – Jake Jan 30 '15 at 19:42
  • @Matt Howells, could you give me a URL to this official standard, please? – neo Jun 08 '15 at 10:31
  • @JediKnight Not to mention, there's not much of a reason to change the epoch time. Sure, you could save a few extra bytes, but would the end user notice or even care? Plus, if you do change the epoch time, then every single program and library that relies on the timestamps being in reference to the 1970 time has to be updated. –  Dec 14 '15 at 23:58
  • 1
    @JediKnight Or it would be nice if it were 0000-1-1... except there was no year 0 between 1BC and 1AD. Or maybe the time of the big bang... except we only know when that was to ±21 million years, and we might overflow a few integers. So maybe the decade that we developed computers is OK after all. – James Feb 26 '16 at 16:36
  • 2
    I'm expecting that in the far future, earthlings will believe that the universe was created on 1 January 1970 00:00:00. – Jus12 Mar 04 '16 at 20:09
  • 2
    `> ...a 32-bit signed integer ...will wrap around to 1901-12-13.` Am I missing something? Is this not integer overflow, and hence technically undefined (or, more pragmatically, platform-dependent)? [Year 2038 Problem on Wikipedia concurs with @MattHowells](https://en.wikipedia.org/wiki/Year_2038_problem): `> Times beyond that will "wrap around" and be stored internally as a negative number, which these systems will interpret as having occurred on 13 December 1901 rather than 19 January 2038.` Thoughts? – Yeow_Meng Apr 20 '16 at 16:55
  • 1
    @JediKnight Why not change the epoch to tomorrow? – Michael Jul 13 '16 at 22:41
  • @Michael It will break a lot of existing systems – Pradeep Dec 29 '16 at 22:48
  • @Pradeep right, but JN was asking why they don't upgrade the epoch to 2000... if you're going to upgrade and break things, why not make it closer to today so at least it lasts longer? – Michael Dec 29 '16 at 23:03
  • 1
    @Michael I think there are better solutions; upgrading the epoch to 2000 means our grandchildren will face the same problem. I would rather use a 64-bit time_t and update old 32-bit systems to handle 64-bit time. See: https://en.wikipedia.org/wiki/Year_2038_problem#Solutions – Pradeep Dec 30 '16 at 01:24
  • 1
    I think the biggest thing people forget about updating it is that it isn't just used to represent "now": it's stored in databases, files, etc., and all of those would have to be fixed. It's literally the worst possible idea, other than changing it to letters (facetiously). That's why the *only* option is to make it unsigned as a hack or, preferably, give it 64 bits of room to work with. Easier to change that than to edit petabytes of data. – simontemplar Feb 02 '18 at 22:37
  • The comment [https://stackoverflow.com/questions/1090869/why-is-1-1-1970-the-epoch-time#comment904870_1090878] by @Jörg W Mittag below also mentions the development year as a factor in the decision – gawkface Aug 22 '18 at 04:53
  • Epoch's definition always felt about right to me, but since I was born a few days into 1970, maybe I'm biased (by a few days, obviously). – Georges Mar 02 '19 at 17:54
  • 2
    @Alexios. This is **utterly not true**. Computers have never in history used the 60 Hz AC mains frequency as a clock source. Quartz crystal oscillators, the first precise clocks (around since the 1930s), are referenced to the natural frequency of a quartz crystal due to the piezoelectric effect: 32768 Hz. Westinghouse engineers (1891), with and without Tesla, chose 60 Hz (1891) (and 30 Hz, 1893) as the best option for AC frequency, among the choices of 41.66 Hz (General Electric, 1893) and 50 Hz in Europe (AEG, Berlin, 1891). – Brethlosze Apr 24 '21 at 17:48
  • Props to any engineer who, tasked with addressing Y2K problems, and thus fully aware of the stress and concern such an issue presented, went "OK, this will fix it... till 2038... I mean I could just use an unsigned... but f*** it that'll be someone else's problem" – matt Jun 24 '22 at 12:14
58

History.

The earliest versions of Unix time had a 32-bit integer incrementing at a rate of 60 Hz, which was the rate of the system clock on the hardware of the early Unix systems. The value 60 Hz still appears in some software interfaces as a result. The epoch also differed from the current value. The first edition Unix Programmer's Manual dated November 3, 1971 defines the Unix time as "the time since 00:00:00, Jan. 1, 1971, measured in sixtieths of a second".

Stu Thompson
18

Epoch reference date

An epoch reference date is a point on the timeline from which we count time. Moments before that point are counted with a negative number, moments after are counted with a positive number.

Many epochs in use

Why is 1 January 1970 00:00:00 considered the epoch time?

No, not the epoch, an epoch. There are many epochs in use.

This choice of epoch is arbitrary.

Major computer systems and libraries use any of at least a couple of dozen different epochs. One of the most popular is commonly known as Unix Time, using the 1970 UTC moment you mentioned.

While popular, Unix Time's 1970 may not be the most common. Also in the running for most common would be January 0, 1900, used by countless Microsoft Excel & Lotus 1-2-3 spreadsheets, and January 1, 2001, used by Apple's Cocoa framework on over a billion iOS/macOS devices in countless apps. Or perhaps January 6, 1980, used by GPS devices?
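To make the "many epochs" point concrete, here is a small illustrative C sketch (the offsets are calendar facts, not something defined by this answer) that converts counts from the Cocoa and GPS epochs into Unix time by adding the fixed offset between epochs. Note that real GPS time does not count leap seconds, which this naive conversion ignores:

```c
#include <stdio.h>
#include <stdint.h>

/* Fixed offsets from the Unix epoch (1970-01-01T00:00:00Z) to two other
   common epochs, in seconds. */
#define COCOA_EPOCH_OFFSET 978307200LL   /* 2001-01-01T00:00:00Z (Apple Cocoa) */
#define GPS_EPOCH_OFFSET   315964800LL   /* 1980-01-06T00:00:00Z (GPS)         */

int main(void)
{
    int64_t cocoa_count = 0;   /* "zero" in Cocoa's count: midnight, 1 Jan 2001 UTC */
    int64_t gps_count   = 0;   /* "zero" in GPS's count:   midnight, 6 Jan 1980 UTC */

    /* Converting to Unix time is just adding the offset between the epochs
       (leap seconds ignored for the GPS case). */
    printf("Cocoa %lld -> Unix %lld\n",
           (long long)cocoa_count, (long long)(cocoa_count + COCOA_EPOCH_OFFSET));
    printf("GPS   %lld -> Unix %lld\n",
           (long long)gps_count, (long long)(gps_count + GPS_EPOCH_OFFSET));
    return 0;
}
```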

Many granularities

Different systems use different granularity in counting time.

Even the so-called “Unix Time” varies, with some systems counting whole seconds and some counting milliseconds. Many databases, such as Postgres, use microseconds. Some, such as the modern java.time framework in Java 8 and later, use nanoseconds. Some use still other granularities.
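As a small sketch of what those granularities look like for one and the same instant (an arbitrary example moment), using 64-bit integers because a millisecond count already overflows a signed 32-bit integer after roughly 24 days:

```c
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    /* One instant, 2009-07-07T07:43:00Z, expressed at different granularities. */
    int64_t seconds      = 1246952580;          /* classic Unix time              */
    int64_t milliseconds = seconds * 1000;      /* e.g. Java's currentTimeMillis() */
    int64_t microseconds = seconds * 1000000;   /* e.g. Postgres resolution        */

    printf("seconds:      %lld\n", (long long)seconds);
    printf("milliseconds: %lld\n", (long long)milliseconds);
    printf("microseconds: %lld\n", (long long)microseconds);
    return 0;
}
```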

ISO 8601

Because there is so much variance in the choice of epoch reference and in granularity, it is generally best to avoid communicating moments as a count-from-epoch. Between the ambiguity of epoch and granularity, and the inability of humans to read such counts at a glance (so buggy values go unnoticed), use plain text instead of numbers.

The ISO 8601 standard provides an extensive set of practical well-designed formats for expressing date-time values as text. These formats are easy to parse by machine as well as easy to read by humans across cultures.

These include, for example, `2009-07-07T07:43:00Z` for a moment in UTC, and `2009-07-07T09:43:00+02:00` for the same moment expressed with an offset-from-UTC of two hours.
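As a sketch of that recommendation, here is one way (of many) to turn a whole-seconds Unix-time count into ISO 8601 text in C, formatting it as UTC:

```c
#include <stdio.h>
#include <time.h>

int main(void)
{
    time_t t = 1246952580;   /* 2009-07-07T07:43:00Z as seconds since the Unix epoch */
    struct tm *utc = gmtime(&t);
    char buf[32];

    /* Format the broken-down UTC time as an ISO 8601 string. */
    if (utc && strftime(buf, sizeof buf, "%Y-%m-%dT%H:%M:%SZ", utc))
        puts(buf);           /* prints: 2009-07-07T07:43:00Z */
    return 0;
}
```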

Basil Bourque
11

http://en.wikipedia.org/wiki/Unix_time#History explains a little about the origins of Unix time and the chosen epoch. The definition of Unix time and of the epoch date went through a couple of changes before stabilizing on the present one.

But it does not say why exactly 1/1/1970 was chosen in the end.

Notable excerpts from the Wikipedia page:

The first edition Unix Programmer's Manual dated November 3, 1971 defines the Unix time as "the time since 00:00:00, Jan. 1, 1971, measured in sixtieths of a second".

Because of [the] limited range, the epoch was redefined more than once, before the rate was changed to 1 Hz and the epoch was set to its present value.

Several later problems, including the complexity of the present definition, result from Unix time having been defined gradually by usage rather than fully defined to start with.

Dawie Strauss
-11

Short answer: Why not?

Longer answer: The time itself doesn't really matter, as long as everyone who uses it agrees on its value. As 1/1/70 has been in use for so long, using it will make your code as understandable as possible to as many people as possible.

There's no great merit in choosing an arbitrary epoch just to be different.

Naveen
pauljwilliams
  • 1
    But why was 1970 chosen as the year? – rahul Jul 07 '09 at 07:43
  • 42
    Because Unix was developed in 1969 and first released in 1971 and it was thus reasonable to assume that no machine would have to represent a system time earlier than 1970-01-01-00:00:00. – Jörg W Mittag Jul 07 '09 at 07:57
  • 3
    As a developer of historical simulation games, I find it pretty silly that the designers of some time objects tend to assume all programs will only want to represent dates in the future or recent past. Of course, we can program our own representations, or work in an adjustment factor, but still. – Dronz Feb 06 '13 at 22:51
  • 8
    The same pre-epoch issue applies to "practical" non-game uses, such as business spreadsheets, scientific data presentation, time machine UIs, etc. – Lenoxus Mar 03 '14 at 15:51
  • 8
    @Dronz You have to take into account this was designed for a $72000 computer with 9KB of RAM which used transistors and diodes for logic gates as a CPU (no chips at the time!). So it wasn't "silly" to make the most basic thing that worked. – Camilo Martin Mar 05 '15 at 17:38
  • @CamiloMartin I understand why that was used circa 1970, and for types that will only ever represent computer system times. It's just unfortunate it makes those libraries less generally useful for other applications. – Dronz Mar 05 '15 at 19:13
  • @Dronz I actually find the notion of "seconds since 1970" usable enough (sortable, simple, easy to do arithmetics with and can be stored in an int), but when it's not enough, most languages have a better encapsulation for dates and times. It's just still widely used because people like to use it, but an alternative is always around the corner. – Camilo Martin Mar 15 '15 at 15:24
  • 3
    OP is correct on a meta-level: the schemes we work out for time have always been fairly arbitrary. The number of days in a year, the number of days in a month, year "0", and the rules for leap years are ... crazy. Every system is just a series of crappy compromises made because it was the best they could do with the available technology and worked well enough for their immediate use-case. Which is true of all engineering projects : ) – Indolering Oct 21 '16 at 23:03