
What is the number of one second ticks between Unix time epoch (01 Jan 1970) and GPS time epoch (06 Jan 1980)?

I have seen multiple answers from several sources on the web. One camp claims the answer is 315964800, the other claims it is 315964819. I always thought it was 315964800, but now am not so sure.

I just found my software baseline has been using 315964819 for the last eight years. I have a hard time understanding how it could have been 19 seconds off and no one noticed it when we integrated our embedded devices with other devices.

I think that whoever put 315964819 in the code baseline must have mistakenly used a TAI offset (19 seconds).

From what I understand, Unix time does not include leap seconds, which would indicate to me that 315964800 is the number of ticks between the two epochs. Then I think about how Unix time handles the leap second. It simply repeats the tick count when there is a leap second inserted, and there were 19 leap seconds inserted between 1970 and 1980... I start to wonder if the repeated ticks matter. I do not think so, but someone in this code's history thought so, and it seemed to work....

The long and short of it is I am about to change a constant set in the dark ages of this product that has to do with timing, which is important for the platform, from what it had been to what I believe is more accurate, and I wanted some sort of thumbs-up from more knowledgeable people than me.

Can someone authoritative please step in here?

315964800 camp

315964819 camp

Also note that I'm only asking about Unix epoch to GPS epoch. I'm pretty sure we've got leap seconds since GPS epoch covered appropriately.

kmort
  • related: [How to get current date and time from GPS unsegment time in python](http://stackoverflow.com/q/33415475/4279) – jfs Mar 18 '16 at 13:41

4 Answers


The different values you stated are caused by mixing up the 1970 to 1980 offset with leap seconds.
The correct offset value is 315964800 seconds.

Explanation:

UTC and GPS time drift apart by one additional second roughly every 18 months (on average). This adjustment is called a leap second, introduced into the UTC time base to compensate for changes in the Earth's rotation.

GPS time is not adjusted by leap seconds.

Currently (2013) there is an offset of 16 s:
GPS time - UTC = 16 seconds

Unix time is a time format, not a time reference. It represents the number of seconds (or milliseconds) since 1.1.1970 UTC. Ideally your system time is synchronized with UTC by a time server (NTP).

To convert, and to get your offset, you should use the fixed offset (6.1.1980 UTC - 1.1.1970 UTC)

and THEN add the current GPS-to-UTC deviation (currently 16 s). E.g., make that value configurable, or read the current offset from a GPS device (receivers know the difference between UTC and GPS time).

Again: the different values you stated come from mixing up the 1970-to-1980 offset with leap seconds. Don't do that; handle them separately.

This Java program:

import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

public class EpochDiff {
    public static void main(String[] args) throws Exception {
        // use an explicit pattern so parsing works regardless of the default locale
        SimpleDateFormat df = new SimpleDateFormat("d.M.yyyy HH:mm:ss");
        df.setTimeZone(TimeZone.getTimeZone("UTC"));

        Date x = df.parse("1.1.1970 00:00:00");
        Date y = df.parse("6.1.1980 00:00:00");

        long diff = y.getTime() - x.getTime(); // milliseconds; java.util.Date ignores leap seconds
        long diffSec = diff / 1000;

        System.out.println("diffSec= " + diffSec);
    }
}

Outputs this value:

diffSec= 315964800

So this is the correct offset between 1.1.1970 UTC and 6.1.1980 UTC, where GPS time began. You then have to correct by the further 16 leap seconds that were introduced between 6.1.1980 and today (2013) to calculate the GPS time of a current UTC time.
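
For concreteness, here is a minimal Python sketch of that procedure (my own illustration; the names are made up, and the hard-coded leap-second count of 16 is only valid for 2013, so in practice it should come from configuration or from the GPS receiver):

UNIX_TO_GPS_EPOCH = 315964800   # 1.1.1970 UTC -> 6.1.1980 UTC, leap seconds excluded
LEAP_SECONDS = 16               # GPS - UTC as of 2013; keep this configurable

def unix_to_gps(unix_seconds, leap_seconds=LEAP_SECONDS):
    # POSIX timestamp -> seconds since the GPS epoch (1980-01-06)
    return unix_seconds - UNIX_TO_GPS_EPOCH + leap_seconds

def gps_to_unix(gps_seconds, leap_seconds=LEAP_SECONDS):
    # seconds since the GPS epoch -> POSIX timestamp
    return gps_seconds + UNIX_TO_GPS_EPOCH - leap_seconds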

AlexWien
  • Hi Alex. Thank you for your response. I concur with everything you said, however, I am not worried about leap seconds after the GPS epoch began. My software already handles that appropriately. I just want to know what the **actual value** is for one second ticks from Unix epoch to GPS epoch. I'm pretty sure the right answer is 315964800, but would like some confirmation and/or some evidence this is correct or not. Also, there were 19 leap seconds between 1970 and 1980, but they were already baked in when GPS time started. – kmort Dec 11 '13 at 21:47
  • Yes, the right answer is 315964800 for the offset between 1.1.1970 and 6.1.1980. Then you have to add the further 16 (sixteen, not nineteen) leap seconds that accumulated from 6.1.1980 until today (2013). See also my updated answer – AlexWien Dec 13 '13 at 17:03
  • Hi Alex. I edited your answer a bit to move the program up. You are correct about leap seconds: from 1980 to 2013 there were 16, but there were also 19 from 1970 to 1980. Someone had included those 19 seconds in a constant in our source file when they should not have. I played the CM-blame-game and found the person who originally typed it all those years ago (he still works here). We discussed, and he said those 19 seconds were erroneous. Thank you for your Java program showing the same thing. – kmort Dec 13 '13 at 18:24
  • Hi Alex. My suggested edits were rejected, but I think it's important. Please take a look here and see what you think. http://stackoverflow.com/review/suggested-edits/3577909 I'd love to see your Java code first in this. When you've made your edits (if any) shoot me a comment and I'll accept your answer. :-) – kmort Dec 14 '13 at 01:31
  • I updated the answer and moved the central sentence to the top, as you suggested. – AlexWien Dec 17 '13 at 11:32
  • @AlexWien: What value are you hoping to get by adding 16 seconds to `diffSec`? – jfs Sep 08 '14 at 13:25
  • @J.F.Sebastian by adding 16 s in 2013, you get the UTC time when you have GPS time, assuming both values are expressed in (milli)seconds since 1.1.1970. These 16 are the leap seconds introduced in UTC since GPS time has existed (1980) – AlexWien Sep 08 '14 at 18:37
  • @AlexWien yes, `TAI = GPS + 19 = UTC(2013) + 16 + 19`. But why do you add 16 to `diffSec` (posix timestamp, unrelated to GPS)? – jfs Sep 08 '14 at 18:38
  • @J.F.Sebastian You add this only if you get a timestamp in GPS Time, which is part of his question. – AlexWien Sep 08 '14 at 18:44
  • @AlexWien: *"if you get a timestamp in GPS Time"* -- where do you see a GPS timestamp here? Neither `315964800` nor `315964819` are GPS timestamps here. – jfs Mar 18 '16 at 13:39
  • @J.F.Sebastian Whether you add or subtract depends on the direction of conversion, GPS->UTC or UTC->GPS. In my answer I did not state whether to add or subtract; I used the term "correct further 16 seconds". – AlexWien Mar 18 '16 at 17:39
  • @AlexWien : [I know how to perform UTC <-> GPS conversions](http://stackoverflow.com/a/33426779/4279). You won't get GPS timestamp if you add 17 (as of July 2015) to the current POSIX timestamp because GPS time uses a different epoch (as you know). GPS timestamp for 1980-01-06UTC is 0 (zero), not `315964800` (anyway, you shouldn't add 16 to it because the difference between GPS and UTC is zero at 1980-01-06). I understand the issue now: you do not want to add 16 to `diffSec`; you probably meant: `gps = utc - 315964800 + 16` instead (assuming UTC time from 2013) – jfs Mar 18 '16 at 18:24

What is the number of one second ticks between Unix time epoch (01 Jan 1970) and GPS time epoch (06 Jan 1980)?

There are at least two possible answers:

  1. What is the POSIX timestamp for 1980-01-06 UTC? Answer: 315964800 (exactly). In Python:

    from datetime import datetime, timedelta
    print((datetime(1980,1,6) - datetime(1970,1,1)) // timedelta(seconds=1))
    

    It is the number of seconds between the dates if leap seconds are not counted. In other words, the code shows how many "UT1" seconds (~1/86400 of a mean solar day) passed between the events.

    The UTC and GPS time scales tick in SI seconds. The POSIX timestamp, however, "forgets" leap seconds, and therefore the actual number of SI seconds between the dates is slightly larger than the POSIX timestamp.

    315964800 is not the correct answer if you want to find elapsed seconds

  2. How many SI seconds elapsed between 1970-01-01 UTC and 1980-01-06 UTC? Answer: 315964811 (approximately).

To answer the second question, you need to know how many intercalary leap seconds were inserted between the two dates (convert UTC to the International Atomic Time (TAI)):

#!/usr/bin/env python3
from datetime import datetime, timedelta

tai_posix_epoch = datetime(1970, 1, 1) + timedelta(seconds=8, microseconds=82)
tai_gps_epoch = datetime(1980, 1, 6) + timedelta(seconds=19)
print(round((tai_gps_epoch - tai_posix_epoch) / timedelta(seconds=1)))

The difference between TAI and GPS time is constant to within tens of nanoseconds.

The time between 1970 and 1972 (when UTC was introduced) is a little fuzzy; the TAI - UTC difference is not an integer number of seconds in that period:

from decimal import Decimal as D

MJD_1970_01_01 = 40587
dAT_1970_01_01 = D("4.213170") + (MJD_1970_01_01 - 39126) * D("0.002592")
# -> delta(AT) = TAI - UTC = Decimal('8.000082') # 8 seconds, 82 microseconds

Here's a picture that shows the relation between the UT1, UTC, and TAI time scales over the years [figure: UTC - TAI difference over time]. Each step is a leap second, starting with TAI - UTC = 10 s on 1972-01-01. 26 positive leap seconds had been inserted as of 1 July 2015.


The 315964819 timestamp could be explained if the 1970-01-01 00:00:00 TAI epoch is used:

from datetime import datetime, timedelta

print(datetime(1970, 1, 1) + timedelta(seconds=315964819)) # TAI
# 1980-01-06 00:00:19 TAI or 1980-01-06 00:00:00 UTC

i.e., exactly 315964819 SI seconds elapsed between 1970-01-01 00:00:00 TAI and 1980-01-06 00:00:00 UTC (note: the dates are expressed using different time scales).

"right" timezones use 1970-01-01 00:00:10 TAI epoch (notice: 10 seconds) and therefore the corresponding timestamp for the GPS epoch (1980-01-06 00:00:00 UTC) is 315964809 (not 315964819). Here's a succinct description of the difference between "right" and POSIX timestamps:

The "right" files in the tz (zoneinfo) database have a subtle difference from the POSIX standard. POSIX requires that the system clock value of time_t represent the number of non-leap seconds since 1970-01-01. This is the same as requiring POSIX seconds to be mean solar seconds of UT, not the atomic seconds that UTC has counted since 1972-01-01.

The "right" zoneinfo files assert that the system clock value of time_t represent the actual number of seconds in the internationally approved broadcast time scale since 1970-01-01. As a result the value of time_t which is expected by the "right" zoneinfo files is greater than the value of time_t specified by POSIX. The difference in the values of time_t is the number of leap seconds which have been inserted into the internationally approved broadcast time scale. As of year 2015 the difference is 26 seconds.emphasize is mine


Can someone authoritative please step in here?

IERS BULLETIN C (the data that I've used above) is the authority on leap seconds (and therefore (indirectly) on the difference between UTC and GPS time scales).

jfs
  • A very precise answer for a less precise question. One would have to know what exactly his program needs to calculate. – AlexWien Mar 18 '16 at 17:54
  • @AlexWien I was converting between current UTC time and current GPS time and needed to know the offset. So J.F. Sebastian's #1 answer is correct. – kmort Aug 12 '16 at 20:29

I guess I'm in the third camp :)
Let's call it like it is:
2,904,548,141,415,381,930 "periods [...] of a caesium 133 atom" measured at 0 K at the geoid (give or take a few hundred million periods, depending on which TAI/SI definition you use).

Short Answer:

It depends on what time scales (and which definitions of those time scales) you're using.

315964809 in TAI seconds (1977 definition) and thus UTC seconds
315964800 in UNIX seconds
(both describe the same interval, but ONLY between your specified dates, and both correspond to 2,904,548,141,415,381,930 "periods [...]")
Please note that UNIX seconds replay the same second after the completion of a UTC leap second, so that the UTC seconds, 2012-06-30 23:59:60 UTC and 2012-07-01 00:00:00 UTC, were both represented by a UNIX timestamp of 1341100800.
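
As a quick illustration of that replay (my own sketch): calendar.timegm performs plain label arithmetic, which is exactly what a POSIX timestamp does, so both instants map to the same number.

import calendar

# the leap second and the following second yield the same POSIX timestamp
print(calendar.timegm((2012, 6, 30, 23, 59, 60)))  # 1341100800
print(calendar.timegm((2012, 7, 1, 0, 0, 0)))      # 1341100800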

Detailed Answer:

Using TAI seconds
Even though they aren't really, let's assume that all TAI seconds before 1977 are still exactly equal to the 1977/1997 definition of TAI/SI seconds.
Let's also assume that by
"Unix time epoch (01 Jan 1970)" to "GPS time epoch (06 Jan 1980)"
you mean
1970-01-01 00:00:10 TAI to 1980-01-06 00:00:19 TAI
in this case there would be
( ( (365 days/year * 10 years) + 2 leap days + 5 days) * 86400 TAI seconds/day ) + 9 TAI seconds
= 315964809 TAI seconds

Using UNIX seconds
Even though they aren't really, let's assume that the duration of a UTC second before 1977 is still exactly equal to the 1977/1997 definition of a TAI/SI second.
Let's also assume that by
"Unix time epoch (01 Jan 1970)" to "GPS time epoch (06 Jan 1980)"
you mean
1970-01-01 00:00:00 UTC to 1980-01-06 00:00:00 UTC
and that UNIX time skips back a second after the completion of a leap second
in this case there would be
( ( (365 days/year * 10 years) + 2 leap days + 5 days) * 86400 seconds/day ) + 9 leap seconds - 9 UNIX leap-second rehashes
= 315964800 UNIX seconds
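
Both totals are easy to check numerically; here is a quick Python verification of the arithmetic above:

days = 365 * 10 + 2 + 5       # 3650 year-days + 2 leap days + 5 days = 3657 days
print(days * 86400 + 9)       # 315964809 TAI seconds
print(days * 86400 + 9 - 9)   # 315964800 UNIX seconds (the 9 leap seconds are replayed)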

Concerning "Periods[...]"
A 1977/1997 TAI/SI second is what was used to come up with 315964809 seconds of 9,192,631,770 periods each = 2,904,548,141,415,381,930 periods. A 1997 SI second is equal to the duration of 9,192,631,770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the caesium-133 atom at rest at a temperature of 0 K. The 1977 definition of TAI measures SI seconds at the geoid.
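
The period count is just that second count multiplied by the SI definition of the second; checking:

print(315964809 * 9192631770)  # 2904548141415381930 periods of the caesium-133 radiation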

chimeraha
  • Wow. Thanks for the background information. We ended up using 315964800 and things have worked smoothly. :-) – kmort Nov 03 '14 at 13:45
  • the answer is misleading. More digits do not mean a more precise or accurate result. I see no connection between the 19 digits of the number of periods (is it even correct according to some common definition?) and the provided timestamp values (the timestamps look reasonable, though). `315964800` result for POSIX time follows from definitions (exact value). `1980-01-06 00:00:19 TAI` (GPS Epoch) is also exact (by definition). `1972-01-01 00:00:00 UTC == 1972-01-01 00:00:10 TAI` but the value `1970-01-01 00:00:10 TAI` (Olson's tz database uses it for "right" zoneinfo) is suspect if you use 19 digits – jfs Feb 07 '15 at 20:59
  • Yes, this answer is definitely a bit confusing (but for good reason, because all these different time scales and their own redefinitions over time are EXTREMELY misleading); I was just attempting to cover ALL the bases. ---- I think people just need to keep in mind that UTC and POSIX account for leap seconds (in their own unique ways) while TAI and GPS do not. So as long as you can keep a good detailed definition of what you actually mean by "one second ticks" and keep all the conversions/math straight and accurate, you will be able to come up with a precise answer for your purposes. – chimeraha Feb 08 '15 at 23:50
  • Yes, I think both of your answers (yours and Sebastian's) are a bit misleading; although correct, they are not very helpful, as they leave more questions open than they answer. As the comment above says, his question about "one second ticks" is not perfectly precise. Usually people want to convert timestamp values. – AlexWien Mar 18 '16 at 18:08
  • Huh. There I was, all ready to nitpick about nothing much happening at 0° K, and then I looked it up and learned that people do indeed use 0° K as the background temperature for their wriggling caesium 133 atoms: http://www.bipm.org/utils/common/pdf/si_brochure_8_en.pdf Thanks, @chimeraha. – Christian Severin Sep 27 '17 at 09:58

Going back to the original question, "Unix time epoch (01 Jan 1970)" to "GPS time epoch (06 Jan 1980)": that offset is 315964800. The value 315964819 is the offset from the 'TAI epoch' to the 'GPS time epoch', which means 315964819 = 315964800 + 19. So the epoch value you use in the code really depends on which time epoch you are working from.
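
Put as a tiny sketch (the 19 here is the constant TAI - GPS offset):

TAI_MINUS_GPS = 19            # constant by definition of GPS time
UNIX_TO_GPS_EPOCH = 315964800 # 01 Jan 1970 UTC -> 06 Jan 1980 UTC, no leap seconds
print(UNIX_TO_GPS_EPOCH + TAI_MINUS_GPS)  # 315964819, the 'TAI epoch to GPS epoch' value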