
Given UTC date and time as year, month, day, hour, minute, second, I was wondering about a pseudo-code algorithm to calculate the corresponding Unix epoch timestamp.

There are 60 seconds in a minute, 60 minutes in an hour, 24 hours in a day. So we can just use 60 × 60 × 24 = 86400 seconds per day and count the days back to 1970-01-01T00:00:00Z, right? Months have different numbers of days, though, so it's probably better to skip months entirely and work with 365 days per year. But then there are leap years... and leap seconds... and more?

Does that suddenly make the algorithm as complex as dealing with time zones, or is it still fairly easy to describe?
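
To make this concrete, here is a rough Python sketch of the kind of algorithm I'm imagining: count whole days since 1970-01-01 (adding one day for each leap year in between), then convert days and the time of day to seconds. It ignores leap seconds, since Unix time ignores them too, only handles dates from 1970 onward, and the leap-year loop is just the simplest thing I could think of, not the fastest:

```python
def is_leap(year):
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# Days before the first of each month in a non-leap year (Jan..Dec).
DAYS_BEFORE_MONTH = [0, 31, 59, 90, 120, 151, 181, 212, 243, 273, 304, 334]

def utc_to_unix(year, month, day, hour, minute, second):
    # Whole days contributed by the years 1970..year-1.
    days = 365 * (year - 1970) + sum(1 for y in range(1970, year) if is_leap(y))
    # Days contributed by the months already completed this year.
    days += DAYS_BEFORE_MONTH[month - 1]
    if month > 2 and is_leap(year):
        days += 1  # this year's Feb 29 has already passed
    # Days completed in the current month.
    days += day - 1
    # 86400 seconds per day; leap seconds are deliberately ignored.
    return ((days * 24 + hour) * 60 + minute) * 60 + second

print(utc_to_unix(2022, 5, 21, 12, 20, 0))  # 1653135600
```

Is something along these lines basically all there is to it, or am I missing cases?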

finefoot
  • Whatever language you are using has solved this problem many times since the dawn of time. Yes, you will need maths. – TheGeneral May 21 '22 at 12:20
  • related (although language-specific / C++): https://stackoverflow.com/q/7960318/10197418 – FObersteiner May 21 '22 at 12:24
  • For this century (approximately) see my answer at https://stackoverflow.com/questions/70994996/efficiently-format-dates-from-a-log-file-with-posix-tools . For UTC you can leave out the part about offset (subfield 7). Leap years aren't hard if you do it right, and Unix times ignore leap seconds. – dave_thompson_085 May 21 '22 at 12:52
  • It is not trivial, but it isn't that complex either. Check out how Python's [datetime](https://github.com/python/cpython/blob/54b5e4da8a4c6ae527ab238fcd6b9ba0a3ed0fc7/Lib/datetime.py#L63) module does it (`_ymd2ord` and a few preceding functions) – Marat May 21 '22 at 16:41

0 Answers