I think the following program should output, for the first day of every year from 1 AD to 1970, the number of seconds from that date to the 1970 epoch, preceded by the size of time_t on the system it's compiled on. (CHAR_BIT is a macro, so I think you can't just copy the compiled executable around and assume the reported size is correct, though in practice everything uses 8-bit chars these days.)
#include <limits.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <time.h>

void do_time(int year)
{
    struct tm tp;
    memset(&tp, 0, sizeof(tp));
    tp.tm_sec = 0;
    tp.tm_min = 0;
    tp.tm_hour = 0;
    tp.tm_mday = 1;
    tp.tm_mon = 0;
    tp.tm_year = year - 1900;
    tp.tm_wday = 1;
    tp.tm_yday = 0;
    tp.tm_isdst = -1;
    /* time_t need not be long, so cast before printing */
    printf("%d %lld\n", year, (long long)mktime(&tp));
}

int main(void)
{
    printf("time_t is %zu bits\n", sizeof(time_t) * CHAR_BIT);
    for (int i = 1; i < 1971; i++)
        do_time(i);
    exit(0);
}
However, on OS X (10.11.3 15D21) this only works for years >= 1902, despite time_t being 64-bit signed. I could understand it if the programmers at Apple had been lazy and not supported any years before 1970, but correct behaviour going back to 1902 and then stopping looks more like an error on my part.