In Java, via the Postgres JDBC driver, I'm running a simple query that selects a datetime far in the past: select '0002-02-02 00:00:00'::timestamp. In my Java application, a Timestamp is created with ResultSet.getTimestamp(idx, calendar), where calendar is a Calendar created by calling Calendar.getInstance(TimeZone.getTimeZone("UTC")). Simplified, the retrieval looks something like this:
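// Simplified sketch of the retrieval; "connection" is assumed to be an open
// java.sql.Connection, and the timestamp is the first (and only) column of the result.
Calendar calendar = Calendar.getInstance(TimeZone.getTimeZone("UTC"));
try (Statement stmt = connection.createStatement();
     ResultSet rs = stmt.executeQuery("select '0002-02-02 00:00:00'::timestamp")) {
    rs.next();
    Timestamp timestamp = rs.getTimestamp(1, calendar);
}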
The Postgres driver does a bit of logic internally when getTimestamp is called, but it appears to use only the time zone of the passed-in calendar when creating the resulting Timestamp object. The Timestamp looks reasonable in the debugger, and calling getTime() on it to get the number of milliseconds since the epoch returns -62101468800000.
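As far as I can tell, the driver isn't doing anything unusual here: setting the same date on a plain UTC Calendar produces exactly the same value, presumably because Calendar.getInstance returns a GregorianCalendar, which uses Julian calendar rules for dates before October 1582:

import java.util.Calendar;
import java.util.TimeZone;

public class HybridCalendarDemo {
    public static void main(String[] args) {
        // Calendar.getInstance returns a GregorianCalendar in the default locale;
        // it is a hybrid calendar that switches to Julian rules before 1582-10-15.
        Calendar cal = Calendar.getInstance(TimeZone.getTimeZone("UTC"));
        cal.clear();
        cal.set(2, Calendar.FEBRUARY, 2, 0, 0, 0);
        System.out.println(cal.getTimeInMillis()); // prints -62101468800000
    }
}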
However, if I use this same value in Python, I get a datetime that is two days earlier:
from datetime import datetime, timezone
num_millis = -62101468800000
num_secs = num_millis / 1000
# Using the same number of seconds since the epoch as in Java, the datetime differs by two days
dt = datetime.fromtimestamp(num_secs, tz=timezone.utc)
print("Datetime", dt) # Datetime 0002-01-31 00:00:00+00:00
print("Num seconds", dt.timestamp(), "\n") # Num seconds -62101468800.0
# Using the same datetime string, the seconds since the epoch differ by two days' worth of seconds
dt2 = datetime.fromisoformat('0002-02-02 00:00:00+00:00')
print("Datetime", dt2) # Datetime 0002-02-02 00:00:00+00:00
print("Num seconds", dt2.timestamp()) # Num seconds -62101296000.0
Why is there a discrepancy between the Python code and Java code? (Is it possibly due to different calendar systems?) Is there a way to resolve this discrepancy?