I'm trying to grok time-handling, and I've stumbled upon something in Java that has me somewhat baffled. Take this sample code:
import java.util.Calendar;
import java.util.TimeZone;

public class Main
{
    public static void main(String[] args)
    {
        // Calendar set to 12:00 AM of the current day (Eastern Daylight Time, GMT-4)
        Calendar cal = Calendar.getInstance(TimeZone.getTimeZone("GMT-4"));
        cal.set(Calendar.HOUR_OF_DAY, 0);
        cal.set(Calendar.MINUTE, 0);
        cal.set(Calendar.SECOND, 0);
        cal.set(Calendar.MILLISECOND, 0);

        // Calendar set to 12:00 AM of the current day (UTC)
        Calendar utcCal = Calendar.getInstance(TimeZone.getTimeZone("GMT"));
        utcCal.set(Calendar.HOUR_OF_DAY, 0);
        utcCal.set(Calendar.MINUTE, 0);
        utcCal.set(Calendar.SECOND, 0);
        utcCal.set(Calendar.MILLISECOND, 0);

        long oneHourMilliseconds = 3600000;
        System.out.println((cal.getTimeInMillis() - utcCal.getTimeInMillis()) / oneHourMilliseconds);
    }
}
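For context, here is a small diagnostic sketch I put together to look at the raw values. The CalendarDiagnostics class and the printUtc helper are just placeholders of my own for illustration; they only print each calendar's getTimeInMillis() value and that instant rendered in UTC.

import java.text.SimpleDateFormat;
import java.util.Calendar;
import java.util.TimeZone;

public class CalendarDiagnostics
{
    // Illustrative helper: prints a calendar's epoch millis and that instant rendered in UTC.
    static void printUtc(String label, Calendar c)
    {
        SimpleDateFormat fmt = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
        fmt.setTimeZone(TimeZone.getTimeZone("GMT"));
        System.out.println(label + ": millis=" + c.getTimeInMillis() + " -> " + fmt.format(c.getTime()) + " UTC");
    }

    public static void main(String[] args)
    {
        // Same two calendars as above: midnight today in GMT-4, and midnight today in UTC.
        Calendar cal = Calendar.getInstance(TimeZone.getTimeZone("GMT-4"));
        cal.set(Calendar.HOUR_OF_DAY, 0);
        cal.set(Calendar.MINUTE, 0);
        cal.set(Calendar.SECOND, 0);
        cal.set(Calendar.MILLISECOND, 0);

        Calendar utcCal = Calendar.getInstance(TimeZone.getTimeZone("GMT"));
        utcCal.set(Calendar.HOUR_OF_DAY, 0);
        utcCal.set(Calendar.MINUTE, 0);
        utcCal.set(Calendar.SECOND, 0);
        utcCal.set(Calendar.MILLISECOND, 0);

        printUtc("cal   ", cal);
        printUtc("utcCal", utcCal);
    }
}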
I visualize the algorithm for calculating the time represented by cal taking one of two forms:

- Calculate the number of milliseconds from the Epoch, then add the offset (i.e. add -4 hours).
- Calculate the number of milliseconds from (Epoch + offset), i.e. the number of milliseconds from (Epoch - 4 * oneHourMilliseconds).

Either way, the result for cal should be 4 hours behind that of utcCal; however, running the code returns 4.
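To make that expectation concrete, here is a rough sketch of the arithmetic I have in mind. It is only my mental model, not a claim about what Calendar actually does internally, and the class and variable names are placeholders:

import java.util.Calendar;
import java.util.TimeZone;

public class ExpectedOffsetSketch
{
    public static void main(String[] args)
    {
        // The same UTC-midnight calendar as in the code above.
        Calendar utcCal = Calendar.getInstance(TimeZone.getTimeZone("GMT"));
        utcCal.set(Calendar.HOUR_OF_DAY, 0);
        utcCal.set(Calendar.MINUTE, 0);
        utcCal.set(Calendar.SECOND, 0);
        utcCal.set(Calendar.MILLISECOND, 0);

        long oneHourMilliseconds = 3600000;
        long utcMidnightMillis = utcCal.getTimeInMillis();

        // My mental model: either form above amounts to shifting the value by the
        // -4 hour offset, so I expect cal's millis to come out 4 hours smaller.
        long expectedCalMillis = utcMidnightMillis + (-4 * oneHourMilliseconds);

        // Which is why I expect the println in my program to output -4.
        System.out.println((expectedCalMillis - utcMidnightMillis) / oneHourMilliseconds);
    }
}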
Can someone explain to me why cal, despite being set to a time zone 4 hours behind that of utcCal, ends up having a millisecond value 4 hours after that of utcCal? Shouldn't the code be returning -4?