My goal is to parse a date string in the format yyyy-MM-dd hh:mm:ss.SSS for a given time zone, using only Java 6 without any external libraries. My first approach was to configure a SimpleDateFormat with a time zone. However, I cannot make sense of the results. Have a look at the following code:
import java.text.SimpleDateFormat;
import java.util.Arrays;
import java.util.Date;
import java.util.List;
import java.util.TimeZone;

List<String> zones = Arrays.asList("UTC", "CET", "Europe/London", "Europe/Berlin");
for (String zone : zones) {
    SimpleDateFormat df = new SimpleDateFormat("yyyy-MM-dd hh:mm:ss.SSS");
    df.setTimeZone(TimeZone.getTimeZone(zone));
    // parse(...) declares the checked java.text.ParseException
    Date result = df.parse("1970-01-01 00:00:00.000");
    System.out.println(zone + ": " + result.getTime());
}
The output is as follows:
UTC: 0
CET: -3600000
Europe/London: -3600000
Europe/Berlin: -3600000
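As an extra sanity check (this is additional debugging code I wrote, not part of the failing snippet), I also printed the offset each TimeZone reports, both its raw offset and the offset it applies at the epoch instant itself:

```java
import java.util.Arrays;
import java.util.TimeZone;

public class OffsetCheck {
    public static void main(String[] args) {
        for (String zone : Arrays.asList("UTC", "CET", "Europe/London", "Europe/Berlin")) {
            TimeZone tz = TimeZone.getTimeZone(zone);
            // getRawOffset(): offset from UTC in ms, ignoring DST
            // getOffset(0L):  offset from UTC in ms at the epoch instant
            System.out.println(zone + ": raw=" + tz.getRawOffset()
                    + " atEpoch=" + tz.getOffset(0L));
        }
    }
}
```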
The first two results are as expected: in UTC, the Unix epoch starts at exactly zero milliseconds, and there is a one-hour difference to CET.
What I don't understand are the values for London and Berlin. Since the milliseconds for London equal those for CET rather than UTC, I first assumed that daylight saving time is taken into account (it is currently summer in the northern hemisphere, and summer time in London is UTC+1, which equals CET). But then why is the value for Berlin the same? Shouldn't it be UTC+2 (or CET+1, if you like)?
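To test whether DST is applied based on the date being parsed rather than today's date, I also tried parsing a summer date. The pattern and zones are the same as above; "2000-07-01" is simply my guess at a date that falls into Central European Summer Time:

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.TimeZone;

public class DstCheck {
    public static void main(String[] args) throws ParseException {
        // A date that should fall into summer time in Berlin (CEST, UTC+2)
        String summer = "2000-07-01 00:00:00.000";

        SimpleDateFormat utc = new SimpleDateFormat("yyyy-MM-dd hh:mm:ss.SSS");
        utc.setTimeZone(TimeZone.getTimeZone("UTC"));

        SimpleDateFormat berlin = new SimpleDateFormat("yyyy-MM-dd hh:mm:ss.SSS");
        berlin.setTimeZone(TimeZone.getTimeZone("Europe/Berlin"));

        // If DST of the *parsed* date is honored, Berlin should be UTC+2 here,
        // i.e. the difference should be 2 * 3600 * 1000 ms.
        long diff = utc.parse(summer).getTime() - berlin.parse(summer).getTime();
        System.out.println("UTC - Berlin = " + diff + " ms");
    }
}
```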
My questions:

- What's the explanation for the results? Is DST really taken into account?
- Is there a better way to do this conversion in Java 6?