I'm experiencing a problem where a displayed date has its millisecond component apparently multiplied by 10. Specifically, the time 52.050 is shown as 52.50 when a .S format is used, but correctly as 52.050 when a .SSS format is used.
Take the following code example:
import java.text.SimpleDateFormat;
import java.time.LocalDateTime;
import java.time.ZoneId;
import java.time.format.DateTimeFormatter;
import java.util.Date;
import java.util.stream.Stream;

// Some arbitrary point with 50 milliseconds
final Date date = new Date(1620946852050L);
final LocalDateTime localDateTime = LocalDateTime.ofInstant(date.toInstant(), ZoneId.systemDefault());
final String format = "%-40s%-20s%-20s%n";

System.out.format(format, "Date Formatter", "Date Format", "Formatted Output");
Stream.of("HH:mm:ss", "HH:mm:ss.S", "HH:mm:ss.SS", "HH:mm:ss.SSS").forEach(dateFormat -> {
    System.out.println();
    System.out.format(format, SimpleDateFormat.class.getName(), dateFormat,
            new SimpleDateFormat(dateFormat).format(date));
    System.out.format(format, DateTimeFormatter.class.getName(), dateFormat,
            DateTimeFormatter.ofPattern(dateFormat).format(localDateTime));
});
This produces the following output:
Date Formatter                          Date Format         Formatted Output

java.text.SimpleDateFormat              HH:mm:ss            00:00:52
java.time.format.DateTimeFormatter      HH:mm:ss            00:00:52

java.text.SimpleDateFormat              HH:mm:ss.S          00:00:52.50
java.time.format.DateTimeFormatter      HH:mm:ss.S          00:00:52.0

java.text.SimpleDateFormat              HH:mm:ss.SS         00:00:52.50
java.time.format.DateTimeFormatter      HH:mm:ss.SS         00:00:52.05

java.text.SimpleDateFormat              HH:mm:ss.SSS        00:00:52.050
java.time.format.DateTimeFormatter      HH:mm:ss.SSS        00:00:52.050
I've used both java.util.Date and java.time to illustrate the unexpected behaviour. I'm aware that java.time is the better API, but I'd still like to understand the SimpleDateFormat behaviour. I'm running Java 14.0.2.12, but can reproduce this in 11.0.10.9.
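For what it's worth, the SimpleDateFormat javadoc lists S ("Millisecond") as a Number presentation type, so my working assumption is that the letter count only sets a minimum, zero-padded digit width for the raw 0-999 millisecond value, whereas DateTimeFormatter treats S as fraction-of-second. The following minimal sketch, isolating just the millisecond field (the Date value here is hypothetical, chosen so that field is exactly 50), appears consistent with that:

import java.text.SimpleDateFormat;
import java.util.Date;

// 50 ms after the epoch, so the millisecond field alone is 50
final Date fiftyMillis = new Date(50L);
// If S is a Number field, the letter count is only a minimum width
System.out.println(new SimpleDateFormat("S").format(fiftyMillis));   // 50
System.out.println(new SimpleDateFormat("SS").format(fiftyMillis));  // 50
System.out.println(new SimpleDateFormat("SSS").format(fiftyMillis)); // 050

If that reading is right, .S can never truncate 50 ms down to a one-digit fraction the way DateTimeFormatter does. Is that the intended semantics, or is something else going on?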