I have a message being received through Kafka which I know has a non-UTC timezone in it. When I use the org.apache.kafka.common.serialization.StringDeserializer to verify this, I get the correct timestamp in ISO 8601 format with the timezone offset:

{  "id": "e499f2e8-a50e-4ff8-a9fe-0eaf9d3314bf", "sent_ts": "2021-02-04T14:06:10+01:00" }

When I switch to the org.springframework.kafka.support.serializer.JsonDeserializer this is lost. My POJO looks like this:

public class MyMessage {

    @JsonProperty("id")
    private String id;

    @JsonProperty("sent_ts")
    private OffsetDateTime sentTs;

    @Override
    public String toString() {
        return "MyMessage{" +
                "id='" + id + '\'' +
                ", sentTs=" + sentTs +
                '}';
    }
}

When I log the message I receive I get:

MyMessage{id='e499f2e8-a50e-4ff8-a9fe-0eaf9d3314bf', sentTs=2021-02-04T13:06:10Z}
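For what it's worth, the same adjustment happens with a plain ObjectMapper outside Kafka entirely. A minimal sketch, assuming jackson-datatype-jsr310 is on the classpath, which reproduces exactly the Z timestamp above:

import java.time.OffsetDateTime;

import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.datatype.jsr310.JavaTimeModule;

public class OffsetRepro {

    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper().registerModule(new JavaTimeModule());
        // ADJUST_DATES_TO_CONTEXT_TIME_ZONE is enabled by default, so the +01:00 offset
        // is normalized to the context time zone (UTC) during deserialization.
        OffsetDateTime ts = mapper.readValue("\"2021-02-04T14:06:10+01:00\"", OffsetDateTime.class);
        System.out.println(ts); // prints 2021-02-04T13:06:10Z
    }
}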

I thought the JsonDeserializer must be using Jackson, so in my application.yml configuration I set:

spring.jackson:
    deserialization.ADJUST_DATES_TO_CONTEXT_TIME_ZONE: false

This didn't work. I also tried a customizer:

@Configuration
public class ObjectMapperBuilderCustomizer implements Jackson2ObjectMapperBuilderCustomizer {

    @Override
    public void customize(Jackson2ObjectMapperBuilder builder) {
        builder.modules(new JavaTimeModule());
        builder.featuresToDisable(DeserializationFeature.ADJUST_DATES_TO_CONTEXT_TIME_ZONE);
    }
}

That didn't work either.

I thought maybe it needs to be a property of the Kafka consumer, so I also tried:

spring:
  kafka:
    consumer:
      auto-offset-reset: earliest
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value-deserializer: org.springframework.kafka.support.serializer.JsonDeserializer
      properties:
        spring.jackson.deserialization.ADJUST_DATES_TO_CONTEXT_TIME_ZONE: false

Still doesn't work.

Is there a way to make the JsonDeserializer work properly and keep the correct timezone offset?

jbx

1 Answer


When you configure it like this, value-deserializer: org.springframework.kafka.support.serializer.JsonDeserializer, the instance of that class is created by the Apache Kafka client code, which is completely unaware of the Spring configuration.
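Roughly speaking, the client does something equivalent to the following (a simplified sketch of that mechanism, not the actual Kafka client source; the class and method names here are made up for illustration):

import java.util.Map;

import org.apache.kafka.common.serialization.Deserializer;

// Simplified illustration only: the real client goes through its own config machinery,
// but the net effect is the same reflective, Spring-unaware instantiation.
public class KafkaClientInstantiationSketch {

    @SuppressWarnings("unchecked")
    static Deserializer<Object> createValueDeserializer(String className, Map<String, Object> configs) throws Exception {
        Deserializer<Object> deserializer =
                (Deserializer<Object>) Class.forName(className).getDeclaredConstructor().newInstance();
        // Only the flat consumer property map is passed in; no Spring beans, no auto-configured ObjectMapper.
        deserializer.configure(configs, false); // false = value (not key) deserializer
        return deserializer;
    }
}

That is why your Jackson2ObjectMapperBuilderCustomizer and the spring.jackson.* properties have no effect on it.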

If you'd like to rely on the ObjectMapper configured by Spring Boot and your customizations, you should consider doing something like this:

@Bean
DefaultKafkaConsumerFactory<String, Object> kafkaConsumerFactory(KafkaProperties properties, ObjectMapper objectMapper) {
    // Build the consumer properties from application.yml (via KafkaProperties).
    Map<String, Object> consumerProperties = properties.buildConsumerProperties();
    // Use the Spring Boot auto-configured (and customized) ObjectMapper.
    JsonDeserializer<Object> jsonDeserializer = new JsonDeserializer<>(objectMapper);
    jsonDeserializer.configure(consumerProperties, false);

    return new DefaultKafkaConsumerFactory<>(consumerProperties,
            new StringDeserializer(), jsonDeserializer);
}

Pay attention to how I call jsonDeserializer.configure(consumerProperties, false);. This way you will still be able to configure the rest of the Kafka consumer properties in the application.yml.
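With that bean in place, the rest of the consumer configuration can stay in application.yml, and a plain listener receives the payload with the offset preserved. A hypothetical usage example (topic and group names are made up):

import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@Component
public class MyMessageListener {

    // Hypothetical topic and group id, for illustration only.
    @KafkaListener(topics = "my-topic", groupId = "my-group")
    public void onMessage(MyMessage message) {
        // sentTs now keeps the original +01:00 offset, because the Spring-configured
        // ObjectMapper (with ADJUST_DATES_TO_CONTEXT_TIME_ZONE disabled) is used.
        System.out.println(message);
    }
}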

Please consider raising a GH issue for Spring Boot, so we will revise how we deal with the JsonDeserializer and the auto-configured ObjectMapper to serve a better end-user experience.

Artem Bilan
  • This looks good, but then the `JsonDeserializer` will ignore the configuration, like the json default type, from the `application.yml`, am I right? – jbx Feb 05 '21 at 17:12
  • That's true, but I'm not sure what is wrong with using setters of this `JsonDeserializer` class... – Artem Bilan Feb 05 '21 at 17:18
  • Well it will be confusing if you set the settings in the yaml file and they don't work. Is there a way to get the default `JsonDeserializer` with the right configuration loaded, and then changing its `ObjectMapper` configuration? I just need to turn this Time Zone offset thing off. – jbx Feb 05 '21 at 17:24
  • Well, for me it's more confusing to configure a Java object in the yaml. I really wish we hadn't introduced such a configuration in the first place... It really looks more natural to have objects configured via Java code, especially when there is a dependency between them. – Artem Bilan Feb 05 '21 at 17:52
  • In the end I think you can call that `configure(properties.buildConsumerProperties(), false)` on this `JsonDeserializer` to simulate Kafka client logic. – Artem Bilan Feb 05 '21 at 17:52
  • I disagree with your configuration-in-code concept. The whole point of Spring Boot is to remove boilerplate code and have code dedicated as much as possible to your business logic only. But anyway, irrespective of our opinions, the functionality is there and it would be unwise to turn it off with a line hidden somewhere in the code; it would just confuse developers who try to maintain our code. – jbx Feb 06 '21 at 07:47
  • Yes I think your solution works! Can you update your answer and add this too? – jbx Feb 06 '21 at 07:47
  • I've updated my answer to show the `configure()` trick. Also, I ask you to raise a GH issue, so we can honor your request one way or another. – Artem Bilan Feb 08 '21 at 15:03
  • Good idea re github issue. – jbx Feb 08 '21 at 21:36
  • I've added this solution to my Spring Boot/Cloud Stream application, but it is still not resolving deserialization errors that my custom Jackson mixin fixes. I can step into the bean creation and see that the injected ObjectMapper does have my mixin, but it's somehow not being used during Kafka value deserialization as expected. I've also tried 2 other variations mentioned in the related GitHub issue. Any ideas what to look at? – E-Riz Aug 26 '21 at 17:31