
I am using Jackson to deserialize a JSON response into a DTO.

For a given response:

[ {
    "foo": "bar",
    "value": "hello world"
},
{
    "foo": "bar",
    "value": "hello world"
},
{
    "foo": "bar",
    "value": "hello world"
},
{
    "foo": "bar"
} ]

You can see that all the objects in this array follow the same schema except the last one, which is missing the "value" field.

After looking at some other related questions on StackOverflow:

deserializing json using jackson with missing fields

Ignore null fields when DEserializing JSON with Gson or Jackson

The approaches there still create an object from the irregular JSON object.

This means I then need to iterate through the resulting list and remove any object that does not have the "value" attribute, by implementing a cleaning method (sketched below).
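Roughly, the post-processing I am trying to avoid looks like this (a minimal sketch only; the ObjectMapper setup, the FooCleaner/readAndClean names and a value() accessor symmetric to foo() are my assumptions, not code from the original post):

    import com.fasterxml.jackson.core.type.TypeReference;
    import com.fasterxml.jackson.databind.ObjectMapper;

    import java.io.IOException;
    import java.util.List;

    public final class FooCleaner {

        private static final ObjectMapper MAPPER = new ObjectMapper();

        /** Deserializes the whole array, then strips out incomplete entries afterwards. */
        public static List<FooDTO> readAndClean(final String json) throws IOException {
            final List<FooDTO> dtos = MAPPER.readValue(json, new TypeReference<List<FooDTO>>() {});
            dtos.removeIf(dto -> dto.value() == null); // the extra cleaning pass I would rather not write
            return dtos;
        }
    }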

What I want to know is can Jackson do this business logic for me?

If this JSON object has all the required fields
Then create a new DTO
Else
Skip to next JSON object in response
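
To make that rule concrete, this is how I would express the same check by hand with Jackson's tree model (purely an illustration of the behaviour I am after; FooFilter and readComplete are hypothetical names, and this is exactly the kind of code I would prefer not to keep around):

    import com.fasterxml.jackson.databind.JsonNode;
    import com.fasterxml.jackson.databind.ObjectMapper;

    import java.io.IOException;
    import java.util.ArrayList;
    import java.util.List;

    public final class FooFilter {

        private static final ObjectMapper MAPPER = new ObjectMapper();

        /** Binds only the array elements that carry every required field; the rest are skipped. */
        public static List<FooDTO> readComplete(final String json) throws IOException {
            final List<FooDTO> dtos = new ArrayList<>();
            for (final JsonNode node : MAPPER.readTree(json)) {
                if (node.hasNonNull("foo") && node.hasNonNull("value")) {
                    dtos.add(MAPPER.treeToValue(node, FooDTO.class));
                }
            }
            return dtos;
        }
    }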

My DTO with Jackson annotations:

    @JsonIgnoreProperties(ignoreUnknown = true)
    @JsonInclude(JsonInclude.Include.NON_NULL)
    public final class FooDTO {

        private final String foo;
        private final String value;

        /**
         * Constructor
         *
         * @param foo           String
         * @param value String
         */
        @JsonCreator
        public FooDTO(@JsonProperty("foo") final String foo, @JsonProperty("value") final String value) {
            this.foo = checkNotNull(foo, "foo required");
            this.value = value;
        }

        /**
         * @return String
         */
        public String foo() {
            return foo;
        }

    etc...

The desired result for the given JSON response is that 3 DTOs are initialised, rather than 4.

tomaytotomato
  • http://stackoverflow.com/a/37722790/4969140 does this satisfy your needs? – Adrian Jałoszewski Sep 09 '16 at 13:16
  • This is just my opinion, but I don't think that this logic should be placed in a serialization library. Looking at the endpoint that is publishing the data, it will be unclear that entire chunks will be omitted transparently. – Jon Peterson Sep 09 '16 at 13:23
  • I beg to differ, @JonPeterson. There are at least two good reasons why this logic does belong in the serialization (receipt of input) phase: 1) this seems to me like a classic input validation rule, and you want to detect and discard (or handle in some way) bad input as soon as possible, and 2) performance considerations: the parser has already iterated over the JSON array, so your suggestion adds a second in-memory iteration just for the sake of input validation. – Sharon Ben Asher Sep 11 '16 at 05:43
  • This is for serialization, not deserialization, and thus appears to be for output, not input. So 1. this would be as-late-as-possible, not as-early-as-possible, and 2. iterating over the array prior to serialization would be roughly equivalent to doing it in a serializer as the data is streamed in. Even if it were 1ms slower (it would have to be a pretty big set of data), I would personally much rather have the logic obvious than be someone coming along later wondering why certain items aren't in the output. Then again, I've never been a big AOP-all-the-things fan either. – Jon Peterson Sep 12 '16 at 12:23

1 Answer


Maybe the annotation below on the class might help?

@JsonIgnoreProperties(ignoreUnknown = true)
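
For example, a minimal sketch of reading the question's array with that annotation already on FooDTO (the Demo class, the ObjectMapper setup and the hard-coded sample JSON are my assumptions, not part of the original answer):

    import com.fasterxml.jackson.core.type.TypeReference;
    import com.fasterxml.jackson.databind.ObjectMapper;

    import java.util.List;

    public class Demo {
        public static void main(String[] args) throws Exception {
            String json = "[{\"foo\":\"bar\",\"value\":\"hello world\"},{\"foo\":\"bar\"}]";
            ObjectMapper mapper = new ObjectMapper();
            // ignoreUnknown = true tells Jackson to silently drop any JSON field that has
            // no matching property on FooDTO, instead of failing with UnrecognizedPropertyException.
            List<FooDTO> dtos = mapper.readValue(json, new TypeReference<List<FooDTO>>() {});
            System.out.println(dtos.size() + " DTOs bound"); // one per array element in the sample
        }
    }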

HARDI