
We are using MongoDB as our data store, but we want to use Jackson for serialization/deserialization (the Mongo POJO codec classes don't handle nearly as many scenarios as Jackson does - builders, for example).
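
For context, this is the kind of builder-based class we want Jackson to bind (a minimal sketch - the Event class and its fields are made up for illustration, not our actual model):

import com.fasterxml.jackson.databind.annotation.JsonDeserialize;
import com.fasterxml.jackson.databind.annotation.JsonPOJOBuilder;

// Immutable POJO that can only be constructed through its builder -
// Jackson binds it via @JsonDeserialize/@JsonPOJOBuilder.
@JsonDeserialize(builder = Event.Builder.class)
public class Event {
    private final long dateStamp;

    private Event(Builder builder) {
        this.dateStamp = builder.dateStamp;
    }

    public long getDateStamp() {
        return dateStamp;
    }

    @JsonPOJOBuilder(withPrefix = "")
    public static class Builder {
        private long dateStamp;

        public Builder dateStamp(long dateStamp) {
            this.dateStamp = dateStamp;
            return this;
        }

        public Event build() {
            return new Event(this);
        }
    }
}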

We have this working using a custom CodecProvider - here's the codec itself:

class JacksonCodec<T> implements Codec<T> {

    private final ObjectMapper objectMapper;
    private final Codec<RawBsonDocument> rawBsonDocumentCodec;
    private final Class<T> type;

    public JacksonCodec(ObjectMapper objectMapper,
                        CodecRegistry codecRegistry,
                        Class<T> type) {
        this.objectMapper = objectMapper;
        this.rawBsonDocumentCodec = codecRegistry.get(RawBsonDocument.class);
        this.type = type;
    }

    @Override
    public T decode(BsonReader reader, DecoderContext decoderContext) {
        try {
            RawBsonDocument document = rawBsonDocumentCodec.decode(reader, decoderContext);
            String json = document.toJson();
            return objectMapper.readValue(json, type);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    @Override
    public void encode(BsonWriter writer, T value, EncoderContext encoderContext) {
        try {
            String json = objectMapper.writeValueAsString(value);
            rawBsonDocumentCodec.encode(writer, RawBsonDocument.parse(json), encoderContext);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    @Override
    public Class<T> getEncoderClass() {
        return this.type;
    }
}
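
For reference, the provider/registry wiring looks roughly like this (a sketch only - the JacksonCodecProvider name and the package check are simplified for illustration, not the exact code we run):

import com.fasterxml.jackson.databind.ObjectMapper;
import com.mongodb.MongoClient;
import org.bson.codecs.Codec;
import org.bson.codecs.configuration.CodecProvider;
import org.bson.codecs.configuration.CodecRegistries;
import org.bson.codecs.configuration.CodecRegistry;

public class JacksonCodecProvider implements CodecProvider {

    private final ObjectMapper objectMapper;

    public JacksonCodecProvider(ObjectMapper objectMapper) {
        this.objectMapper = objectMapper;
    }

    @Override
    public <T> Codec<T> get(Class<T> clazz, CodecRegistry registry) {
        // Only hand out the Jackson-backed codec for our own POJOs;
        // returning null lets the registry fall through to the next provider.
        if (clazz.getPackage() != null
                && clazz.getPackage().getName().startsWith("com.example.model")) {
            return new JacksonCodec<>(objectMapper, registry, clazz);
        }
        return null;
    }

    // Combine with the driver's default registry so RawBsonDocument etc. still resolve,
    // then pass the result via MongoClientOptions.builder().codecRegistry(...).
    public static CodecRegistry buildRegistry(ObjectMapper mapper) {
        return CodecRegistries.fromRegistries(
                MongoClient.getDefaultCodecRegistry(),
                CodecRegistries.fromProviders(new JacksonCodecProvider(mapper)));
    }
}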

This works fine until we retrieve a document from Mongo that has a long value greater than Integer.MAX_VALUE. When that happens, deserialization fails with the following message:

Caused by: com.fasterxml.jackson.databind.JsonMappingException: Can not deserialize instance of long out of START_OBJECT token.

Looking at the BSON, here's how the Mongo data is coming back to us:

"dateStamp" : { "$numberLong" : "1514334498165" }

So I'm thinking I need to register an additional deserializer with Jackson to handle this case (check for a token type of ID_START_OBJECT, parse the wrapper if it's there, otherwise delegate to the built-in deserializer). I tried registering a simple Long deserializer via a SimpleModule on the ObjectMapper:

public class BsonLongDeserializer extends JsonDeserializer<Long> {

    @Override
    public Class<Long> handledType() {
        return Long.class;
    }

    @Override
    public Long deserialize(JsonParser p, DeserializationContext ctxt) throws IOException, JsonProcessingException {
        if (p.currentTokenId() != JsonTokenId.ID_START_OBJECT) {
            // have to figure out how to do this for real if we can get the deserializer to actually get called
            return ctxt.readValue(p, Long.class);
        }
        return null;
    }
}

and register it:

private static ObjectMapper createMapper() {
    SimpleModule module = new SimpleModule();
    module.addDeserializer(Long.class, new BsonLongDeserializer());

    ObjectMapper mapper = new ObjectMapper()
            .configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false)
            .registerModule(module);

    return mapper;
}

but the BsonLongDeserializer never gets called by Jackson (maybe primitives are handled differently and short-circuit the registered deserializers?).
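
For what it's worth, if the deserializer ever did get invoked, the idea was for the START_OBJECT branch to unwrap the $numberLong field, roughly like this (a sketch only, assuming the extended JSON shape shown above; needs com.fasterxml.jackson.core.JsonToken in addition to the imports already in the class):

@Override
public Long deserialize(JsonParser p, DeserializationContext ctxt) throws IOException {
    if (p.currentTokenId() != JsonTokenId.ID_START_OBJECT) {
        return p.getLongValue(); // ordinary numeric token - nothing special to do
    }
    // Walk the { "$numberLong" : "<digits>" } wrapper object
    Long result = null;
    while (p.nextToken() != JsonToken.END_OBJECT) {
        String field = p.getCurrentName();
        p.nextToken(); // advance to the field's value
        if ("$numberLong".equals(field)) {
            result = Long.parseLong(p.getText()); // the value is a quoted string
        } else {
            p.skipChildren(); // ignore anything unexpected
        }
    }
    return result;
}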

Jackson version 2.9.3. MongoDB driver version 3.6.

If anyone has any suggestions on angles to attack this, I would appreciate hearing them.

Referenced articles that don't seem to help: MongoDB "NumberLong/$numberLong" issue while converting back to Java Object


1 Answer


I got it working by fixing the Mongo side of things: a JsonWriterSettings object tells the driver to write longs as plain JSON numbers instead of the extended JSON $numberLong wrapper. This came from here: converting Document objects in MongoDB 3 to POJOS

The codec now looks like this:

class JacksonCodec<T> implements Codec<T> {
    private final ObjectMapper objectMapper;
    private final Codec<BsonDocument> rawBsonDocumentCodec;
    private final Class<T> type;

    public JacksonCodec(ObjectMapper objectMapper,
                        CodecRegistry codecRegistry,
                        Class<T> type) {
        this.objectMapper = objectMapper;
        this.rawBsonDocumentCodec = codecRegistry.get(BsonDocument.class);
        this.type = type;
    }

    @Override
    public T decode(BsonReader reader, DecoderContext decoderContext) {
        try {
            // https://stackoverflow.com/questions/35209839/converting-document-objects-in-mongodb-3-to-pojos
            JsonWriterSettings settings = JsonWriterSettings.builder()
                    .int64Converter((value, writer) -> writer.writeNumber(value.toString()))
                    .build();

            BsonDocument document = rawBsonDocumentCodec.decode(reader, decoderContext);
            String json = document.toJson(settings);
            return objectMapper.readValue(json, type);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    @Override
    public void encode(BsonWriter writer, T value, EncoderContext encoderContext) {
        try {
            String json = objectMapper.writeValueAsString(value);
            rawBsonDocumentCodec.encode(writer, RawBsonDocument.parse(json), encoderContext);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    @Override
    public Class<T> getEncoderClass() {
        return this.type;
    }
}
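
To see what the converter actually changes, here's a quick standalone sketch (the document literal is made up, and the outputs in the comments are approximate):

import org.bson.BsonDocument;
import org.bson.json.JsonWriterSettings;

public class Int64ConverterDemo {
    public static void main(String[] args) {
        BsonDocument doc = BsonDocument.parse("{ \"dateStamp\" : { \"$numberLong\" : \"1514334498165\" } }");

        // Default settings: extended JSON, e.g. { "dateStamp" : { "$numberLong" : "1514334498165" } }
        System.out.println(doc.toJson());

        // With the int64Converter from the answer: a plain number, e.g. { "dateStamp" : 1514334498165 }
        JsonWriterSettings settings = JsonWriterSettings.builder()
                .int64Converter((value, writer) -> writer.writeNumber(value.toString()))
                .build();
        System.out.println(doc.toJson(settings));
    }
}
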
  • Just curious, you seem to be using Jackson's ObjectMapper, wouldn't ObjectMapper objectMapper = new ObjectMapper( new BsonFactory() ) have done the trick? – Greg S Oct 13 '20 at 02:15
  • It's been several years, so my memory here is pretty hazy. The problem at the time (which may very well be resolved now) was that BsonFactory really didn't play well with the mongo codec concept. At this point, I really can't remember why, but I know I put a crazy amount of time into trying to get this to work. At the end of the day, the approach that I had to use was really not efficient (lots of String transformations). I still can't figure out why Mongo felt the need to roll their own json factory... – Kevin Day Oct 14 '20 at 03:44
  • Kevin Day - Sorry. I looked at the date after I posted the comment. I know that feeling of putting crazy time into something that seems trivial very well. I have been trying to just transform BSON into a JSON string, and I got it working after I read this post. – Greg S Oct 15 '20 at 02:56