6

So I had an idea that's really over my head, since I've only been programming for a short while: I wanted to build a reactive Spring WebFlux app that exposes JSON endpoints to a React frontend.

The problem started when I decided to use the jsonb type in Postgres, because I thought I could use JSON all the way from the database up to the frontend layer.

When I try to SELECT from the table containing jsonb using the reactive R2DBC drivers, I get the following error:

Caused by: java.lang.IllegalArgumentException: 3802 is not a valid object id

I have a table in Postgres that looks like this:

Column  |  Type   | Collation | Nullable |           Default
---------+---------+-----------+----------+------------------------------
 id      | integer |           | not null | generated always as identity
 details | jsonb   |           |          |
Indexes:
    "snacks_new_pkey" PRIMARY KEY, btree (id)

If I extract the values as text instead, it works fine in Spring WebFlux, since the result is no longer jsonb:

"SELECT id, details->>'name' as NAME, details->>'price' AS PRICE, details->>'quantity' AS QUANTITY FROM snacks_new"

I've seen some examples of converting jsonb to a JSON object using the older blocking drivers, but I can't get the same to work with the newer non-blocking ones; I can't access the jsonb values in any way.

So I really have two questions: how can I SELECT from a table that contains jsonb using the reactive drivers? And am I wasting my time trying to do this; is extracting the JSON as text and creating a normal POJO from it enough?

Thanks for your time!

Xenofono

3 Answers

8

Update: Please upgrade to R2DBC Postgres 0.8.0.RC1.

The driver recently added support for the JSON and JSONB types (the error above comes from 3802 being the Postgres object ID of jsonb, which older driver versions could not decode). You can consume JSON as String, byte[], or the io.r2dbc.postgresql.codec.Json type:

// Read as Json
connection.createStatement("SELECT my_json FROM my_table")
        .execute()
        .flatMap(it -> it.map((row, rowMetadata) -> row.get("my_json", Json.class)))
        .map(Json::asString)

// Read as String
connection.createStatement("SELECT my_json FROM my_table")
        .execute()
        .flatMap(it -> it.map((row, rowMetadata) -> row.get("my_json", String.class)))

// Write JSON
connection.createStatement("INSERT INTO my_table (my_json) VALUES($1)")
        .bind("$1", Json.of("{\"hello\": \"world\"}"))
        .execute()

// Write JSON as String using ::JSON casting
connection.createStatement("INSERT INTO my_table (my_json) VALUES($1::JSON)")
        .bind("$1", "{\"hello\": \"world\"}")
        .execute()

Please note that when you want to bind JSON values in a SELECT, INSERT, or UPDATE, you must either use the driver's Json type or cast the bound value to JSON with $1::JSON.
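
For example, a containment query that binds a JSON fragment in a SELECT could look like this (a sketch; the @> operator and the $1::JSONB cast are plain Postgres, and the table name follows the question):

// Bind a plain String and cast it in SQL so Postgres treats it as JSONB
connection.createStatement("SELECT id, details FROM snacks_new WHERE details @> $1::JSONB")
        .bind("$1", "{\"name\": \"crisps\"}")
        .execute()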

You can also provide your own JsonCodec implementation leveraging the driver's CodecRegistrar if you want to, e.g., use Gson or Jackson to serialize/deserialize values at the driver level.

mp911de
6

As explained by @mp911de, R2DBC now includes a codec for Json. But defining your entity attributes as String or byte[] is not very comfortable, and the io.r2dbc.postgresql.codec.Json type is not very portable. Moreover, you'll need to define a custom serializer/deserializer if you want to use it in a REST API.

A better option is to define custom converters between io.r2dbc.postgresql.codec.Json and JsonNode and use the latter type for your attributes:

import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import io.r2dbc.postgresql.codec.Json;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.convert.ReadingConverter;
import org.springframework.data.convert.WritingConverter;
import org.springframework.data.r2dbc.convert.R2dbcCustomConversions;
import org.springframework.data.r2dbc.dialect.PostgresDialect;

@Configuration
public class ReactivePostgresConfig {

    private static final Logger LOG = LoggerFactory.getLogger(ReactivePostgresConfig.class);

    private final ObjectMapper objectMapper;

    public ReactivePostgresConfig(ObjectMapper objectMapper) {
        this.objectMapper = objectMapper;
    }

    @Bean
    public R2dbcCustomConversions customConversions() {
        List<Converter<?, ?>> converters = new ArrayList<>();
        converters.add(new JsonToJsonNodeConverter(objectMapper));
        converters.add(new JsonNodeToJsonConverter(objectMapper));
        return R2dbcCustomConversions.of(PostgresDialect.INSTANCE, converters);
    }

    @ReadingConverter
    static class JsonToJsonNodeConverter implements Converter<Json, JsonNode> {

        private final ObjectMapper objectMapper;

        public JsonToJsonNodeConverter(ObjectMapper objectMapper) {
            this.objectMapper = objectMapper;
        }

        @Override
        public JsonNode convert(Json json) {
            try {
                return objectMapper.readTree(json.asString());
            } catch (IOException e) {
                LOG.error("Problem while parsing JSON: {}", json, e);
            }
            return objectMapper.createObjectNode();
        }
    }

    @WritingConverter
    static class JsonNodeToJsonConverter implements Converter<JsonNode, Json> {

        private final ObjectMapper objectMapper;

        public JsonNodeToJsonConverter(ObjectMapper objectMapper) {
            this.objectMapper = objectMapper;
        }

        @Override
        public Json convert(JsonNode source) {
            try {
                return Json.of(objectMapper.writeValueAsString(source));
            } catch (JsonProcessingException e) {
                LOG.error("Error occurred while serializing JsonNode to JSON: {}", source, e);
            }
            return Json.of("");
        }
    }
}
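
With these converters registered, entity attributes can be declared as plain JsonNode and Spring Data R2DBC maps them automatically. A sketch, assuming a hypothetical Snack entity for the question's table:

import com.fasterxml.jackson.databind.JsonNode;
import org.springframework.data.annotation.Id;
import org.springframework.data.relational.core.mapping.Table;
import org.springframework.data.repository.reactive.ReactiveCrudRepository;

@Table("snacks_new")
public class Snack {

    @Id
    private Integer id;

    // Converted from/to io.r2dbc.postgresql.codec.Json by the converters above
    private JsonNode details;

    // getters and setters omitted for brevity
}

public interface SnackRepository extends ReactiveCrudRepository<Snack, Integer> {
}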


Aníbal
  • Thanks for the extension, I have a question: are these converters only relevant to native invocations, or do they also integrate with Spring Data R2DBC? – peterzinho16 Apr 14 '22 at 16:53
  • 1
    I don't quite understand the question. Those are Spring Data converters, used automatically by Spring Data to map between R2DBC types and DTO attributes. – Aníbal Apr 14 '22 at 18:57
  • Sorry, my bad, I was defining the R2dbcCustomConversions incorrectly – peterzinho16 Apr 14 '22 at 21:44
1

Update:

At the original time of writing (September 15, 2019) the driver did not support JSON, but JSON is now supported from version 0.8.0 onwards.

Old answer:

I'm sorry to say that I think you are, as you put it, wasting your time.

Looking at the GitHub repository for the R2DBC drivers, you can see in their table of supported data types that JSON is, as of now, not supported.

Toerktumlare