I have a small Spring Boot-based prototype that publishes messages to a Kafka cluster using Protobuf. I'm using the Confluent serializer and deserializer:
io.confluent.kafka.serializers.protobuf.KafkaProtobufSerializer
io.confluent.kafka.serializers.protobuf.KafkaProtobufDeserializer
I'm also running the Schema Registry from Confluent (latest version) to handle the Protobuf schemas. Everything works as expected.
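For context, the producer setup is roughly the following; the broker and registry URLs are just local defaults, and MyPayload / myPayload stand in for one of my generated Protobuf message classes and an instance of it:

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;
import io.confluent.kafka.serializers.protobuf.KafkaProtobufSerializer;

Properties props = new Properties();
props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, KafkaProtobufSerializer.class);
props.put("schema.registry.url", "http://localhost:8081");

// The serializer registers the schema of the whole value class with the Schema Registry on first send
KafkaProducer<String, MyPayload> producer = new KafkaProducer<>(props);
producer.send(new ProducerRecord<>("my-topic", myPayload));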
Now I would like to introduce the CloudEvents spec (https://github.com/cloudevents/spec), but I'm struggling to understand how it can work with the Confluent Schema Registry.
CloudEvents has an SDK module that can serialize a message directly to Protobuf. The data section of the message is where my versioned payload should go, but there is no way to define a schema for only that section of the message. To be clearer:
CloudEvent event = CloudEventBuilder.v1()
        .withId(UUID.randomUUID().toString())
        .withType("example.vertx")
        .withSource(URI.create("http://localhost"))
        .withData(???) // <-- here is where my payload should be versioned
        .build();
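For reference, the SDK's CloudEventBuilder only accepts the data as raw bytes (or a CloudEventData wrapper), so the best I can see is pre-serializing the payload myself, which means the registry never gets a schema for it (myPayload again stands for one of my generated Protobuf messages):

// the payload goes in as opaque bytes, so the Schema Registry never sees its schema
.withData("application/protobuf", myPayload.toByteArray())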
One option is to replicate the CloudEvents Protobuf schema and embed my message definition in each Protobuf schema file. This would let me keep using the standard Protobuf Kafka serde without pulling in any CloudEvents library, but it has the disadvantage that I have to copy/paste the CloudEvents Protobuf schema for every new message type (rough sketch below). Is there a better solution?
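For completeness, the copy/paste approach would look roughly like this on the producer side, assuming MyCloudEvent is generated from my own copy of the CloudEvents Protobuf schema with the data field retyped to MyPayload (both names are hypothetical):

// envelope generated from a copied CloudEvents-style .proto where data is declared as MyPayload
MyCloudEvent event = MyCloudEvent.newBuilder()
        .setId(UUID.randomUUID().toString())
        .setType("example.vertx")
        .setSource("http://localhost")
        .setData(myPayload) // the registry now versions the whole envelope, payload included
        .build();
// published with the plain KafkaProtobufSerializer, no CloudEvents library involved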