I'm very new to Kafka and Confluent. I wrote a producer nearly identical to the tutorial at https://www.confluent.fr/blog/schema-registry-avro-in-spring-boot-application-tutorial/ with my own dummy model, and the application.yaml is the same as well. When I send the message to ccloud, the messages I get back look like gibberish.
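
Roughly, the producer boils down to the sketch below (the Weather class, "Weather" topic, and "dummyKey" key are taken from the message shown further down; the class layout itself is assumed):

import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

// Simplified sketch of the producer. As in the tutorial, application.yaml sets
// value-serializer to io.confluent.kafka.serializers.KafkaAvroSerializer and points
// schema.registry.url at Confluent Cloud; Weather is the Avro-generated POJO.
@Service
public class WeatherProducer {

    private final KafkaTemplate<String, Weather> kafkaTemplate;

    public WeatherProducer(KafkaTemplate<String, Weather> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    public void send(Weather weather) {
        // KafkaAvroSerializer registers the schema with Schema Registry and writes the
        // value as binary Avro prefixed with a magic byte and the schema ID.
        kafkaTemplate.send("Weather", "dummyKey", weather);
    }
}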

Any idea as to how to fix this? When I do a System.out.println of the Avro POJO before sending it to Kafka, the object looks good with all the proper values.

{
  "locationId": 1, 
  "time": 1575950400, 
  "temperature": 9.45, 
  "summary": "Overcast", 
  "icon": "cloudy", 
  "precipitationProbability": 0.24,
  ...

Whereas when I download the message from ccloud, the value looks like this

[ 
 {
   "topic":"Weather",
   "partition":0,
   "offset":14,
   "timestamp":1576008230509,
   "timestampType":"CREATE_TIME",
   "headers":[],
   "key":"dummyKey",
   "value":"\u0000\u0000\u0001��\u0002\u0002����\u000b\u0002fffff�\"@\
   ...
}
– Prady

1 Answer

You're actually doing everything right :) What you're hitting is just a current limitation of the Confluent Cloud GUI in rendering Avro messages: the value on the wire is binary Avro, prefixed with a magic byte and the schema ID, and the GUI simply shows those raw bytes as text.

If you consume the message as Avro you'll see that everything is fine. Here's an example of consuming the message from Confluent Cloud using kafkacat:

$ source .env
$ docker run --rm edenhill/kafkacat:1.5.0 \
          -X security.protocol=SASL_SSL -X sasl.mechanisms=PLAIN \
          -X ssl.ca.location=/etc/ssl/cert.pem -X api.version.request=true \
          -b ${CCLOUD_BROKER_HOST}:9092 \
          -X sasl.username="${CCLOUD_API_KEY}" \
          -X sasl.password="${CCLOUD_API_SECRET}" \
          -r https://"${CCLOUD_SCHEMA_REGISTRY_API_KEY}":"${CCLOUD_SCHEMA_REGISTRY_API_SECRET}"@${CCLOUD_SCHEMA_REGISTRY_HOST} \
          -s avro \
          -t mssql-04-mssql.dbo.ORDERS \
          -f '"'"'Topic %t[%p], offset: %o (Time: %T)\nHeaders: %h\nKey: %k\nPayload (%S bytes): %s\n'"'"' \
          -C -o beginning -c1


Topic mssql-04-mssql.dbo.ORDERS[2], offset: 110 (Time: 1576056196725)
Headers:
Key:
Payload (53 bytes): {"order_id": {"int": 1345}, "customer_id": {"int": 11}, "order_ts": {"int": 18244}, "order_total_usd": {"double": 2.4399999999999999}, "item": {"string": "Bread - Corn Muffaleta Onion"}}

This is the same topic shown here, with the binary Avro value field:

[screenshot: the same topic in the Confluent Cloud GUI, with the binary Avro value field]
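
If you also want the decoded value back in your Spring application rather than on the command line, a consumer using io.confluent.kafka.serializers.KafkaAvroDeserializer (with specific.avro.reader=true and the same schema.registry.url as the producer) will hand you the POJO. A rough sketch, with the group id and class names assumed:

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Service;

// Rough sketch of a matching consumer; assumes application.yaml configures the
// key/value deserializers (StringDeserializer / KafkaAvroDeserializer),
// specific.avro.reader=true, and the Confluent Cloud Schema Registry credentials.
@Service
public class WeatherConsumer {

    // The group id is anything you choose; it just names this consumer group.
    @KafkaListener(topics = "Weather", groupId = "weather-consumer")
    public void listen(ConsumerRecord<String, Weather> record) {
        // The value arrives as the Avro-generated POJO, not raw bytes.
        System.out.println(record.key() + " -> " + record.value());
    }
}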

– Robin Moffatt
  • Thank you very much. Does this mean that if I use the same Deserializer and make a Consumer, I should be able to see the messages properly on the Consumer side? – Prady Dec 11 '19 at 13:14
  • While making the consumer, how do we find or set the group_id? I find that while consuming the messages, I still get the same weird-looking message – Prady Dec 11 '19 at 14:01
  • oops, the messages are consumed fine, I had a small typo. Ignore the last part – Prady Dec 11 '19 at 14:07
  • @Robin Moffatt has this been fixed yet? I am running into the same issue when using Confluent Cloud and am wondering why this is happening. It does not happen for me when my application produces to a local Docker cluster of Confluent Platform 6.0. – Twist Nov 16 '20 at 19:14