
I am facing an issue while sending a huge JSON (~40 MB) from a Kafka producer to a consumer.

I referred to other similar posts on Stack Overflow but am still unable to succeed.

I set these parameters in my Producer and Consumer:

import json
from kafka import KafkaConsumer, KafkaProducer

producer = KafkaProducer(bootstrap_servers=kafkaConf.bootstrap_servers,
                         value_serializer=lambda v: json.dumps(v).encode('utf-8'),
                         max_request_size=101626282,
                         buffer_memory=101626282)

consumer = KafkaConsumer(value_deserializer=lambda m: json.loads(m.decode('utf-8')),
                         fetch_max_bytes=101626282)

As you can see, I am allowing a maximum of ~100 MB.
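
For context, the actual send/consume calls are nothing special. Roughly (the topic name and payload here are just placeholders):

# placeholder topic name and payload, only to show the calls being made
big_json_payload = {"records": ["..."]}  # stand-in; ~40 MB of JSON in reality

future = producer.send('large-json-topic', big_json_payload)
producer.flush()  # block until the batch is actually sent; errors also surface via future.get()

consumer.subscribe(['large-json-topic'])
for message in consumer:
    print(len(json.dumps(message.value)))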

I even compressed the data, but was still unsuccessful.
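
By compressing, I mean roughly this: producer-side compression via kafka-python's compression_type parameter (gzip is just an example codec; the other settings are unchanged from above):

# same producer settings as above, with gzip compression enabled (codec choice is arbitrary)
producer = KafkaProducer(bootstrap_servers=kafkaConf.bootstrap_servers,
                         value_serializer=lambda v: json.dumps(v).encode('utf-8'),
                         compression_type='gzip',
                         max_request_size=101626282,
                         buffer_memory=101626282)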

Could somebody help me figure out what else I need to configure?

I would really appreciate it.

Thanks!

  • Are you getting this error on the producer side or the consumer side? Can you share all the relevant config of whichever one is failing? – Liju John Mar 30 '18 at 05:08
  • Can you check this once: https://stackoverflow.com/questions/21020347/how-can-i-send-large-messages-with-kafka-over-15mb – shakeel Mar 30 '18 at 12:17
  • @Shakeel, I referred to that same post earlier and, based on my understanding, added all the relevant config in the code snippet shown above. Correct me if I am wrong. – Aniruddh Khera Mar 30 '18 at 19:33
  • @LijuJohn, it's failing on the broker side. Even after dividing my huge JSON array into chunks, it's still complaining, and the result never reaches the consumer. Besides adding the configs shown above, do I also need to set config for the broker in Kafka's server.properties file? – Aniruddh Khera Mar 30 '18 at 20:07
  • @AniruddhKhera I had the same issue. As in, I didn't need to consume that much data, but still. The answer to [this](https://stackoverflow.com/questions/49565386/kafka-python-messagesizetoolargeerror-even-after-increasing-max-request-size#comment86169423_49565386) is **Yes**. Please follow this [answer](https://stackoverflow.com/a/21343878/4039768) and put the mentioned property values in the **server.properties** file (sketched after these comments). It worked for me. Though, please let me know if that's an improper way to do it; I'm open to improvements. – Mohsin Aljiwala Sep 28 '18 at 19:12
  • You could use compression on the producer side; in my case, compression helped me send a huge JSON message. Check the response [here](https://stackoverflow.com/questions/51767879/not-able-to-send-large-messages-to-kafka), because even after I changed max_request_size the producer didn't allow me to send the huge message. – ontananza Jul 15 '22 at 00:53
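
Following the answer linked in the comments, the broker-side settings live in Kafka's server.properties. A sketch of what that could look like, with values sized to match the ~100 MB client configs above (the exact numbers are placeholders):

# server.properties (broker) -- values chosen to match the ~100 MB client configs
message.max.bytes=101626282
# must be at least message.max.bytes so followers can still replicate large batches
replica.fetch.max.bytes=101626282

On the consumer side, fetch_max_bytes alone may not be enough, since kafka-python also caps the per-partition fetch size; raising max_partition_fetch_bytes to a similar value is likely needed as well.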

0 Answers