
I am getting the following error:

org.apache.kafka.common.errors.RecordTooLargeException: The message is 196773 bytes when serialized which is larger than the total memory buffer you have configured with the buffer.memory configuration

But the buffer.memory in my producer config is 10485760 (10 MB), which is far larger than the 196773-byte message.

I'm not sure why this is happening. Thanks.
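Roughly, the producer is set up along these lines (the broker address, topic name, serializers, and payload below are placeholders for illustration; only the buffer.memory value reflects my actual config):

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class LargeMessageProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder broker
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.BUFFER_MEMORY_CONFIG, 10485760L); // 10 MB, as mentioned above

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Build a payload roughly the size reported in the exception (~196 KB).
            String largePayload = new String(new char[196773]).replace('\0', 'x');
            producer.send(new ProducerRecord<>("my-topic", largePayload)); // "my-topic" is a placeholder
            producer.flush();
        }
    }
}
```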

  • You need to configure a couple of properties when your message size is more than 1 MB (the default limit). Please check this: http://stackoverflow.com/questions/21020347/kafka-sending-a-15mb-message – Shankar Oct 13 '16 at 16:57

1 Answer


I understand that the buffer.memory in your producer config is larger than the message you are producing. However, there are a few configurations you need to set in order to produce messages larger than 1 MB:

message.max.bytes - (per broker) the largest message size the broker will accept from a producer.
max.message.bytes - (per topic) the largest message size the broker will allow to be appended to that topic; it defaults to the broker's message.max.bytes. (See the sketch below.)
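As a rough sketch, the per-topic limit could be raised programmatically like this (the topic name, broker address, and the 2 MB value are placeholders, and the AdminClient API used here requires a much newer client/broker than the 0.10-era versions from when this question was asked; the broker-wide message.max.bytes is normally set in the broker's server.properties instead):

```java
import java.util.Collection;
import java.util.Collections;
import java.util.Map;
import java.util.Properties;

import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.AlterConfigOp;
import org.apache.kafka.clients.admin.ConfigEntry;
import org.apache.kafka.common.config.ConfigResource;

public class RaiseTopicMessageSize {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder broker

        try (AdminClient admin = AdminClient.create(props)) {
            // Allow messages up to 2 MB on the topic (example value, not a recommendation).
            ConfigResource topic = new ConfigResource(ConfigResource.Type.TOPIC, "my-topic");
            AlterConfigOp raiseLimit = new AlterConfigOp(
                    new ConfigEntry("max.message.bytes", "2097152"),
                    AlterConfigOp.OpType.SET);

            Map<ConfigResource, Collection<AlterConfigOp>> update =
                    Collections.singletonMap(topic, Collections.singletonList(raiseLimit));

            admin.incrementalAlterConfigs(update).all().get();
        }
    }
}
```

The same topic override can also be applied with the kafka-configs command-line tool.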

Please go through the link below for more details: How can I send large messages with Kafka (over 15MB)?
