
If I send a large JSON payload to the Kafka server, it shows the error below. How can I increase message.max.bytes=15728640 and replica.fetch.max.bytes=15728640 in Kafka? I tried increasing the buffer sizes as below, but it didn't work:

# The send buffer (SO_SNDBUF) used by the socket server
socket.send.buffer.bytes=15728640

# The receive buffer (SO_RCVBUF) used by the socket server
socket.receive.buffer.bytes=15728640
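Note that socket.send.buffer.bytes and socket.receive.buffer.bytes only size the TCP socket buffers; they do not raise the maximum message size. The broker-side limits are message.max.bytes and replica.fetch.max.bytes in server.properties. A sketch of the relevant entries, assuming the 15728640-byte limit above:

```
# Largest record batch the broker will accept
message.max.bytes=15728640

# Replicas must be able to fetch batches of that size too
replica.fetch.max.bytes=15728640
```

There is also a per-topic override, max.message.bytes, if only one topic needs the larger limit.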

Error:

[2022-01-06 12:36:51,281] [9015] [ERROR] [^-App]: Crashed reason=ProducerSendError("Error while sending: MessageSizeTooLargeError('The message is 6677420 bytes when serialized which is larger than the maximum request size you have configured with the max_request_size configuration',)",) 
Traceback (most recent call last):
  File "/home/twilightuser/faust_library/venv/lib/python3.6/site-packages/faust/transport/drivers/aiokafka.py", line 1059, in send
    transactional_id=transactional_id,
  File "/home/twilightuser/faust_library/venv/lib/python3.6/site-packages/aiokafka/producer/producer.py", line 310, in send
    key_bytes, value_bytes = self._serialize(topic, key, value)
  File "/home/twilightuser/faust_library/venv/lib/python3.6/site-packages/aiokafka/producer/producer.py", line 231, in _serialize
    " max_request_size configuration" % message_size)
kafka.errors.MessageSizeTooLargeError: [Error 10] MessageSizeTooLargeError: The message is 6677420 bytes when serialized which is larger than the maximum request size you have configured with the max_request_size configuration

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/twilightuser/faust_library/venv/lib/python3.6/site-packages/mode/services.py", line 779, in _execute_task
    await task
  File "/home/twilightuser/faust_library/venv/lib/python3.6/site-packages/faust/app/base.py", line 941, in _wrapped
    return await task()
  File "/home/twilightuser/faust_library/venv/lib/python3.6/site-packages/faust/app/base.py", line 991, in around_timer
    await fun(*args)
  File "/home/twilightuser/faust_library/producer.py", line 14, in my_send
    await topic.send(value=value)
  File "/home/twilightuser/faust_library/venv/lib/python3.6/site-packages/faust/topics.py", line 193, in send
    callback=callback,
  File "/home/twilightuser/faust_library/venv/lib/python3.6/site-packages/faust/channels.py", line 303, in _send_now
    schema, key_serializer, value_serializer, callback))
  File "/home/twilightuser/faust_library/venv/lib/python3.6/site-packages/faust/topics.py", line 417, in publish_message
    headers=headers,
  File "/home/twilightuser/faust_library/venv/lib/python3.6/site-packages/faust/transport/drivers/aiokafka.py", line 1062, in send
    raise ProducerSendError(f'Error while sending: {exc!r}') from exc
faust.exceptions.ProducerSendError: Error while sending: MessageSizeTooLargeError('The message is 6677420 bytes when serialized which is larger than the maximum request size you have configured with the max_request_size configuration',)
  • I think you can find answer from https://stackoverflow.com/a/21343878/17823883. – Ice Griffin Jan 06 '22 at 07:41
  • Kafka's typically not designed to work with huge messages. It can, but it's not common. What kind of payload are you using Kafka for? Is it one logical record or should it be split up into separate messages? – Robin Moffatt Jan 06 '22 at 10:01
  • I need to send bulk JSON in a single file above 1 MB. @RobinMoffatt – Ravi Devarasu Jan 06 '22 at 10:57
  • You need to change client configs too, not just Kafka. But why can't you upload this file to some blob storage like S3 (or MinIO, if self hosting), or NFS, then send that URI via Kafka as just a file _address_. Then it's up to consumers to download this remote file rather than deserialize the whole thing from Kafka? If you really need to configure your client, Faust uses aiokafka module, so look at its documentation on setting `max_request_size` – OneCricketeer Jan 06 '22 at 14:07

0 Answers