Kafka Version 1.1.0
I have a single-node Kafka broker with the following configs in config/server.properties:
Broker Configs:
message.max.bytes=100000000
max.message.bytes=100000000
replica.fetch.max.bytes=150000000
log.segment.bytes=1073741824 (Default)
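In case it matters, topic-level overrides (if any) can be listed with the kafka-configs tool; the ZooKeeper address and topic name below are just placeholders for my actual setup:
> bin/kafka-configs.sh --zookeeper localhost:2181 --describe --entity-type topics --entity-name test-topic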
The console consumer properties file has the following configs:
Consumer Properties:
receive.buffer.bytes=100000000
max.partition.fetch.bytes=100000000
fetch.max.bytes=52428800
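This file is passed to the console consumer via --consumer.config; the command looks roughly like this (broker address and topic name are placeholders):
> bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic test-topic --from-beginning --consumer.config consumer.properties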
I am producing a message whose size is about 20 KB. I produce it to a topic using the console producer, then start a console consumer on that topic, and it does not consume the complete message (it gets cut off partway through).
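The produce step looks roughly like this (the console producer reads standard input, so I pipe the message in; broker address, topic name and file name are placeholders):
> bin/kafka-console-producer.sh --broker-list localhost:9092 --topic test-topic < message.txt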
I have looked into this post and tried the same settings, but it does not seem to help.
What am I missing here? Kindly help me out.
UPDATE:
> echo | xargs --show-limits
Your environment variables take up 3891 bytes
POSIX upper limit on argument length (this system): 2091213
POSIX smallest allowable upper limit on argument length (all systems): 4096
Maximum length of command we could actually use: 2087322
Size of command buffer we are actually using: 131072
Maximum parallelism (--max-procs must be no greater): 2147483647
UPDATE 1:
I have tested another scenario. This time I produce the same message using the Java producer instead of the console producer, and now when I consume it I get the complete message.
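The Java producer side is roughly the minimal sketch below (bootstrap address, topic name and payload file are placeholders, not my exact code; max.request.size is only raised to stay on the safe side for bigger payloads):

import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class LargeMessageProducer {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");            // placeholder broker address
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());
        props.put("max.request.size", "100000000");                  // producer-side size limit

        // Read the ~20 KB payload from a file (placeholder path) and send it as a single record.
        String payload = new String(Files.readAllBytes(Paths.get("message.txt")), StandardCharsets.UTF_8);
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("test-topic", payload)).get(); // block until the record is acknowledged
        }
    }
}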