We are using a Kafka Streams state store in our project, and we need to store more than 1 MB of data, but we got the exception below:
The message is 1760923 bytes when serialized which is larger than the maximum request size you have configured with the max.request.size configuration.
I then followed the link "Add prefix to StreamsConfig to enable setting default internal topic configs" and added the following config:
topic.max.request.size=50000000
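For reference, this is roughly how the config is applied in our code (a minimal sketch using plain java.util.Properties; the application id and bootstrap servers are placeholders, not our real values). Per the KIP referenced above, the "topic." prefix makes Kafka Streams forward the remainder of the key as a topic-level config when it creates internal topics:

```java
import java.util.Properties;

public class StreamsConfigSketch {

    public static Properties buildConfig() {
        Properties props = new Properties();
        // Standard Kafka Streams settings (placeholder values).
        props.put("application.id", "data-msg-seq-app");
        props.put("bootstrap.servers", "localhost:9092");
        // The "topic." prefix forwards the remainder of the key
        // ("max.request.size") as a topic-level config whenever Kafka
        // Streams creates its internal topics (changelog/repartition).
        props.put("topic.max.request.size", "50000000");
        return props;
    }

    public static void main(String[] args) {
        System.out.println(buildConfig().getProperty("topic.max.request.size"));
    }
}
```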
After that, the application worked fine, and it kept working as long as the state store's internal topic already existed. However, after Kafka was restarted and the state store topic was lost/deleted, the Kafka Streams processor had to recreate the internal state store topic automatically when the application started. At that moment it threw an exception saying:
org.apache.kafka.streams.errors.StreamsException: Could not create topic data-msg-seq-state-store-changelog.
    at org.apache.kafka.streams.processor.internals.InternalTopicManager.makeReady(InternalTopicManager.java:148)
    ...
    at org.apache.kafka.streams.processor.internals.StreamThread.runLoop(StreamThread.java:805)
    at org.apache.kafka.streams.processor.internals.StreamThread.run(StreamThread.java:774)
Caused by: org.apache.kafka.common.errors.InvalidConfigurationException: Unknown topic config name: max.request.size
A workaround is to create the internal topic manually, but that does not seem like a good solution.
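For completeness, the manual workaround looks roughly like this (a sketch, not a recommendation; the partition and replication values are placeholders that must match your application, and I am assuming `max.message.bytes` as the topic-level config key for maximum record size, since that is the topic counterpart of the producer's `max.request.size`):

```shell
# Create the changelog topic by hand before starting the application.
# Partition and replication counts below are placeholders; changelog
# topics must have as many partitions as the corresponding store.
# max.message.bytes is the topic-level limit on record size, and
# changelog topics created by Streams normally use log compaction.
kafka-topics.sh --bootstrap-server localhost:9092 \
  --create \
  --topic data-msg-seq-state-store-changelog \
  --partitions 1 \
  --replication-factor 1 \
  --config max.message.bytes=50000000 \
  --config cleanup.policy=compact
```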
Can you help me with this issue? Is there any config I have missed?
Thanks very much.
17 June 2020 update: the issue is still not resolved. Can anyone help?