The following Kafka publishing code is throwing a RecordTooLargeException. I have tried the solutions suggested on Stack Overflow involving properties such as max.request.size, but nothing has worked. The exact stack trace is:
Caused by: org.springframework.kafka.KafkaException: Send failed; nested exception is org.apache.kafka.common.errors.RecordTooLargeException: The message is 1696090 bytes when serialized which is larger than 1048576, which is the value of the max.request.size configuration
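The trace says the serialized record is 1,696,090 bytes, which exceeds the producer's default max.request.size of 1,048,576 bytes (1 MiB). If larger records are genuinely needed, one option is raising that producer property. A sketch assuming Spring Boot's `spring.kafka.*` configuration support (the 2 MiB value is an arbitrary example, not a recommendation):

```
# application.properties
# Raise the producer-side per-request limit (example value: 2 MiB)
spring.kafka.producer.properties.max.request.size=2097152
```

Note that the producer limit is only one of several: the broker's `message.max.bytes` (or the topic-level `max.message.bytes`) and the consumer's `max.partition.fetch.bytes` / `fetch.max.bytes` must also accommodate the larger record, or the send or fetch will fail elsewhere.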
@SuppressWarnings("unchecked")
@Override
public void run(String... args) throws Exception {
    // Build one large JSON array of 8000 objects
    JSONArray array = new JSONArray();
    for (int i = 0; i < 8000; i++) {
        JSONObject object = new JSONObject();
        object.put("no", 1);
        object.put("name", "Kella Vivek");
        object.put("salary", 1000);
        object.put("address", "2-143");
        object.put("city", "gpm");
        object.put("pin", 534316);
        object.put("dist", "west");
        object.put("state", "ap");
        object.put("username", "mff");
        object.put("password", "mff");
        array.add(object);
    }
    // Serialize the whole array and publish it as a single record
    ObjectMapper mapper = new ObjectMapper();
    String string = mapper.writerWithDefaultPrettyPrinter().writeValueAsString(array);
    // template is a KafkaTemplate; "consume" is the topic name
    template.send("consume", string);
}
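The failure can be reproduced without Kafka by measuring the payload against the 1,048,576-byte default. The class and method names below are hypothetical, and the per-object JSON string mirrors the fields built in the loop above (compact, not pretty-printed, so the question's actual payload is even larger):

```java
import java.nio.charset.StandardCharsets;

public class PayloadSizeCheck {
    // Default max.request.size, as quoted in the error message (1 MiB)
    static final int MAX_REQUEST_SIZE = 1_048_576;

    // Returns true if the payload would fit in a single record under the default limit
    static boolean fitsInOneRecord(String payload) {
        return payload.getBytes(StandardCharsets.UTF_8).length <= MAX_REQUEST_SIZE;
    }

    public static void main(String[] args) {
        // One object from the loop, serialized compactly, is roughly 150 bytes
        String object = "{\"no\":1,\"name\":\"Kella Vivek\",\"salary\":1000,"
                + "\"address\":\"2-143\",\"city\":\"gpm\",\"pin\":534316,"
                + "\"dist\":\"west\",\"state\":\"ap\",\"username\":\"mff\",\"password\":\"mff\"}";

        // Concatenate 8000 of them into one array, as the run() method does
        StringBuilder all = new StringBuilder("[");
        for (int i = 0; i < 8000; i++) {
            all.append(object).append(',');
        }
        all.setCharAt(all.length() - 1, ']');

        System.out.println(fitsInOneRecord(object));         // a single object fits easily
        System.out.println(fitsInOneRecord(all.toString())); // 8000 of them exceed the limit
    }
}
```

This suggests an alternative to raising the limit: send each object as its own record inside the loop (e.g. `template.send("consume", mapper.writeValueAsString(object))`), so every record stays far below the default, at the cost of 8000 smaller sends instead of one large one.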