
I am sending data through Producer - Kafka - Logstash - Elasticsearch. When more than 20,000 messages accumulate in Kafka (because the consumer cannot keep up), all of the pending messages suddenly show up as consumed. In other words, once the LAG exceeds 20,000, CURRENT-OFFSET jumps to equal LOG-END-OFFSET.

I do not know why this happens; I have not seen it before. Does anyone know what is wrong?
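For context, the CURRENT-OFFSET, LOG-END-OFFSET and LAG values described above are the kind of figures reported by Kafka's consumer-group tool; a command along these lines (the group name and broker address below are placeholders) shows them per partition:

```
# Describe the consumer group to see CURRENT-OFFSET, LOG-END-OFFSET and LAG
# for each partition. 'logstash-group' and 'localhost:9092' are placeholders.
kafka-consumer-groups.sh --bootstrap-server localhost:9092 \
  --describe --group logstash-group
```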

김태우

1 Answer


You may want to check the message retention configuration of your Kafka cluster. Please refer to this answer or this documentation, and search for 'log.retention' in the Kafka documentation. If retention removes messages before the lagging consumer reads them, the consumer's committed offset falls out of range and it can skip straight to the end of the log, which would make CURRENT-OFFSET equal LOG-END-OFFSET.
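As a sketch of what to look at (the topic name and broker address are placeholders, and the values shown are only the broker defaults to compare against):

```
# Broker-level retention defaults in server.properties:
#   log.retention.hours=168    # keep data for 7 days by default
#   log.retention.bytes=-1     # no size limit by default

# Check whether the topic overrides retention (retention.ms / retention.bytes).
# 'my-topic' and 'localhost:9092' are placeholders; --bootstrap-server requires
# a reasonably recent Kafka version.
kafka-configs.sh --bootstrap-server localhost:9092 \
  --entity-type topics --entity-name my-topic --describe
```

If the topic has a small retention.bytes or retention.ms override, old segments can be deleted while the consumer is still behind, producing exactly the offset jump described in the question.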

Sanju Thomas