I am following this GitHub repo:
https://github.com/hannesstockner/kafka-connect-elasticsearch/
and I am trying to read data from a file source into Elasticsearch. I am getting this error when I run the standalone.sh script:
Failed to flush WorkerSourceTask{id=local-file-source-0}, timed out while waiting for producer to flush outstanding messages, 1 left ({ProducerRecord(topic=recipes, partition=null, key=null, value=[B@6704e57f=ProducerRecord(topic=recipes, partition=null, key=null, value=[B@6704e57f})
These are my configs:
connect-elasticsearch-sink.properties
name=local-elasticsearch-sink
connector.class=com.hannesstockner.connect.es.ElasticsearchSinkConnector
tasks.max=1
es.host=10.200.10.1
topics=recipes
index.prefix=kafka_
connect-file-source.properties
name=local-file-source
connector.class=FileStreamSource
tasks.max=1
file=test.txt
topic=recipes
connect-standalone.properties
bootstrap.servers=10.200.10.1:9092
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schemas.enable=false
value.converter.schemas.enable=false
internal.key.converter=org.apache.kafka.connect.json.JsonConverter
internal.value.converter=org.apache.kafka.connect.json.JsonConverter
internal.key.converter.schemas.enable=false
internal.value.converter.schemas.enable=false
offset.storage.file.filename=/tmp/connect.offsets
# Flush much faster than normal, which is useful for testing/debugging
#offset.flush.interval.ms=20000
offset.flush.timeout.ms=20000
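As an aside, with schemas.enable=false the JsonConverter reads and writes bare JSON payloads instead of the schema envelope it uses otherwise. A sketch of the difference (plain Python, purely illustrative, values made up):

```python
import json

# With value.converter.schemas.enable=true, the JsonConverter wraps each
# record value in an envelope that carries its schema:
with_schema = {
    "schema": {"type": "string", "optional": False},
    "payload": "tomato soup",
}

# With schemas.enable=false (as in connect-standalone.properties above),
# only the bare payload is serialized:
without_schema = "tomato soup"

print(json.dumps(with_schema))
print(json.dumps(without_schema))
```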
and my docker-compose config:
kafka:
  image: flozano/kafka:0.9.0.0
  ports:
    - "2181:2181"
    - "9092:9092"
  environment:
    ADVERTISED_HOST: ${DOCKER_IP}
elasticsearch:
  image: elasticsearch:2.1
  ports:
    - "9200:9200"
    - "9300:9300"
I tried setting offset.flush.timeout.ms=20000 and producer.buffer.memory=10 in my standalone.properties file, following this thread, but no luck:
Kafka Connect - Failed to flush, timed out while waiting for producer to flush outstanding messages
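For reference, the overrides from that attempt looked like this in connect-standalone.properties (as far as I understand, producer.-prefixed keys are passed through to the worker's underlying producer):

```properties
offset.flush.timeout.ms=20000
producer.buffer.memory=10
```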