I have a Logstash process running that consumes data from a Kafka topic. The messages in the topic are already in JSON format, and Logstash simply pushes them into Elasticsearch. But while doing so, Logstash changes the ordering of the fields. Another team consumes a CSV export of this data, and the changed field ordering causes them trouble. What could be the reason?
For example, the input JSON {"foo1":"bar1","foo2":"bar2"} is pushed by Logstash into Elasticsearch, where it then appears as {"foo2":"bar2","foo1":"bar1"}.
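As I understand it, the JSON spec treats object members as unordered, so any parse/serialize step in the pipeline is free to reorder keys. A minimal sketch (Python here just for illustration, not part of my pipeline) of one workaround I am considering for the downstream side: re-serialize each document with a canonical key order before generating the CSV, so the order no longer depends on what Elasticsearch returns:

```python
import json

# Simulate a document coming back from Elasticsearch with reordered fields.
raw = '{"foo2":"bar2","foo1":"bar1"}'

# Re-serialize with a deterministic (sorted) key order so downstream
# CSV generation sees a stable field sequence regardless of storage order.
doc = json.loads(raw)
canonical = json.dumps(doc, sort_keys=True)
print(canonical)  # {"foo1": "bar1", "foo2": "bar2"}
```

This only normalizes the order on the consumer side; it does not stop Logstash/Elasticsearch from reordering internally.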
Logstash config:

input {
  kafka {
    codec => "json"
    bootstrap_servers => "localhost:9092"
    topics => ["sample-logs"]
    auto_offset_reset => "earliest"
    group_id => "logstash-consumer"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    codec => "json"
    index => "sample-logs-es"
  }
  stdout {
    codec => rubydebug
  }
}