
Could you please help with this script? I'm getting the following error while running it:

input {
  # We read from the "old" cluster
  elasticsearch {
    hosts => [ "localhost" ]
    port => "9200"
    index => "products"
    size => 500
    scroll => "5m"
    docinfo => true
  }
}

output {
  # We write to the "new" cluster
  elasticsearch {
    host => "localhost"
    port => "9200"
    protocol => "http"
    index => "%{[@metadata][_index1]}"
    index_type => "%{[@metadata][_type1]}"
    document_id => "%{[@metadata][_id]}"
  }
  # We print dots to see it in action
  stdout {
    codec => "dots"
  }
}

This is my logstash.conf file. When I run it I get the following error:

Unknown setting 'port' for elasticsearch {:level=>:error}
fetched an invalid config {:config=>"input {\n  # We read from the \"old\" cluster\n  elasticsearch {\n    hosts => [ \"localhost\" ]\n    port => \"9200\"\n ... }
prasad kp

1 Answer


I've modified the configuration to use the option names from the 2.x+ versions of the Elasticsearch plugins: `port` and `protocol` are no longer valid (the port now goes into `hosts`), and `index_type` has been renamed `document_type`. Note that with the input's default `docinfo_fields`, the metadata keys would be `[@metadata][_index]` and `[@metadata][_type]`; I've kept your `_index1`/`_type1` names, which assume a custom `docinfo_fields` setting.

input {
  # We read from the "old" cluster
  elasticsearch {
    hosts => [ "localhost:9200" ]
    index => "products"
    size => 500
    scroll => "5m"
    docinfo => true
  }
}

output {
  # We write to the "new" cluster
  elasticsearch {
    hosts => [ "localhost:9200" ]
    index => "%{[@metadata][_index1]}"
    document_type => "%{[@metadata][_type1]}"
    document_id => "%{[@metadata][_id]}"
  }
  # We print dots to see it in action
  stdout {
    codec => "dots"
  }
}
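To apply this, the file can be syntax-checked and then run like so (a sketch assuming a Logstash 2.x install; adjust the path to your `bin/logstash` and config file):

```shell
# Check the configuration for syntax/unknown-setting errors first;
# this is what would have caught the "Unknown setting 'port'" error.
bin/logstash --configtest -f logstash.conf

# Then run the reindexing pipeline; the dots codec prints one dot per document.
bin/logstash -f logstash.conf
```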
baudsp
  • Thanks a lot baudsp, but sorry to trouble you again, I'm getting unwanted fields like – prasad kp Dec 19 '16 at 13:15
  • @PrasadKhandagale Glad I could help. To go from MySQL to ES, you can use the [jdbc input plugin](https://www.elastic.co/guide/en/logstash/current/plugins-inputs-jdbc.html). You can find a few questions on SO about this (like http://stackoverflow.com/questions/37650623/elasticsearch-logstash-duplicating-output-when-use-schedule, http://stackoverflow.com/questions/31611171/logstash-jdbc-input-plugin-for-mysql, http://stackoverflow.com/questions/39073928/logstash-jdbc-mysql-config-error). If it does not work, ask a new question; I don't have any experience with this input plugin. – baudsp Dec 19 '16 at 13:29
  • Thanks again baudsp, it worked but the mapping is not copied correctly. The original mapping is `"properties": { "text_field": { "type": "string", "index_analyzer": "edge_ngram_analyzer", "search_analyzer": "standard" } }`, but the copied one does not contain the index & search analyzers: `est_ktype": { "properties": { "text_field": { "type": "string" } }` – prasad kp Dec 19 '16 at 13:32
  • You'll have to delete your index, create an index with the mapping and reindex. The mappings are not part of the data, so they are not carried over to the new index. – baudsp Dec 19 '16 at 13:34
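The step described in that last comment can be sketched with curl (assumptions: cluster on localhost:9200, a hypothetical target index name `products_new`, and the type/field names from the comment above; the `edge_ngram_analyzer` must also be defined in the index settings, which you'd need to add). Note that `index_analyzer`/`search_analyzer` are the ES 1.x names; on ES 2.x+ use `analyzer` instead of `index_analyzer`:

```shell
# Create the target index with the analyzer mapping BEFORE running the
# Logstash reindex, so copied documents are indexed with the right analyzers.
curl -XPUT "localhost:9200/products_new" -d '{
  "mappings": {
    "est_ktype": {
      "properties": {
        "text_field": {
          "type": "string",
          "index_analyzer": "edge_ngram_analyzer",
          "search_analyzer": "standard"
        }
      }
    }
  }
}'
```

Then point the Logstash output's `index` at the new index and rerun the pipeline.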