I have an index with the following settings and mappings:
PUT /amazon_products
{
  "settings": {
    "number_of_shards": 1,
    "number_of_replicas": 0,
    "analysis": {
      "analyzer": {}
    }
  },
  "mappings": {
    "properties": {
      "id": {
        "type": "keyword"
      },
      "title": {
        "type": "text"
      },
      "description": {
        "type": "text"
      },
      "manufacturer": {
        "type": "text",
        "fields": {
          "raw": {
            "type": "keyword"
          }
        }
      },
      "price": {
        "type": "scaled_float",
        "scaling_factor": 100
      }
    }
  }
}
These fields also exist in my .csv file, and I want to send the data from the CSV file to Elasticsearch using Logstash.
This is my Logstash config file:
input {
  file {
    path => "E:\ElasticStack\Logstash\products.csv"
    start_position => "beginning"
    sincedb_path => "NULL"
  }
}
filter {
  csv {
    separator => ","
    columns => ["id","title","description","manufacturer","price"]
  }
}
output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "amazon_products"
  }
  stdout {}
}
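One thing I am not sure about is the Windows-specific part of the file input: from what I have read, the path option wants forward slashes even on Windows, and the Windows null device is NUL rather than NULL (so sincedb_path => "NULL" would just point at an ordinary relative file named NULL). A variant I could try, assuming that reading is right:

```
input {
  file {
    # forward slashes - the file input treats path as a glob and
    # reportedly does not handle backslashes well on Windows
    path => "E:/ElasticStack/Logstash/products.csv"
    start_position => "beginning"
    # "NUL" is the Windows null device; "NULL" is just a file name
    sincedb_path => "NUL"
  }
}
```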
When I run the command .\logstash -f ..\config\logstash.conf, the only message from Logstash is:
Successfully started Logstash API endpoint {:port=>9600}
and it doesn't send any data to Elasticsearch.
Please help me. Thank you :)