
I am currently working on an anomaly detection project based on Elasticsearch and Kibana. Recently I converted a CSV file to JSON and tried to import this data into Elasticsearch via Postman using the Bulk API. Unfortunately, every one of those requests failed.
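
From re-reading the Bulk API docs, I believe the problem with my Postman attempts was the body format: _bulk expects newline-delimited JSON, i.e. an action metadata line before each document and a newline at the end, not a plain JSON array. If I read the docs right, a bulk body for my data, POSTed to http://localhost:9200/yahoodata/a4benchmark/_bulk, would look like this:

{ "index" : {} }
{ "timestamps": 11, "value": 1, "anomaly": 1 }
{ "index" : {} }
{ "timestamps": 1112, "value": 211, "anomaly": 0 }

(The index and type come from the URL in that case, so the action lines can stay empty.)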

Then I found this topic: Import/Index a JSON file into Elasticsearch

and tried the following approach:

curl -XPOST 'http://localhost:9200/yahoodata/a4benchmark/4' --data-binary @Anomaly1.json

The response I got:

{"error":{"root_cause":[{"type":"mapper_parsing_exception","reason":"failed to parse"}],"type":"mapper_parsing_exception","reason":"failed to parse","caused_by":{"type":"not_x_content_exception","reason":"Compressor detection can only be called on some xcontent bytes or compressed xcontent bytes"}},"status":400}

The data I am trying to insert has the following structure (Anomaly1.json):

[
  {
    "timestamps": 11,
    "value": 1,
    "anomaly": 1
  },
  {
    "timestamps": 1112,
    "value": 211,
    "anomaly": 0
  },
  {
    "timestamps": 2,
    "value": 1,
    "anomaly": 0
  }
]
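
In case each anomaly should become its own document instead (the alternative Val asks about in the comments), my understanding is that the array first has to be converted to the bulk format shown above. A sketch with jq (the .bulk file name is my own choice; I believe newer Elasticsearch versions want Content-Type: application/x-ndjson on _bulk, while older ones also accept application/json):

# for each array element, emit an action line followed by the element, one JSON value per line
jq -c '.[] | ({"index": {}}, .)' Anomaly1.json > Anomaly1.bulk
curl -XPOST 'http://localhost:9200/yahoodata/a4benchmark/_bulk' \
  -H 'Content-Type: application/x-ndjson' \
  --data-binary @Anomaly1.bulk
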
  • Looks like it's complaining that you don't specify the content type (you don't tell it that you're sending JSON); add `-H "Content-Type: application/json"` to the command line and try again? – hanshenrik Jan 15 '17 at 23:52
  • Do you want to store each anomaly in the array as a standalone document, or is it one big document with all anomalies in a single array? – Val Jan 16 '17 at 04:59
  • @Val It is a single document with all anomalies in a single array. In the example that I gave in my first post there are only three readings, but in the actual document there are around 1000 of them. – Muzz Jan 16 '17 at 08:16
  • @hanshenrik Unfortunately it didn't help; I got the same error. Should I put -H (...) just before I specify the file (meaning before --data-binary)? – Muzz Jan 16 '17 at 08:17
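
For the record, curl accepts its options in any order, so -H can go before or after --data-binary. The full command with the header added is sketched below; judging by the error above, it presumably still fails as long as the body is a top-level JSON array rather than an object:

curl -XPOST 'http://localhost:9200/yahoodata/a4benchmark/4' \
  -H 'Content-Type: application/json' \
  --data-binary @Anomaly1.json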

0 Answers