
I have a lot of JSON documents with this structure:

"positions": [
    {
      "millis": 12959023,
      "lat": 49.01525113731623,
      "lon": 2.4971945118159056,
      "rawX": -3754,
      "rawY": 605,
      "rawVx": 0,
      "rawVy": 0,
      "speed": 9.801029291617944,
      "accel": 0.09442740907572084,
      "grounded": true
    },
    {
      "millis": 12959914,
      "lat": 49.01536940596998,
      "lon": 2.4967825412750244,
      "rawX": -3784,
      "rawY": 619,
      "rawVx": -15,
      "rawVy": 7,
      "speed": 10.841861737855924,
      "accel": -0.09534648619563282,
      "grounded": true
    }
    ...
]

I'm trying to map this JSON document in Elasticsearch by introducing a geo_point field, so that the document looks like the one below:

"positions": [
        {
          "millis": 12959023,
          "location" : {
              "lat": 49.01525113731623,
              "lon": 2.4971945118159056,
           }          
          "rawX": -3754,
          "rawY": 605,
          "rawVx": 0,
          "rawVy": 0,
          "speed": 9.801029291617944,
          "accel": 0.09442740907572084,
          "grounded": true
        },
    ...
    }

PS: these documents are provided by an API.

Thanks

Taybou
  • How are you loading your documents into ES? Probably, Logstash would help in creating that `geo_point` field, or if you're using ES5 you can use a pipeline/processor to achieve the same thing. – Val Dec 08 '16 at 05:42

2 Answers


You could do something like this:

curl -XPUT 'http://localhost:9200/<indexname>/positions/_mapping' -d @yourjsonfile.json
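
Here, @yourjsonfile.json would contain the mapping definition. A minimal sketch of what it could look like, with field types assumed from the sample values in the question (the important part is declaring `location` as `geo_point`):

{
  "properties": {
    "positions": {
      "properties": {
        "millis":   { "type": "long" },
        "location": { "type": "geo_point" },
        "rawX":     { "type": "integer" },
        "rawY":     { "type": "integer" },
        "rawVx":    { "type": "integer" },
        "rawVy":    { "type": "integer" },
        "speed":    { "type": "double" },
        "accel":    { "type": "double" },
        "grounded": { "type": "boolean" }
      }
    }
  }
}

Note that the mapping only tells Elasticsearch how to index `location`; the documents you send still have to contain a `location` object, which is what the pre-processing described in the other answer takes care of.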

Hope this helps!

Kulasangar
  • If you get an error: `"error":"Content-Type header [application/x-www-form-urlencoded] is not supported"` then adding `-H "Content-Type: application/json"` might help – TomasZ. Jul 06 '20 at 10:26
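
With that header added, the command above becomes:

curl -XPUT 'http://localhost:9200/<indexname>/positions/_mapping' -H "Content-Type: application/json" -d @yourjsonfile.json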

If you can't modify the source, you need to pre-process your documents before they get indexed into Elasticsearch.

  • If you are using Elasticsearch < 5.0, you can use Logstash and the mutate filter.
  • If you are using Elasticsearch >= 5.0 (my advice), use an ingest pipeline and the rename processor; a sketch follows below.
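
As a sketch of that second option, here is a minimal ingest pipeline (the pipeline name is just an example). Because positions is an array, the rename processor is wrapped in a foreach processor so that every element gets its lat/lon moved under a location object:

PUT _ingest/pipeline/positions-location
{
  "description": "move lat/lon into a location object for each position",
  "processors": [
    {
      "foreach": {
        "field": "positions",
        "processor": {
          "rename": {
            "field": "_ingest._value.lat",
            "target_field": "_ingest._value.location.lat"
          }
        }
      }
    },
    {
      "foreach": {
        "field": "positions",
        "processor": {
          "rename": {
            "field": "_ingest._value.lon",
            "target_field": "_ingest._value.location.lon"
          }
        }
      }
    }
  ]
}

You would then index your documents with ?pipeline=positions-location and make sure the index mapping declares positions.location as geo_point.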
dadoonet