
Somehow it seems to be possible either to define a `geo_point` field type in the Elasticsearch mapping or to import the data, but not both. In the JSON data the location fields look like this:

"location": { 
  "lat": 41.12,
  "lng": -71.34
}

Because we need "lon" instead of "lng", we use this "mutate" filter in the Logstash configuration to rename the field:

mutate {
    rename => {
        "[location][lng]" => "[location][lon]"
    }
}
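
For completeness, a minimal sketch of the whole filter block. The convert step is an assumption and would only be needed if the coordinates arrive as strings rather than JSON numbers; within a single mutate, rename is applied before convert, so "[location][lon]" already exists at that point:

filter {
    mutate {
        # Elasticsearch expects "lon" rather than "lng" for geo_point fields
        rename => {
            "[location][lng]" => "[location][lon]"
        }
        # Only needed if lat/lon arrive as strings
        convert => {
            "[location][lat]" => "float"
            "[location][lon]" => "float"
        }
    }
}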

If we do not provide a mapping, then Elasticsearch automatically uses the following mapping for the location field and imports the data:

"location": {
  "properties": {
    "lat": {
      "type": "float"
    },
    "lon": {
      "type": "float"
    }
  }
}
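
To check which mapping is actually in effect (see also the comments below), it can be inspected in the Kibana Dev Tools; `geo-test` is the index name used in the example further down:

GET geo-test/_mapping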

So far, so good. But if I now use `geo_point` in the Elasticsearch mapping when I create the index, I cannot import any data anymore, because Logstash reports the error "can't merge a non object mapping [location] with an object mapping". This error can occur when trying to change an existing mapping, but here the mapping was already used to create the index:

"mappings":{
    "properties":{
        "location": {
            "type": "geo_point",
            "ignore_malformed": "true",
        },
    }
}
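
For reference, a minimal sketch of the index creation as it could be issued in the Kibana Dev Tools, assuming the index is called `geo-test` as in the example below:

PUT geo-test
{
  "mappings": {
    "properties": {
      "location": {
        "type": "geo_point",
        "ignore_malformed": true
      }
    }
  }
}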

Apparently, Logstash and Elasticsearch treat a field that is mapped as `geo_point` as something that is not an object, while the JSON data for this location is an object.

While it is not possible to import the data with Logstash using this mapping, I can nevertheless save a document in the Kibana Dev Tools like this:

PUT geo-test/_doc/1
{
  "title": "Geo-point test",
  "location": { 
    "lat": 41.12,
    "lon": -71.34
  }
}

How is it possible to import the data with Logstash using a `geo_point` mapping? (I am using Elasticsearch 7.9.1 and Logstash 7.12.0, including the S3 input plugin and the Elasticsearch output plugin.)

0x4a6f4672
  • Can you share the effective mapping of your `geo-test` index using `GET geo-test`? – Val Jun 23 '21 at 05:52
  • Is it not possible to create an Elasticsearch index which already has the right mapping from the start? The mapping I used is listed in the code block which starts with "mappings". – 0x4a6f4672 Jun 23 '21 at 05:55
  • What I'm interested in is the mapping that is **currently** in your index, not the one you think you've used while creating the index. Experience has shown that sometimes the mapping is not the one we believe it is. Call `GET geo-test` and please share what you get – Val Jun 23 '21 at 05:58
  • If I call `/geo-test/_mapping` I get this mapping. – 0x4a6f4672 Jun 23 '21 at 06:11
  • Please update your question with the mapping you get – Val Jun 23 '21 at 06:18
  • I forgot to mention a detail which seems to be important. Before the data is imported, it is processed by a "mutate" operation in Logstash. Some "mutate" operations seem to change field types, and this is the point where it seems to fail. The problem could be that Logstash itself tries to change the mapping which has been created before. I have updated the question. – 0x4a6f4672 Jun 23 '21 at 09:21
  • Let us [continue this discussion in chat](https://chat.stackoverflow.com/rooms/234104/discussion-between-0x4a6f4672-and-val). – 0x4a6f4672 Jun 23 '21 at 09:26
  • At this point, I suggest that you show us 1) your Logstash configuration and 2) your index mapping (as already requested). Without that, it's impossible to draw any conclusions – Val Jun 23 '21 at 09:38
  • Please have a look in the chat for a detailed description of the configuration. Thanks. – 0x4a6f4672 Jun 23 '21 at 10:10
  • Thanks, can you also show what you get out of the `stdout` plugin? please use `codec => json` instead of `rubydebug` – Val Jun 23 '21 at 11:50

1 Answer


The reason you get this error is that you've specified a different document_type in your elasticsearch output. Simply remove the marked line below and it will work the way you expect:

elasticsearch {
    hosts => [ "${ES_DOMAIN_NAME}:443" ]
    healthcheck_path => "/"
    ilm_enabled => false
    ssl => true
    index => "${ES_INDEX}"
    document_type => "json"         <--- remove this line
}

The mapping you've installed applies to the default document type, which is `_doc`, not `json`.
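
For clarity, the corrected output block would look like this (same settings as in the question, only the document_type line removed):

elasticsearch {
    hosts => [ "${ES_DOMAIN_NAME}:443" ]
    healthcheck_path => "/"
    ilm_enabled => false
    ssl => true
    index => "${ES_INDEX}"
}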

Val