5

I am trying to save a set of events from a MySQL database into Elasticsearch using the JDBC input plugin with Logstash. The event records in the database contain date fields with microsecond precision; in practice, some records differ from each other only at the microsecond level.

While importing the data, Elasticsearch truncates the microsecond timestamps to millisecond precision. How can I store the data with microsecond precision? The Elasticsearch documentation says it follows the Joda-Time API for date formats, which does not support microseconds; the value is truncated and a Z is appended to the end of the timestamp.

Sample timestamp after truncation: 2018-05-02T08:13:29.268Z

Original timestamp in database: 2018-05-02T08:13:29.268482
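
For illustration, a minimal Logstash pipeline for this kind of import could look like the sketch below (the driver path, credentials, table, column, and index names are placeholders, not the actual configuration):

input {
  jdbc {
    # placeholder driver path and credentials -- adjust for your setup
    jdbc_driver_library => "/path/to/mysql-connector-java.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"
    jdbc_user => "user"
    jdbc_password => "password"
    # hypothetical events table with a DATETIME(6) column (microsecond precision)
    statement => "SELECT id, created_at FROM events"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "myindex"
  }
}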

BarathVutukuri

2 Answers

10

The Z is not a result of the truncation; it's the GMT timezone designator.

ES supports microseconds, too, provided you've specified the correct date format in your mapping.

If the date field in your mapping is specified like this:

    "date": {
      "type": "date",
      "format": "yyyy-MM-dd'T'HH:mm:ss.SSSSSS"
    }

Then you can index your dates with the same microsecond precision as in your database.

UPDATE

Here is a full re-creation that shows you that it works:

PUT myindex
{
  "mappings": {
    "doc": {
      "properties": {
        "date": {
          "type": "date",
          "format": "yyyy-MM-dd'T'HH:mm:ss.SSSSSS"
        }
      }
    }
  }
}

PUT myindex/doc/1
{
  "date": "2018-05-02T08:13:29.268482"
}
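
Retrieving the document confirms that the microseconds are preserved; since Elasticsearch returns the _source verbatim, the response contains the full original value:

GET myindex/doc/1

The date in the returned _source is 2018-05-02T08:13:29.268482, exactly as it was indexed.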
Val
  • I tried updating the mapping with the syntax you mentioned, but the update fails with a mapper_parsing_exception. You can see the error logs at https://i.imgur.com/NkuxouL.png and https://i.imgur.com/p6kj5Wt.png – BarathVutukuri May 02 '18 at 09:45
  • 1
    Unfortunately, you cannot modify the mapping of an existing date field; you need to delete your index and recreate it, or create another index with the right mapping and reindex your current index into the new one. – Val May 02 '18 at 09:54
  • I tried creating a new index with the field format you specified, but I am still getting the same mapper_parsing_exception error. It looks like Elasticsearch doesn't support timestamps in microsecond format. – BarathVutukuri May 02 '18 at 10:13
  • Thanks. My bad, I was trying to create the index without adding the properties key. – BarathVutukuri May 02 '18 at 13:13
  • Gotcha, no worries, glad we figured it out! – Val May 02 '18 at 13:14
2

Side note: the "date" datatype stores values with millisecond precision in Elasticsearch, so when nanosecond-level precision is needed in date range queries, the appropriate datatype is date_nanos.
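
For example, a minimal sketch of such a mapping could look like this (date_nanos requires Elasticsearch 7.0+, hence the typeless mapping format; the index name is illustrative):

PUT myindex_nanos
{
  "mappings": {
    "properties": {
      "date": {
        "type": "date_nanos"
      }
    }
  }
}

PUT myindex_nanos/_doc/1
{
  "date": "2018-05-02T08:13:29.268482"
}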

JulienG