I am trying to implement the solution described in the following answer:
https://stackoverflow.com/a/27867252/740839
But Elasticsearch throws the following exception, saying it is unable to parse the @timestamp field:
[2015-01-30 12:09:39,513][DEBUG][action.bulk ] [perfgen 1] [logaggr-2015.01.30][2] failed to execute bulk item (index) index {[logaggr-2015.01.30][logs][c2s5PliTSGKmZSXUWzlkNw], source[{"message":"2015-01-29 17:30:31,579 [ERROR] [pool-1-thread-9] [LogGenerator] invocation count=813,time=2015-01-29 17:30:31,578,metric=-9080142057551045424","@version":"1","@timestamp":"2015-01-30T19:10:53.891Z","host":"perfdev","path":"/home/user/work/elk/logaggr-test/LogAggr_Test.log","logts":"2015-01-29 17:30:31,579","level":"ERROR","thread":"pool-1-thread-9","classname":"LogGenerator","details":"invocation count=813,time=2015-01-29 17:30:31,578,metric=-9080142057551045424"}]}
org.elasticsearch.index.mapper.MapperParsingException: failed to parse [@timestamp]
at org.elasticsearch.index.mapper.core.AbstractFieldMapper.parse(AbstractFieldMapper.java:414)
at org.elasticsearch.index.mapper.object.ObjectMapper.serializeValue(ObjectMapper.java:648)
at org.elasticsearch.index.mapper.object.ObjectMapper.parse(ObjectMapper.java:501)
at org.elasticsearch.index.mapper.DocumentMapper.parse(DocumentMapper.java:542)
at org.elasticsearch.index.mapper.DocumentMapper.parse(DocumentMapper.java:491)
at org.elasticsearch.index.shard.service.InternalIndexShard.prepareCreate(InternalIndexShard.java:376)
at org.elasticsearch.action.bulk.TransportShardBulkAction.shardIndexOperation(TransportShardBulkAction.java:451)
at org.elasticsearch.action.bulk.TransportShardBulkAction.shardOperationOnPrimary(TransportShardBulkAction.java:157)
at org.elasticsearch.action.support.replication.TransportShardReplicationOperationAction$AsyncShardOperationAction.performOnPrimary(TransportShardReplicationOperationAction.java:535)
at org.elasticsearch.action.support.replication.TransportShardReplicationOperationAction$AsyncShardOperationAction$1.run(TransportShardReplicationOperationAction.java:434)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:722)
Caused by: org.elasticsearch.index.mapper.MapperParsingException: failed to parse date field [2015-01-30T19:10:53.891Z], tried both date format [yyyy-MM-dd HH:mm:ss,SSS], and timestamp number with locale []
at org.elasticsearch.index.mapper.core.DateFieldMapper.parseStringValue(DateFieldMapper.java:610)
at org.elasticsearch.index.mapper.core.DateFieldMapper.innerParseCreateField(DateFieldMapper.java:538)
at org.elasticsearch.index.mapper.core.NumberFieldMapper.parseCreateField(NumberFieldMapper.java:223)
at org.elasticsearch.index.mapper.core.AbstractFieldMapper.parse(AbstractFieldMapper.java:404)
... 12 more
Caused by: java.lang.IllegalArgumentException: Invalid format: "2015-01-30T19:10:53.891Z" is malformed at "T19:10:53.891Z"
at org.elasticsearch.common.joda.time.format.DateTimeFormatter.parseMillis(DateTimeFormatter.java:754)
at org.elasticsearch.index.mapper.core.DateFieldMapper.parseStringValue(DateFieldMapper.java:604)
... 15 more
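If I read the root cause correctly, the mapper is applying my log's timestamp format (yyyy-MM-dd HH:mm:ss,SSS) to the ISO8601 @timestamp that Logstash adds. A quick sanity check outside Elasticsearch (Python here, purely to illustrate the mismatch; the strptime patterns are my rough equivalents of the Joda formats):

```python
from datetime import datetime

# Rough strptime equivalent of the Joda format yyyy-MM-dd HH:mm:ss,SSS
# that the exception says the mapper tried.
log_fmt = "%Y-%m-%d %H:%M:%S,%f"

def parses(value, fmt):
    try:
        datetime.strptime(value, fmt)
        return True
    except ValueError:
        return False

# My raw log timestamp matches that format...
print(parses("2015-01-29 17:30:31,579", log_fmt))  # True

# ...but the ISO8601 @timestamp that Logstash adds does not,
# which lines up with the "malformed at 'T19:10:53.891Z'" error.
print(parses("2015-01-30T19:10:53.891Z", log_fmt))  # False

# An ISO8601-style pattern parses it fine.
print(parses("2015-01-30T19:10:53.891Z", "%Y-%m-%dT%H:%M:%S.%fZ"))  # True
```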
As seen in the "message" field above, my log statement looks like this:
2015-01-29 17:30:31,579 [ERROR] [pool-1-thread-9] [LogGenerator] invocation count=813,time=2015-01-29 17:30:31,578,metric=-9080142057551045424
I am not sure whether the problem is with my Logstash configuration. My Logstash filter looks like this:
filter {
  grok {
    match => [ "message", "%{TIMESTAMP_ISO8601:logts}%{SPACE}\[%{LOGLEVEL:level}%{SPACE}]%{SPACE}\[%{DATA:thread}]%{SPACE}\[%{DATA:classname}]%{SPACE}%{GREEDYDATA:details}" ]
  }
}
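For what it's worth, one variation I considered (it is not in the config above) was adding a date filter so that logts is parsed with my log's format explicitly, something along these lines:

```
filter {
  # Sketch only: parse the grok-extracted logts field using the
  # same pattern as my log timestamps. Note this would overwrite
  # @timestamp with the logts value by default.
  date {
    match => [ "logts", "yyyy-MM-dd HH:mm:ss,SSS" ]
  }
}
```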
and my logstash output is:
output {
  elasticsearch {
    cluster       => "perfgen"
    host          => "10.1.1.1"
    port          => 9201
    index         => "logaggr-%{+YYYY.MM.dd}"
    protocol      => "http"
    template      => "logaggr-test.json"
    template_name => "logaggr"
  }
}
and my template "logaggr-test.json" is:
{
  "template": "logaggr-*",
  "mappings": {
    "logaggr": {
      "date_detection": false,
      "properties": {
        "_timestamp": { "type": "date", "enabled": true, "store": true },
        "logts":      { "type": "date" },
        "level":      { "type": "string" },
        "thread":     { "type": "string" },
        "classname":  { "type": "string" },
        "details":    { "type": "string" }
      }
    }
  }
}
I have tried adding a default mapping, etc., but I can't get past the parsing exception.
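For example, one of the template variations I tried gives @timestamp and logts explicit date formats in the properties section (the format names are my best reading of the docs, so I may have them wrong):

```json
{
  "properties": {
    "@timestamp": { "type": "date", "format": "dateOptionalTime" },
    "logts":      { "type": "date", "format": "yyyy-MM-dd HH:mm:ss,SSS" }
  }
}
```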
To reiterate the problem I am trying to solve: I am trying to set up Logstash to parse my log file and index it into Elasticsearch. In the process, I want to capture the timestamp of my log message (logts), @timestamp (added by Logstash), and _timestamp (added by Elasticsearch).
I'd appreciate any help.