
I am writing logs to a log file from my Django app, and from there I am shipping those logs to elasticsearch. Because I also want to split out the fields, I am using logstash between filebeat and elasticsearch.
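
For context, the logstash side is just a beats input, a grok filter, and an elasticsearch output, roughly like this (a minimal sketch; the port, the host, and the actual pattern are placeholders):

input {
  beats {
    port => 5044                          # filebeat ships the log lines here
  }
}

filter {
  grok {
    match => { "message" => "..." }       # the pattern I am trying to get right (see below)
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]           # placeholder host
  }
}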

Here is a sample log line:

2019-03-19 13:39:06 logfile INFO save_data {'field1': None, 'time': '13:39:06', 'mobile': '9876543210', 'list_item': "[{'item1': 10, 'item2': 'path/to/file'}]", 'response': '{some_complicated_json}}', 'field2': 'some data', 'date': '19-03-2019', 'field3': 'some other data'}

I tried to write a Grok match pattern, but all the fields are going into the message field:

%{TIMESTAMP_ISO8601:temp_date}%{SPACE} %{WORD:logfile} %{LOGLEVEL:level} %{WORD:save_data} %{GREEDYDATA:message}  

How can I write a Grok match pattern that decomposes the above log entry?

sid8491

1 Answer


I don't know how you could do this with Grok, but the way we do it is with a json processor in an elasticsearch ingest node pipeline. Something like this:

{
    "my-log-pipeline": {
        "description": "My log pipeline",
        "processors": [{
            "json": {
                "field": "message",
                "target_field": "messageFields"
            }
        }]
    }
}
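
You'd create it by PUTting it to the ingest API, something like this (e.g. from Kibana Dev Tools; the pipeline id is whatever name you pick):

PUT _ingest/pipeline/my-log-pipeline
{
    "description": "My log pipeline",
    "processors": [{
        "json": {
            "field": "message",
            "target_field": "messageFields"
        }
    }]
}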

Then you just need to tell your source (filebeat/logstash) to use this pipeline when ingesting.
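
With filebeat shipping straight to elasticsearch, for example, that would be something like this in filebeat.yml (output section only; the host is a placeholder):

output.elasticsearch:
  hosts: ["localhost:9200"]
  pipeline: my-log-pipeline   # run every event through the ingest pipeline above

The logstash elasticsearch output has an equivalent pipeline option if you keep logstash in the middle.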

Adam T