I am writing logs to a log file from my Django app, and from there I am shipping them to Elasticsearch. Because I also want to split each log entry into separate fields, I am using Logstash between Filebeat and Elasticsearch.
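For context, a rough sketch of my Logstash pipeline (the port, host, and index name are placeholders for my actual values):

input {
  beats {
    # Filebeat ships the Django log file to this port
    port => 5044
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "django-logs"
  }
}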
Here is a sample log entry:
2019-03-19 13:39:06 logfile INFO save_data {'field1': None, 'time': '13:39:06', 'mobile': '9876543210', 'list_item': "[{'item1': 10, 'item2': 'path/to/file'}]", 'response': '{some_complicated_json}}', 'field2': 'some data', 'date': '19-03-2019', 'field3': 'some other data'}
I tried to write a grok match pattern, but all of the fields end up in the message field:
%{TIMESTAMP_ISO8601:temp_date}%{SPACE} %{WORD:logfile} %{LOGLEVEL:level} %{WORD:save_data} %{GREEDYDATA:message}
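For completeness, this is how that pattern sits in my filter block (a minimal sketch; I have no other filters configured):

filter {
  grok {
    # Same pattern as above; everything after save_data is captured
    # by GREEDYDATA into "message"
    match => { "message" => "%{TIMESTAMP_ISO8601:temp_date}%{SPACE} %{WORD:logfile} %{LOGLEVEL:level} %{WORD:save_data} %{GREEDYDATA:message}" }
  }
}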
How can I write a grok match pattern that decomposes the above log entry into separate fields?
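To make the goal concrete, I would like the dict keys from the entry to become top-level fields in the resulting Elasticsearch document, roughly like this (values copied from the sample above):

{
  "temp_date": "2019-03-19 13:39:06",
  "level": "INFO",
  "mobile": "9876543210",
  "time": "13:39:06",
  "date": "19-03-2019",
  "field2": "some data",
  "field3": "some other data"
}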