
Source log sample from the message field:

{"log":"2022/02/15 22:47:07 insert into public.logs (time, level, message, hostname, loggerUID, appmodule) values ('2022-02-15 22:47:07.494330952','ERROR','GetRequestsByUserv2 :pq: column \"rr.requestdate\" must appear in the GROUP BY clause or be used in an aggregate function','ef005e6da6f6','ba282127-6ef6-4238-9287-d7127a8d1996','eReturn')\n","stream":"stderr","time":"2022-02-15T14:47:07.495133571Z"}

I am trying to extract the level (e.g. ERROR) as a separate field from the above log using ingest pipelines in Elastic, so that logs can be segregated by level, such as ERROR, WARNING, or INFO.

I tried the split processor, but was not able to get the desired output. Any help would be appreciated.

Prasad

1 Answer


You can use the grok processor, whose pattern syntax supports regex:

%{DATA:preerror} values \('%{DATA:date}','%{DATA:error}'%{GREEDYDATA:posterror}

Then you can remove the preerror, date, and posterror fields that you don't need.
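
For illustration, a complete pipeline combining this pattern with a remove processor might look like the sketch below. The pipeline name log-level-pipeline is made up, and the text to parse is assumed to be in the message field as described in the question; note the doubled backslash before the parenthesis, which is required inside a JSON string:

PUT _ingest/pipeline/log-level-pipeline
{
  "description": "Extract the log level from the SQL insert statement (sketch)",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": [
          "%{DATA:preerror} values \\('%{DATA:date}','%{DATA:error}'%{GREEDYDATA:posterror}"
        ]
      }
    },
    {
      "remove": {
        "field": ["preerror", "date", "posterror"],
        "ignore_missing": true
      }
    }
  ]
}

After this runs, the error field holds the level (ERROR, WARNING, INFO) and can be used to filter or segregate the logs.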

dcolazin
  • Thank you for the suggestion, I will go through grok. – Prasad Feb 15 '22 at 16:26
  • I was able to get the log level with the pattern values \('%{DATA}','%{DATA:LOGLEVEL}'. The pattern works fine in the Grok Debugger but fails to validate in ingest pipelines. – Prasad Feb 16 '22 at 17:18
  • What does the Logstash log say? Did you try with my pattern? – dcolazin Feb 16 '22 at 17:42
  • 1
    Its not configured from logstash, I was testing it directly by feeding the data in Elastic cloud "create ingest pipeline" options. – Prasad Feb 28 '22 at 05:07
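
As a side note on the testing discussed in the comments: instead of (or in addition to) the Grok Debugger, the pattern can be exercised against the sample document with the _simulate API, roughly like this (the message value is abbreviated from the sample in the question):

POST _ingest/pipeline/_simulate
{
  "pipeline": {
    "processors": [
      {
        "grok": {
          "field": "message",
          "patterns": [
            "%{DATA:preerror} values \\('%{DATA:date}','%{DATA:error}'%{GREEDYDATA:posterror}"
          ]
        }
      }
    ]
  },
  "docs": [
    {
      "_source": {
        "message": "2022/02/15 22:47:07 insert into public.logs (time, level, message, hostname, loggerUID, appmodule) values ('2022-02-15 22:47:07.494330952','ERROR','GetRequestsByUserv2 :pq: column ...','ef005e6da6f6','ba282127-6ef6-4238-9287-d7127a8d1996','eReturn')"
      }
    }
  ]
}

The response shows the resulting document, including the extracted error field, which makes it easier to spot escaping problems (for example, a single backslash that is accepted in the Grok Debugger but rejected inside the JSON pipeline definition).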