
I am wondering what the best approach is for my Logstash grok filters. Some of my filters are for specific log entries and won't apply to all entries; the ones that don't apply always generate _grokparsefailure tags. For example, I have one grok filter that applies to every log entry, and it works fine.

Logstash also parses other, unwanted log lines that I don't need.

I'd prefer to have it just skip the rule when there isn't a match, instead of adding the parsefailure tag. I use the parsefailure tag to find things that aren't parsing properly, not things that simply didn't match a particular filter. Maybe it's just the nomenclature "parse failure" that gets me. To me, that means there's something wrong with the filter (e.g. it's badly formatted), not that it didn't match.

James Z
HELLBOY
  • maybe use a conditional on the message (like [here](https://stackoverflow.com/questions/20849583/how-to-handle-non-matching-logstash-grok-filters)) to avoid having unwanted message go through the grok filter? – baudsp Aug 10 '21 at 13:12
  • Hi @baudsp, I have a GREEDYDATA field at the start of the log line, followed by a timestamp and the rest. But grok is also parsing those logs that do not have the first parameter defined as GREEDYDATA, and I do not want to keep those events in my Elasticsearch. – HELLBOY Aug 25 '21 at 05:26
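
Following baudsp's suggestion, a conditional can keep non-matching lines out of the grok filter entirely, so they never get the _grokparsefailure tag. A minimal sketch, assuming the wanted lines can be recognized by a pattern on [message] (the regex, pattern, and field names here are illustrative, not from the original config):

```
filter {
  # Only run grok on lines that look like the expected format
  if [message] =~ /\d{4}-\d{2}-\d{2}/ {
    grok {
      match => { "message" => "%{GREEDYDATA:prefix} %{TIMESTAMP_ISO8601:ts} %{GREEDYDATA:rest}" }
    }
  } else {
    # Lines without the expected shape are dropped instead of indexed
    drop { }
  }
}
```

The drop {} branch addresses the second half of the comment: events that don't carry the expected leading field are discarded before they reach Elasticsearch.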

1 Answer


If you want some grok filters to add _grokparsefailure and not others, then for the latter type use

```
tag_on_failure => []
```
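
In context, that option goes inside the grok block whose non-matches should be silent; a sketch (the pattern and field names are illustrative, not from the original config):

```
filter {
  grok {
    # This pattern only applies to some entries, so suppress the
    # default _grokparsefailure tag when it doesn't match
    match => { "message" => "%{TIMESTAMP_ISO8601:ts} %{LOGLEVEL:level} %{GREEDYDATA:msg}" }
    tag_on_failure => []
  }
}
```

Non-matching events then pass through untagged, while any other grok filters that keep the default tag_on_failure still flag genuine parse problems.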
Badger