These are Check Point firewall logs, and they look like this (first row = field names; second row and every row after it = the values of the respective fields):

 "Number" "Date" "Time" "Interface" "Origin" "Type" "Action" "Service" "Source Port" "Source" "Destination" "Protocol" "Rule" "Rule Name" "Current Rule Number" "Information" 
"7319452" "18Mar2015" "15:00:00" "eth1-04" "grog1" "Log" "Accept" "domain-udp" "20616" "172.16.36.250" "8.8.8.8" "udp" "7" "" "7-open_1" "inzone: Internal; outzone: External; service_id: domain-udp" "Security Gateway/Management"

I have tried approaching this bit by bit, starting from some grok filter examples I found online. As a test, I have a file that contains nothing more than "GoLpoT" "502" (quotes included)

and the following code that reads this file:

input {
  file {
    path => "/usr/local/bin/firewall_log"
  }
}

filter {
  grok {
    match => ["message", "%{WORD:type}\|%{NUMBER:nums}"]
  }
}

output {
  elasticsearch { host => localhost }
  stdout { codec => rubydebug }
}

When I run this, I get the following output; the _grokparsefailure tag shows that the pattern did not match:

{
       "message" => "",
      "@version" => "1",
    "@timestamp" => "2015-04-30T15:52:48.331Z",
          "host" => "UOD-220076",
          "path" => "/usr/local/bin/firewall_log",
          "tags" => [
        [0] "_grokparsefailure"
    ]
}

Any help please.
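
As Alain Collins points out in the comments below, the pattern above expects two pipe-separated values, while the input is two quoted strings separated by a space. A filter that matches the actual input would escape the literal quotes; a minimal sketch:

filter {
  grok {
    # \" matches the literal quote around each value
    match => ["message", "\"%{WORD:type}\" \"%{NUMBER:nums}\""]
  }
}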

My second question: how do I parse the Date and Time fields - together or separately? The date never changes (all the logs are from a single day); only the time changes.
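
One common approach is to combine the two fields into one and hand the result to the date filter. A sketch, assuming grok (or csv) has already captured them as Date ("18Mar2015") and Time ("15:00:00"):

filter {
  # Join the two fields into a single string, e.g. "18Mar2015 15:00:00"
  mutate {
    add_field => { "datetime" => "%{Date} %{Time}" }
  }
  # Parse it with a Joda-style format and write the result to @timestamp
  date {
    match => ["datetime", "ddMMMyyyy HH:mm:ss"]
    target => "@timestamp"
  }
}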

Many thanks.

  • You say that your input file contains two quoted strings, but that's not what you're parsing with grok. – Alain Collins Apr 30 '15 at 17:14
  • Thanks Collins, but could you please explain what you mean? My input file contains two strings, "GoLpoT" and "502" - and that is what I am attempting to parse. – DannyKELK Apr 30 '15 at 17:41
  • Your input is: "GoLpoT" "502" (two quoted strings separated by a space), but your grok pattern is: "%{WORD:type}\|%{NUMBER:nums}" (two pipe-separated values). Grok patterns need to match your input. – Alain Collins Apr 30 '15 at 17:46
  • Thx. Still getting errors. Input file now "Chckpoint" "502" "10.189.7.138" "Allow" "18 Mar 2015 15:00:01" and new code: input { file { path => "/usr/local/bin/firewall_log" } } filter { grok { match => ["message", ""%{WORD:type}" "%{NUMBER:nums}" "%{IP:sourceip}" "%{WORD:Action}""] add_tag => "checkpoint" } date { match => ["DATETIME", ""%{dd mmm yyyy hh:mm:ss}""] target => "@timestamp" } } output { elasticsearch { host => localhost } stdout { codec => rubydebug } } (a cleaned-up sketch of this config follows these comments) – DannyKELK Apr 30 '15 at 18:09
  • That pattern works for me with that input in the debugger. – Alain Collins Apr 30 '15 at 19:08
  • Collins, thanks very much for your help. I have decided to delete the commas and have the entries separated by spaces only, and it's working now. I have a new problem, though, which I have posted at the following link. Help, please. http://stackoverflow.com/questions/29975826/trouble-with-log-stash-timestamp – DannyKELK Apr 30 '15 at 19:15
  • I think worth checking this answer in regard to the current question: http://stackoverflow.com/questions/28957870/how-to-remove-all-fields-with-null-value-in-logstash-filter – st2rseeker Mar 24 '16 at 16:36
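
For reference, the config pasted in the follow-up comment (Apr 30 '15 at 18:09) has two problems beyond the commas: the literal quotes inside the grok pattern are not escaped, and the date filter references a DATETIME field that is never created, using a grok-style pattern where a Joda-style format string is expected. A cleaned-up sketch, assuming the five-field input "Chckpoint" "502" "10.189.7.138" "Allow" "18 Mar 2015 15:00:01" from that comment:

input {
  file {
    path => "/usr/local/bin/firewall_log"
  }
}

filter {
  grok {
    # Escape the literal quotes; GREEDYDATA captures the quoted date-time string
    match => ["message", "\"%{WORD:type}\" \"%{NUMBER:nums}\" \"%{IP:sourceip}\" \"%{WORD:Action}\" \"%{GREEDYDATA:datetime}\""]
    add_tag => ["checkpoint"]
  }
  date {
    # "18 Mar 2015 15:00:01" in Joda notation
    match => ["datetime", "dd MMM yyyy HH:mm:ss"]
    target => "@timestamp"
  }
}

output {
  elasticsearch { host => localhost }
  stdout { codec => rubydebug }
}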

0 Answers