They are Check Point firewall logs and they look like this (first row = field names; second row and every row thereafter = the values for those fields):
"Number" "Date" "Time" "Interface" "Origin" "Type" "Action" "Service" "Source Port" "Source" "Destination" "Protocol" "Rule" "Rule Name" "Current Rule Number" "Information"
"7319452" "18Mar2015" "15:00:00" "eth1-04" "grog1" "Log" "Accept" "domain-udp" "20616" "172.16.36.250" "8.8.8.8" "udp" "7" "" "7-open_1" "inzone: Internal; outzone: External; service_id: domain-udp" "Security Gateway/Management"
I have been trying to work through this bit by bit, using some grok filter examples I found online.
I have a file that has nothing more than
"GoLpoT" "502"
(quotes included)
and a Logstash config that reads this file, pasted below:
input {
  file {
    path => "/usr/local/bin/firewall_log"
  }
}

filter {
  grok {
    match => ["message", "%{WORD:type}\|%{NUMBER:nums}"]
  }
}

output {
  elasticsearch { host => localhost }
  stdout { codec => rubydebug }
}
When I run it, the event comes out tagged with a grok parse failure:
"message" => "",
"@version" => "1",
"@timestamp" => "2015-04-30T15:52:48.331Z",
"host" => "UOD-220076",
"path" => "/usr/local/bin/firewall_log",
"tags" => [
[0] "_grokparsefailure"
Any help would be appreciated.
My second question: how do I parse the Date and Time fields, together or separately?
The date doesn't change - it's all logs from one day - it's only the time that changes.
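To frame what I mean: from the docs I gather the date filter can parse a single combined field, so I imagine something like the sketch below, with the field names taken from my header row and the Joda-Time format guessed from "18Mar2015" / "15:00:00". I have not tested this:

```
filter {
  # Assumption: Date and Time were already extracted as separate fields by grok.
  mutate {
    add_field => { "timestamp" => "%{Date} %{Time}" }
  }
  date {
    # "18Mar2015 15:00:00" would be ddMMMyyyy HH:mm:ss in Joda-Time notation
    match => ["timestamp", "ddMMMyyyy HH:mm:ss"]
  }
}
```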
Many thanks.