
I followed this Elastic tutorial: Parsing your first logs with Logstash

It works great, and I can use Discover to see the Apache logs in Kibana. Yay!

So now I have my Palo Alto FW logs and have adjusted the Filebeat config from the tutorial as follows:

paths:
  - /var/log/logstash/logstash-tutorial.log
  - /var/log/PaloaltoFW/PA-220-Dev.log
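To keep the two logs distinguishable downstream, each Filebeat input can carry a custom field. A sketch, assuming Filebeat's `filebeat.inputs` syntax; the `log_type` field name and its values are my own choice, not from the tutorial:

```yaml
filebeat.inputs:
- type: log
  paths:
    - /var/log/logstash/logstash-tutorial.log
  fields:
    log_type: tutorial
- type: log
  paths:
    - /var/log/PaloaltoFW/PA-220-Dev.log
  fields:
    log_type: palofw
```

Logstash can then branch on `[fields][log_type]` in its filter and output sections.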

And I get an ERROR in Kibana on the filebeat-* index pattern, which goes away if I switch the default to the tutorial's logstash-%DATE-00001 pattern!

pipelines.yml

- pipeline.id: sample
  path.config: "/etc/logstash/conf.d/first-pipeline.conf"

- pipeline.id: main
  path.config: "/etc/logstash/conf.d/PaloFW-pipeline.conf"

- pipeline.id: rsyslog
  path.config: "/etc/logstash/conf.d/rsyslog.conf"
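One thing worth checking with multiple pipelines: each pipeline runs its own inputs, so two pipelines cannot both bind a beats input on the same port 5044. A sketch of giving each config its own port (the second port number is an arbitrary example, not from the tutorial):

```
# first-pipeline.conf
input { beats { port => 5044 } }

# PaloFW-pipeline.conf
input { beats { port => 5045 } }
```

Filebeat's `output.logstash.hosts` setting then has to point at the matching port.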

I also remove the Filebeat registry files.

My PaloFW-pipeline.conf:

# The # character at the beginning of a line indicates a comment. Use
# comments to describe your configuration.
input {
  beats {
    port => "5044"
    type => "syslog"
  }
}
# The filter part of this file is commented out to indicate that it is
# optional.
filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    date {
      match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
  kv {}
}
output {
  elasticsearch {
    hosts => [ "10.10.69.20:9200" ]
  }
}

I can now see the logs in Kibana, but I had to switch to my default index, and filebeat-* still gets the error mentioned at the beginning of the post. Ideally I want to parse the tutorial.log and my firewall.log on the Logstash server and have them show up as separate indices and index patterns... once I do that I will move on to other use cases.

So they load into the tutorial's logstash-%DATE%-0001 index and I can see them; my filters could be better. But I need to sort out how indices work, and whether I can use Filebeat for more than one .log file and create more than one index.
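For separate indices, the elasticsearch output's `index` option can be set per branch of a conditional. A sketch of how the output section could route each source to its own index, assuming the `type` field set by the beats input above; the index names here are examples of my own, not from the tutorial:

```
output {
  if [type] == "syslog" {
    elasticsearch {
      hosts => [ "10.10.69.20:9200" ]
      index => "palofw-%{+YYYY.MM.dd}"
    }
  } else {
    elasticsearch {
      hosts => [ "10.10.69.20:9200" ]
      index => "tutorial-%{+YYYY.MM.dd}"
    }
  }
}
```

Each index family would then need its own index pattern in Kibana (e.g. palofw-*, tutorial-*).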

Many thanks community for any clarification!

TheftAuto
  • See https://stackoverflow.com/a/20562031/6113627 – baudsp Sep 11 '19 at 08:57
  • But where does it identify the automatic index creation for the log file being processed? `input { file { type => "technical" path => "/home/technical/log" } file { type => "business" path => "/home/business/log" } } filter { if [type] == "technical" { # processing ....... } if [type] == "business" { # processing ....... } } output { if [type] == "technical" { # output to gelf } if [type] == "business" { # output to elasticsearch } }` – TheftAuto Sep 11 '19 at 14:36
  • By default, the elasticsearch logstash output plugin uses the `logstash-%{+YYYY.MM.dd}` index (so one index per day) (from the [doc](https://www.elastic.co/guide/en/logstash/current/plugins-outputs-elasticsearch.html#plugins-outputs-elasticsearch-index)) – baudsp Sep 11 '19 at 16:21
