
I have the following log entry:

2017-08-29 01:10:11.111 [http-noo-111-exe-1] TRACE com.javasystemsolutions.xml.gateway.Actions - The XML Gateway encountered an error. The message was Server with id OPA is not configured.

The template in use was TEST_Create_Incident_elkmonitoring.

The server in use was OPA.

The input XML was <incident>
       <summary>Test Monitoring - Summary</summary>
       <notes>Test Monitoring - Summary</notes>
       <product>ELK FAQ</product>
</incident>
com.javasystemsolutions.xml.gateway.ServerNotFoundException: Server with id OPA is not configured
       at com.javasystemsolutions.xml.gateway.input.PostActions.doPost(PostActions.java:215) [jss-xmlgateway.jar:?]
       at com.javasystemsolutions.xml.gateway.input.PostActions.postAction(PostActions.java:86) [jss-xmlgateway.jar:?]

What I'm trying to do is use a regex to extract the text between the `<incident>` tags, but something seems to be wrong, even though my regular expression works on the regex101 website and `--configtest` returns Configuration OK. My config is below; does someone have an idea of what is wrong?

# The # character at the beginning of a line indicates a comment. Use
# comments to describe your configuration.
input {
    file {
        type => "logs"
        path => "C:/logs/*.log"
        add_field => [ "Application", "ELK_GW_Test" ]
        add_field => [ "secret", "1234" ]
        start_position => "beginning"

        codec => multiline {
            pattern => "(^%{TIMESTAMP_ISO8601})"
            #negate => true
            what => "previous"
        }
    }
}
filter {
    #multiline {
      #pattern => "(^%{TIMESTAMP_ISO8601})"
      #negate => true
      #what => "previous"
    #}
    #if "_grokparsefailure" in [tags] {
      #drop { }
    #}
    if [host] == "host1" {
        grok {
            match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{NOTSPACE} %{LOGLEVEL:Severity} %{GREEDYDATA:log_message}"}
        }
        grok {
            match => { "message" => "<incident>(?<incident>[\s\S]*)</incident>" }
        }
    }
}
output {
    tcp {
        host => "host1.com"
        port => 1234
        codec => "json_lines"
    }
    #if  "The message was Server with id " in [log_message]  {
    #email {
            #from => "log-mailer@XYZ.com"
            #subject => "Central logstash alert"
            #to => "my_email@ABC.com"
            #via => "smtp"
            #body => "The incident details are: %{incident} \nLog file: %{path}"
            #options => {
                #starttls => "true"
                #smtpIporHost => "email.XYZ.com"
                #port => "587"
                #userName => "log-mailer@XYZ.com" 
                # email-server-mail-id
                # password => "password"
                #authenticationType => "LOGIN"
            #}
        #}
    #}
}

1 Answer


This part of the configuration is wrong:

    grok {
        match => ["requested_incident", "(?s)<incident>.+?</incident>"]
    }

Try this instead:

    grok {
        match => {"message" => "<incident>(?<incident>[\s\S]*)</incident>"}
    }

I've used a custom pattern, which searches in the `message` field. Whatever the named group matches goes into a new field called `incident`.
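
For illustration, a minimal sketch of how the named capture behaves (the sample value below is shortened; everything except the field names `message` and `incident` is taken from the question):

    filter {
        grok {
            # (?<incident>...) is a named capture group: whatever it matches
            # is stored in a new field called "incident".
            # [\s\S]* also matches newlines, which a plain . does not by
            # default, so the capture works across the joined multiline event.
            match => { "message" => "<incident>(?<incident>[\s\S]*)</incident>" }
        }
    }

With the log entry from the question, the event would gain a field roughly like:

    "incident" => "<summary>Test Monitoring - Summary</summary> ..."

Note that `[\s\S]*` is greedy, so if an event ever contained several `<incident>` blocks, it would capture from the first opening tag to the last closing tag.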

  • Thanks for your help. It seems there's no problem with my config now, but for an unknown reason nothing is parsed into my index. – Fotis E. Sep 07 '17 at 12:27
  • The problem in your config is that the field `requested_incident` does not exist, so when you try to match this field with your pattern, nothing happens. In addition, the pattern you specify (`(?s).+?`) does not create a new field with the captured value. That's why I used a [custom pattern](https://www.elastic.co/guide/en/logstash/current/plugins-filters-grok.html#_custom_patterns) in my answer. – baudsp Sep 07 '17 at 13:57
  • Your configuration is valid, in the sense that Logstash will run and process events, but it is wrong in that it does not do what you intended. – baudsp Sep 07 '17 at 13:59
  • I added the full config above (please ignore the email alert part, since I haven't worked on it yet). So as you can see, I added your regex and I don't get any error during `--configtest`, but no data is parsed into my index. – Fotis E. Sep 07 '17 at 14:26
  • I added the multiline codec in the input section because I was getting the following message: "Defaulting pipeline worker threads to 1 because there are some filters that might not work with multiple worker threads" (the usual multiline idiom is sketched after these comments). – Fotis E. Sep 07 '17 at 14:31
  • Thank you. From your configuration, there might be two problems: a) if `[host] == "host1"` is not true, the logs won't be processed by the grok filters; b) the file input plugin reads each file only once, so if the files have not changed, you have to [force logstash to reparse the files](https://stackoverflow.com/questions/19546900/how-to-force-logstash-to-reparse-a-file) (see the debugging sketch after these comments). – baudsp Sep 07 '17 at 14:44
  • In addition, I find it useful to use a `stdout { codec => json }` output, to avoid having to check in Elasticsearch when debugging a configuration (also sketched below). – baudsp Sep 07 '17 at 14:47
  • As it seems, some handlers are missing from our Tomcat server, which is why the parsing doesn't work. The error I get is the following: Can't load log handler "1catalina.org.apache.juli.FileHandler" java.lang.ClassNotFoundException: 1catalina.org.apache.juli.FileHandler – Fotis E. Sep 08 '17 at 12:36
  • I don't understand your last comment or how it relates to your Logstash configuration. If the problem is not with Logstash, you'll have to create another question. – baudsp Sep 08 '17 at 13:32
  • You are right, it is not related; I just shared why I couldn't run my config. In any case, and for future reference, I fixed it by using the latest Logstash version. Thanks again for your help! :) – Fotis E. Sep 08 '17 at 14:18
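
On the multiline codec discussed above: the question's config has `negate => true` commented out, which makes `what => "previous"` join the lines that do start with a timestamp onto the previous event, the opposite of what is wanted here. A sketch of the usual idiom for timestamp-prefixed logs (path and pattern taken from the question):

    input {
        file {
            path => "C:/logs/*.log"
            start_position => "beginning"
            codec => multiline {
                # Any line that does NOT start with a timestamp is a
                # continuation (XML payload, stack trace) of the previous line.
                pattern => "^%{TIMESTAMP_ISO8601}"
                negate => true
                what => "previous"
            }
        }
    }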
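
And a sketch of the debugging setup suggested in the comments, for rerunning the same files until the filters behave (`sincedb_path => "NUL"` is an assumption for Windows; on Linux/macOS it would be `/dev/null`):

    input {
        file {
            path => "C:/logs/*.log"
            start_position => "beginning"
            # Discard the read-position bookkeeping so the same files are
            # reparsed from the start on every run. "NUL" is the Windows
            # null device; use "/dev/null" on Linux/macOS.
            sincedb_path => "NUL"
        }
    }
    output {
        # Print every event to the console instead of checking Elasticsearch.
        stdout { codec => json }
    }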