
I have been told that by using a Logstash pipeline I can re-create the log format (i.e. convert it to JSON) before it enters Elasticsearch, but I don't understand how to do it.

Current Logstash configuration (I took the below from Google, not for any particular reason):

/etc/logstash/conf.d/metrics-pipeline.conf
input {
  beats {
    port => 5044
    client_inactivity_timeout => "3600"
  }
}




filter {
    if [message] =~ />/ {
        dissect {
            mapping => {
                "message" => "%{start_of_message}>%{content}"
            }
        }

        kv {
            source => "content"
            value_split => ":"
            field_split => ","
            trim_key => "\[\]"
            trim_value => "\[\]"
            target => "message"
        }

        mutate {
            remove_field => ["content","start_of_message"]
        }
    }
}



filter {
  if [system][process] {
    if [system][process][cmdline] {
      grok {
        match => {
          "[system][process][cmdline]" => "^%{PATH:[system][process][cmdline_path]}"
        }
        remove_field => "[system][process][cmdline]"
      }
    }
  }

 grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  date {
    match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ]
  }

}



output {
  elasticsearch {
    hosts => "1.2.1.1:9200"
    manage_template => false
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
  }
}

I have a couple of log files located at:

/root/logs/File.log
/root/logs/File2.log

The log format there is:

08:26:51,753 DEBUG [ABC] (default-threads - 78) (1.2.3.4)(368)>[TIMESTAMP:Wed Sep 11 08:26:51 UTC 2019],[IMEI:03537],[COMMAND:INFO],[GPS STATUS:true],[INFO:true],[SIGNAL:false],[ENGINE:0],[DOOR:0],[LON:90.43],[LAT:23],[SPEED:0.0],[HEADING:192.0],[BATTERY:100.0%],[CHARGING:1],[O&E:CONNECTED],[GSM_SIGNAL:100],[GPS_SATS:5],[GPS POS:true],[FUEL:0.0V/0.0%],[ALARM:NONE][SERIAL:01EE]

In Kibana, by default it shows like this:

https://imgshare.io/image/stackflow.I0u7S https://imgshare.io/image/jsonlog.IHQhp

 "message": "21:33:42,004 DEBUG [LOG] (default-threads - 100) (1.2.3.4)(410)>[TIMESTAMP:Sat Sep 07 21:33:42 UTC 2019],[TEST:123456],[CMD:INFO],[STATUS:true],[INFO:true],[SIGNAL:false],[ABC:0],[DEF:0],[GHK:1111],[SERIAL:0006]"

but I want to get it like below:

"message": {
      "TIMESTAMP": "Sat Sep 07 21:33:42 UTC 2019",
      "TEST": "123456",
      "CMD":INFO,
      "STATUS":true,
      "INFO":true,
      "SIGNAL":false,
      "ABC":0,
      "DEF":0,
      "GHK":0,
      "GHK":1111
    }

Can this be done? If yes, how? Thanks

alammd
  • Yes it can be done. The easiest way would be to apply the kv filter on the part after the > – baudsp Sep 09 '19 at 08:00

1 Answer


With the if [message] =~ />/ condition, the filters only apply to messages containing a >. The dissect filter splits the message at the >. The kv filter then applies a key-value transformation to the second part of the message, trimming away the [ and ] characters. Finally, mutate.remove_field removes the intermediate fields.

filter {
    if [message] =~ />/ {
        dissect {
            mapping => {
                "message" => "%{start_of_message}>%{content}"
            }
        }

        kv {
            source => "content"
            value_split => ":"
            field_split => ","
            trim_key => "\[\]"
            trim_value => "\[\]"
            target => "message"
        }

        mutate {
            remove_field => ["content","start_of_message"]
        }
    }
}

Result, using the provided log line:

{
  "@version": "1",
  "host": "YOUR_MACHINE_NAME",
  "message": {
    "DEF": "0",
    "TIMESTAMP": "Sat Sep 07 21:33:42 UTC 2019",
    "TEST": "123456",
    "CMD": "INFO",
    "SERIAL": "0006]\r",
    "GHK": "1111",
    "INFO": "true",
    "STATUS": "true",
    "ABC": "0",
    "SIGNAL": "false"
  },
  "@timestamp": "2019-09-10T09:21:16.422Z"
}

In addition to filtering with if [message] =~ />/, you can also do the comparison on the path field, which is set by the file input plugin. Also, if you have multiple file inputs, you can set the type field and use that instead, see https://stackoverflow.com/a/20562031/6113627.
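For example, a minimal sketch of restricting the filter to one file by matching on the path field (this assumes the events come from the file input plugin; if the logs are shipped by Filebeat, the path usually lives in a field such as [log][file][path] instead, so adjust the field name accordingly):

filter {
  # only run the dissect/kv/mutate filters from above
  # for events read from File.log
  if [path] =~ "File\.log$" {
    # ... dissect / kv / mutate filters go here ...
  }
}

With multiple file inputs, the same pattern works with a type set on each input and a condition such as if [type] == "device_log".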

baudsp
  • Hi Thanks, but i get this error in logstatsh Sep 10 14:03:00 kibana logstash[19858]: [2019-09-10T16:03:00,262][WARN ][org.logstash.dissect.Dissector] Dissector mapping, pattern not found {"field"=>"message", "pattern"=>"%{start_of_message}>%{content}", – alammd Sep 10 '19 at 14:04
  • It means that this message didn't contain a `>`. If you are also receiving messages that don't contains `>`, you'd have to deal with them, either dropping them or not parsing them with the filter I wrote above. – baudsp Sep 10 '19 at 14:52
  • Add those messages to your question, with what you want to do with them and I'll expand my answer. – baudsp Sep 10 '19 at 14:54
  • Thanks @baudsp, I am pushing different types of log files, and each log file has a different structure, which could be the reason. Is there any way to apply a filter only for a specific log file? – alammd Sep 10 '19 at 16:36
  • Yes. I'll update my answer with something to this effect. – baudsp Sep 11 '19 at 08:45
  • Hi, Sorry for late reply, please see this screen shot, this is the log i get into kibana https://imgshare.io/image/stackflow.I0u7S with your latest changes, kibana does not process anything, i gave you the screen shot to understand how we getting, Thanks for your help – alammd Sep 11 '19 at 12:42
  • Could you add to your question the configuration you're using and any logstash logs, please? I've taken a look to your screenshot, but I don't have any idea on what's the issue. – baudsp Sep 11 '19 at 14:01
  • thanks, yes, i have edited the my question with more info as you requested, also see the existing kibana format for raw log and json https://imgshare.io/image/stackflow.I0u7S https://imgshare.io/image/jsonlog.IHQhp – alammd Sep 11 '19 at 16:00
  • From the logstash configuration you've posted, you're not using the solution I provided. – baudsp Sep 11 '19 at 16:26
  • yes, when i put that, Logstash does not show any logs at all, shall i remove my config and just put your one ? – alammd Sep 11 '19 at 18:41
  • i edited the question with your filter, but with this logs does not enter into kibana – alammd Sep 11 '19 at 18:51