
I'm currently building a logging system using the ELK Stack. Before building it, I already had a custom log format for my apps so that it can be easily read by humans. My logs are formatted something like this:

Method: POST
URL: https://localhost:8888/api
Body: {
   "field1":"value1",
   "field2":[
       {
            "field3":"value2",
            "field4":"value3"
       },
       {
            "field3":"value2",
            "field4":"value3"
        }
    ]
}

Using a grok pattern I can get the Method and the URL, but how can I get the full JSON body in grok / Logstash so that I can send it to Elasticsearch? The length of the JSON is not fixed and can be longer or shorter in each log entry.

Thank you

hphp

1 Answer


You can use the JSON Filter. It should parse the JSON for you and put it into a structured format, so you can then send it wherever you need (e.g. Elasticsearch, another pipeline).

From the docs

It takes an existing field which contains JSON and expands it into an actual data structure within the Logstash event.
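As a minimal sketch (the file path, index name, and field names below are assumptions for illustration): read the multi-line entries with the multiline codec so each request becomes a single event, capture the raw body with grok, then hand the captured string to the json filter.

input {
  file {
    # hypothetical path - point this at your application's log files
    path => "/var/log/myapp/app.log"
    # join every line that does NOT start with "Method:" onto the previous
    # line, so Method/URL/Body end up in one event
    codec => multiline {
      pattern => "^Method:"
      negate => true
      what => "previous"
    }
  }
}

filter {
  # (?m) lets GREEDYDATA match across newlines, so the whole JSON body is captured
  grok {
    match => { "message" => "(?m)Method: %{WORD:method}\nURL: %{URI:url}\nBody: %{GREEDYDATA:body}" }
  }
  # expand the captured JSON string into structured fields under "body"
  json {
    source => "body"
    target => "body"
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]  # adjust to your cluster
    index => "app-logs"
  }
}

If the logs arrive via Filebeat rather than the file input, the same multiline grouping can be configured on the Beats side instead; the grok and json filters stay the same.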

There are also some other questions here on SO that could be helpful. An example: Using JSON with LogStash

Adam