
I am using Logstash, which accepts data from a log file that contains different types of logs.

The first line is a custom log entry, whereas the second line is a log in JSON format.

Now I want to write a filter that parses the logs based on their content and directs all the JSON-format logs to a file called jsonformat.log and the other logs to a separate file.

rohansingh

1 Answer


You can leverage the json filter and check whether it failed in order to decide where to send the event.

input {
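   # read the log file; each new line becomes one event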
   file {
       path => "/Users/mysystem/Desktop/abc.log"
       start_position => beginning
       ignore_older => 0
   }
}
filter {
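   # try to parse the message field as JSON; if parsing fails,
   # the json filter tags the event with _jsonparsefailure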
   json {
     source => "message"
   }
}
output {
  # this condition will be true if the log line is not valid JSON
  if "_jsonparsefailure" in [tags] {
    file {
       path => "/Users/mysystem/Desktop/nonjson.log"
    }
  }
  # this condition will be true if the log line is valid JSON
  else {
    file {
       path => "/Users/mysystem/Desktop/jsonformat.log"
    }
  }
}
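
If the configuration validates but the output files never appear, the fix discussed in the comments below is to stop Logstash from remembering its read position, so the file is re-read from the beginning on every run. A minimal sketch of the file input with that change (same path assumption as above):

input {
   file {
       path => "/Users/mysystem/Desktop/abc.log"
       start_position => beginning
       ignore_older => 0
       # do not persist the read position between runs
       sincedb_path => "/dev/null"
   }
}

Also run Logstash without sudo (running it as root causes all kinds of problems when reading files), e.g. bin/logstash -f first-pipeline.conf, optionally with the --debug flag for more verbose output.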
Val
  • I don't know why, but even though your code passes the configuration test, when I run it on my input file it just shows that the pipeline has started and it doesn't create the log files. – rohansingh Aug 19 '16 at 04:32
  • You either need to add new lines to your `abc.log` file or add `sincedb_path => "/dev/null"` to your file input. – Val Aug 19 '16 at 04:41
  • Yes, I did that. And I even tried adding new lines, but it still doesn't make new files. I am posting my config file for you to have a look at. And also, the input file. (Mentioned as part of the edit in the question) – rohansingh Aug 19 '16 at 04:43
  • Can you rename your abc.log file to some other name? And maybe even move it to another folder – Val Aug 19 '16 at 04:44
  • Tried renaming, and moving it to a new folder altogether (yes, I changed the path in the config file), but it still doesn't work. Just shows that the pipeline has started. Also, can you have a look at the edit of the question? – rohansingh Aug 19 '16 at 04:48
  • Can you remove all `.sincedb*` files that you can find in your home folder (`~/.sincedb*`)? – Val Aug 19 '16 at 04:56
  • I did that, but it still doesn't work. I don't know what is the problem. I run logstash using the following command, `sudo bin/logstash -f first-pipeline.conf`. Is there a possibility I am doing something wrong here? – rohansingh Aug 19 '16 at 05:04
  • Oh, just remove sudo ;) – Val Aug 19 '16 at 05:14
  • I think there is some problem with my version of Logstash itself. Since I am relatively new to Logstash, can you guide me on where I can see the logs of Logstash itself? Maybe that might help me. – rohansingh Aug 19 '16 at 05:40
  • So it doesn't work without sudo either? The problem with sudo is that root runs Logstash, and that induces all kinds of problems when reading files. Logstash doesn't produce too many logs; however, you can start it with the `--debug` flag in order to produce some more debugging output that we can use. – Val Aug 19 '16 at 05:43
  • Hi. It finally worked. Thanks a lot. However, on execution it shows the following error: `Error parsing json {:source=>"message", :raw=>"tag: ghf ,message: bye, value: 56", :exception=>#, :level=>:warn}`. This is shown right after it says "pipeline main started". – rohansingh Aug 19 '16 at 05:55
  • Yes, that's normal because the json filter fails to parse non-JSON data. However, that's just a warning you see, but nothing that will halt Logstash. – Val Aug 19 '16 at 06:13
  • I am trying to solve this with the help of regular expressions, maybe the grok plugin. Can you guide me in the right direction for that as well? – rohansingh Aug 19 '16 at 08:27
  • alright. Can I mark your answer as accepted and then, can you edit your existing answer and post the way to do it using regex as well? It will be a great help. Thanks :) – rohansingh Aug 19 '16 at 08:34
  • Sure, go ahead and then you can ask another question in another thread and link it here – Val Aug 19 '16 at 08:34
  • thanks. I have done it. Please add the answer using regular expressions as well. – rohansingh Aug 19 '16 at 08:35
  • here is the link to the question: http://stackoverflow.com/questions/39034834/how-to-use-regex-for-config-files-in-this-use-case – rohansingh Aug 19 '16 at 08:51
  • can you please have a look at the attached question? Thanks! – rohansingh Aug 19 '16 at 09:04
  • Yes, though I'm still not clear why you don't like the above solution – Val Aug 19 '16 at 09:05
  • So I do, but I am still trying to solve it through regular expressions, since I am learning regular expressions with Logstash. I have tried solving the problem in the other link and wrote the grok match as follows: `match => { "message" => "%{WORD:tag} %{WORD:message} %{WORD:value}` but this doesn't work, and moreover, I am not clear on how to separate the output into two files using this. – rohansingh Aug 19 '16 at 09:13
  • You're trying to put squares into triangles :-) Since your log file contains different formats, grokking is not the best way out of it. You should learn regexp on another example in my opinion – Val Aug 19 '16 at 09:15
  • Oh! Alright. Then how can I solve that using regular expressions? I would want to learn about regular expressions in solving this as well. I can do it using another example, but, since most of the log files have mixed formats for my use case, I have tried to form a minimalistic example of the entire thing. It will be great if you can help me out with this, so that I can use that knowledge for more complex scenarios. – rohansingh Aug 19 '16 at 09:23
  • can you please have a look at it? – rohansingh Aug 19 '16 at 09:54
  • No worries, I'll look at it shortly – Val Aug 19 '16 at 09:56
  • please help me in solving the problem using regex though. A solution through grok as you pointed out correctly won't work, since grok doesn't work with json. Thanks :) – rohansingh Aug 19 '16 at 09:58
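
Not part of the accepted answer above, but since the thread asks for a regex/grok variant: a rough, hedged sketch of how the non-JSON lines could be matched with grok, using a pattern guessed from the sample line `tag: ghf ,message: bye, value: 56` shown in the comments. The field names log_tag, log_message and log_value are made up for illustration; routing then keys on the _grokparsefailure tag instead of _jsonparsefailure, and the JSON lines would still need the json filter if you want their fields parsed, which is exactly why Val considers the json filter the simpler route.

filter {
  grok {
    # matches only the custom (non-JSON) lines, e.g.: tag: ghf ,message: bye, value: 56
    match => { "message" => "tag: %{WORD:log_tag} ,message: %{WORD:log_message}, value: %{NUMBER:log_value}" }
  }
}
output {
  # lines that do NOT match the custom pattern (presumably the JSON ones)
  if "_grokparsefailure" in [tags] {
    file { path => "/Users/mysystem/Desktop/jsonformat.log" }
  }
  else {
    file { path => "/Users/mysystem/Desktop/nonjson.log" }
  }
}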