
I need to know which tools for monitoring multiple log files best suit my needs.

What I need :

  • Send and monitor multiple log files at once
  • Support real-time viewing
  • A good graphical user interface whenever possible
  • Capable of searching or filtering the logs
  • Setup with minimum effort whenever possible

I have gathered some candidate tools:

  1. multitail, a simple viewer for multiple log files, but I would prefer a better graphical UI
  2. lnav, which is similar to multitail, but I don't know the pros and cons between the two
  3. FrontTail, which I think is much better than the previous two
  4. GrayLog, which I have used once; it has great searching and filtering features, but someone else set it up, so I'm not sure how complex the setup is
  5. LogStash, which I have never used, but many people say it's great; is it easy to set up?

The logs come from these sources:

  • gridpane.com log
  • nginx access log
  • nginx error log
  • PHP error log
  • MySQL query log
  • MySQL error log
Budianto IP
    You can use the ELK stack to monitor your log files. ELK is rich with plugins and they are very easy to use. It also supports monitoring multiple log files. Beats (a lightweight log shipper) in combination with the ELK stack becomes a very powerful and easy-to-use log monitoring setup. – Sourav May 17 '20 at 04:28
  • Ok, I'm gonna check it out, thanks – Budianto IP May 17 '20 at 05:32
  • For the source logs you mentioned: there are lots of modules available in Beats to monitor them. For example, to monitor nginx logs, you can use the Metricbeat ‘nginx’ module to easily create a monitoring dashboard without any hassle. – Sourav May 17 '20 at 07:33
  • I think the answer also depends on the need for parsing: do you just need to see the raw logs as text, or do you need the tool to parse and understand each log entry field? The latter would help to perform more efficient search/filtering (each tool function could then be performed on a specific field, like "Thread", "Message", "Date", ...). The first 3 tools you mentioned don't do that, so they are easier to set up but less powerful. The last 2 do parse the logs, so they are a bit more complex to set up, but more powerful. I would also add [LogMX](https://logmx.com) to the latter group of parsing tools. – xav May 17 '20 at 12:53
  • Thank you xav, but I have decided to follow what sourav19 suggested; in fact, I have finished setting it up. It took me about 8-10 hours, but overall I'm satisfied. Credits to sourav19, thanks man, you saved my day! – Budianto IP May 17 '20 at 16:11

2 Answers


If you are looking to occasionally scan multiple log files on a single host, then lnav should work well.

  • Send and monitor multiple log files at once

lnav processes the log files on the host directly; it is not a service that consumes the files. It can also monitor multiple logs at once and collate the messages into a single view. (Note: a comment by @xav above says that lnav does not parse the files; this is incorrect, and I don't know where they got that impression.)

  • Support real-time viewing

lnav works in a "live" fashion for most operations. It is always polling files for new data. For example, if there is a search active, any new data will be searched as it is loaded.

  • Better graphical User Interface whenever possible

lnav has a text-based user interface, but it is still quite friendly. There is online help, tab completion, syntax highlighting, previews for commands, and so on.

  • Capable of searching or filtering the logs

Logs can be searched and filtered using Perl-compatible regular expressions (PCRE), by time, and/or by log level. The filters are automatically applied as new data is loaded. The next release, v0.9.1, will feature filtering using a SQL expression.
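To give a feel for what an in/out filter pair does, here is a rough shell sketch; the sample log lines are invented, and a "filter in ERROR" plus "filter out timeout" combination behaves roughly like:

```shell
# Hypothetical sample lines; keep ERROR lines, then drop ones mentioning "timeout"
printf '%s\n' \
  '2020-05-17 10:00:01 ERROR db connection timeout' \
  '2020-05-17 10:00:02 ERROR bad query' \
  '2020-05-17 10:00:03 INFO request ok' \
  | grep -E 'ERROR' \
  | grep -Ev 'timeout'
# keeps only: 2020-05-17 10:00:02 ERROR bad query
```

The difference is that lnav re-applies these filters live as new lines arrive, instead of once over a static stream.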

You can also use SQLite SQL to analyze log messages. For example, if you wanted to find the URIs with the largest bytes transferred from an Apache web access log, you can execute:

SELECT cs_uri_stem, max(sc_bytes) AS max_bytes
  FROM access_log
  GROUP BY cs_uri_stem
  ORDER BY max_bytes DESC
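Outside lnav, the same "max bytes per URI" aggregation can be sketched with plain awk over a two-column (URI, bytes) extract; the sample data below is invented for illustration:

```shell
# uri bytes -> keep the largest bytes value seen per uri, sort descending
printf '%s\n' '/index.html 5120' '/app.js 20480' '/index.html 7340' \
  | awk '$2 > max[$1] { max[$1] = $2 } END { for (u in max) print u, max[u] }' \
  | sort -k2 -rn
# /app.js 20480
# /index.html 7340
```

The SQL version is more convenient because lnav has already parsed the access log into named columns for you.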
  • Setup with minimum effort whenever possible

lnav has several log formats built in and will auto-detect the correct format for a given log file. If a format is not supported, you can write a JSON file that describes the format. Using lnav is almost always just a matter of running:

$ lnav /path/to/logs/
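For a source lnav does not auto-detect (the gridpane.com log might be one), a custom format file is a small JSON document. The sketch below is illustrative only: the format name, pattern, and field names are invented, and the exact schema should be checked against the lnav documentation:

```json
{
  "gridpane_log": {
    "title": "GridPane log (hypothetical)",
    "regex": {
      "std": {
        "pattern": "^(?<timestamp>\\d{4}-\\d{2}-\\d{2} \\d{2}:\\d{2}:\\d{2}) (?<level>\\w+) (?<body>.*)$"
      }
    },
    "level-field": "level",
    "level": {
      "error": "ERROR",
      "warning": "WARN",
      "info": "INFO"
    }
  }
}
```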

I finally found the one that suits my needs.

I'm sharing this in case anyone wants to use the same solution.

Thanks to sourav19, whose advice I followed. It took me 8-10 hours to install and configure everything, but it's exactly what I wanted.

I had to buy a Digital Ocean droplet; it cost me $20 to get 4 GB of RAM, but I think that's much cheaper than buying other log monitoring applications, which are way too expensive.

Before installing Docker, we have to enable a Virtual Private Cloud (VPC) by following this article. We will use the provided IP address for our Docker containers so they can communicate with each other.

I used a dockerized ELK; the link is here.

All we need to do is clone the dockerized ELK repository to our server, go inside the cloned folder, build the image from the Dockerfile, and run the container:

docker run -p 5601:5601 -p 9200:9200 -p 5044:5044 \
  -v /var/log:/var/lib/elasticsearch --name elk sebp/elk

Then open Kibana in a browser at http://your_site:5601

After that, install Filebeat on the other server that has the log files you want to monitor. Filebeat will ship the logs to the ELK stack, and Kibana is where you view them. Follow these instructions, and then configure it here.
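As a sketch of the Filebeat side: the paths and host below are placeholders, and I'm assuming the sebp/elk defaults, where Logstash listens on port 5044:

```yaml
# filebeat.yml - ship nginx/PHP/MySQL logs to the ELK container
filebeat.inputs:
  - type: log
    paths:
      - /var/log/nginx/access.log
      - /var/log/nginx/error.log
      - /var/log/php_errors.log
      - /var/log/mysql/error.log

output.logstash:
  hosts: ["your_elk_host:5044"]
```

Filebeat also ships with modules for common sources (e.g. `filebeat modules enable nginx mysql`), which set up parsing and dashboards for those logs.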

If everything is okay, we will see the logs in Kibana.

Budianto IP