
I am not an expert in scripting and still learning, but I want to create a script in Linux that monitors a log file: for every new line written to the log, it should search for a keyword and, if it matches, execute a given command and then continue monitoring the log file. I wrote a script for this kind of behaviour (sort of), but it greps the log from the start again after the condition is met. I do not want to start from the top on the next iteration; I want the script to continue from the last matched position. Here is the script I am using:

#!/bin/bash

while true ; do
    grep -q "$1" /path/to/log_file.log
    if [[ $? == 0 ]]; then
        : # run my command here
    else
        printf .
        sleep 1
    fi
done

Any help is appreciated. Thanks.

  • Hideously inefficient. You should use `tail -f` to "follow" the end of the log file. As is, you'd be grepping the ENTIRE log file every time your infinite `while` loop iterates. With the tail option, you'd only ever be scanning newly appended lines instead of the whole file (a sketch follows these comments). – Marc B May 12 '14 at 17:37
  • True, the script is insufficient. I tried with `tail -f logfile.log | grep -q "$1"` but it does nothing. – user3629496 May 12 '14 at 17:58
  • Perhaps this can help: [how-to-grep-a-continuous-stream](http://stackoverflow.com/questions/7161821/how-to-grep-a-continuous-stream). For complex processing, I have used perl's [File::Tail](http://search.cpan.org/~mgrabnar/File-Tail-0.99.3/Tail.pm) in the past. Other languages may have similar libraries. – Vivek May 12 '14 at 18:43
  • What are your other requirements? What about dependencies? Python/Perl OK? Maybe you should look at one of the apps used to monitor logfiles like Logwatch and see if they can be used out of the box? – wojciii May 13 '14 at 09:22
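
Regarding the `tail -f logfile.log | grep -q "$1"` attempt above: `-q` suppresses all output and makes grep exit as soon as the first match arrives, so nothing visible happens. A minimal sketch of the continuous-stream variant the linked question points to, assuming GNU grep (for `--line-buffered`) and the log path from the question:

tail -f /path/to/log_file.log | grep --line-buffered "$1"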

2 Answers


The right way to do this in bash:

grep "$1" /path/to/log_file.log | while read ; do
    # do whatever you need with $REPLY, for example
    echo "found match: ${REPLY}"
done
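
If the loop should keep running as new lines are appended (rather than doing a one-shot grep of the file), the same `while read` body can be fed by `tail -f`; a sketch built on the answer above, assuming a plain substring match on the keyword is enough:

tail -f /path/to/log_file.log | while read -r ; do
    case "${REPLY}" in
        *"$1"*)
            # the new line contains the keyword; run your command here
            echo "found match: ${REPLY}" ;;
    esac
done
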
Grapsus
#!/bin/sh

# follow the log and act on every new line that contains the keyword
tail -f /path/to/log_file.log | grep --line-buffered "$1" | while read -r output ; do
    # your execution command goes here, using $output
    echo "$output"
done
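
If the script is saved as, say, watch_log.sh (a hypothetical name) and made executable, it could be started with the keyword as its first argument:

chmod +x watch_log.sh
./watch_log.sh "ERROR"   # "ERROR" is just an example keyword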

I think this is the easiest way: with the -f option, tail starts from the last ten lines of the log file and follows new output, instead of parsing the complete file. You can also put the script in cron and redirect its output to a file, so you can trace warning and critical states from that log historically.

For example, this cron entry will run the script every day at 2:30 and write its output to a file named with the current date and the hostname of the system (note that `%` has to be escaped in a crontab, and a space-free date format such as `%F` keeps the redirect target a single word):

30 2 * * * /path/to/scripts.sh > /home/user/$(date +\%F)_$(hostname)_log
klerk