
I'm monitoring a log file. Each line has the following format:

2012    5       29      14      20              438.815 872.737 -1.89976       -0.55156     8.68749 -0.497848       -0.54559                0       0       6      00       0       0       0               0       0       0       0       0      80       9               0       0       10      0       0       0       8      00       9       0       0       0       0       0       0               2      41       84      0       0       0       1       0

As you can see, each value is delimited by a tab.

How can I write a Perl script to take each new line of data (the log file is updated every ten minutes) and insert this data into a MySQL database?

I'd like to do as much of this as possible on the command line.

If I do tail -f -n 1 ./Eamorr.out > myPerlScript.pl, will my perl script get data each time the file is appended to?

Many thanks in advance,

Eamorr
  • Does it have to be perl? You could also do this using just a shell script, along with your database command line client (mysql, psql, etc). – ghoti May 29 '12 at 16:24

2 Answers


One approach, in Perl:

#!/usr/bin/perl
use strict;
use warnings;

$|++;    # unbuffer output

# Read from a tail process; three-arg open with a lexical filehandle,
# and die if the pipe cannot be started.
open my $fh, '-|', 'tail -f /var/log/syslog'
    or die "Cannot start tail: $!";

while (<$fh>) { chomp; print; }

Without Perl, in pure shell:

tail -f /var/log/syslog |
    while IFS= read -r a; do    # -r and empty IFS keep the line intact
        echo "INSERT INTO FOOBAR VALUES($(
            sed $'s/\t/\',\'/g; s/^/\'/; s/$/\'/' <<< "$a")
        );"
    done

Note the substitution targets tabs (`\t`), since your fields are tab-delimited, not space-delimited.
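A self-contained sketch of just the line-to-INSERT conversion, runnable on a shortened sample line (the table name `logdata` is a placeholder; adjust it to your schema):

```shell
# Build an INSERT statement from one tab-delimited line.
# "logdata" is a placeholder table name -- adjust to your schema.
line=$'2012\t5\t29\t14\t20'    # shortened sample line

# Replace each tab with ',' and wrap the whole list in single quotes:
values=$(printf '%s\n' "$line" | sed $'s/\t/\',\'/g; s/^/\'/; s/$/\'/')
printf "INSERT INTO logdata VALUES(%s);\n" "$values"
# prints: INSERT INTO logdata VALUES('2012','5','29','14','20');
```

The resulting statement can then be piped straight to the `mysql` command-line client. Be aware this quoting is not injection-safe; for anything beyond trusted log data, prefer placeholders via DBI.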
Gilles Quénot
  • I recommend using `tail -F` instead of `-f`, just in case your log file gets rotated... – ghoti May 29 '12 at 16:23

If this is the approach you want to take, you need a pipeline, like:

tail -f -n 1 ./Eamorr.out | myPerlScript.pl

where myPerlScript.pl reads the incoming lines like:

while (<>) {
    chomp;
    print "Handling: $_\n";
}
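To complete the picture, the script could insert each incoming line into MySQL with DBI, using placeholders so the driver handles quoting. This is a sketch only: the database name, credentials, and the table name `logdata` are placeholders you would replace with your own.

```perl
#!/usr/bin/perl
# Sketch: read tab-delimited lines on stdin and insert them with DBI.
# Database name, credentials, and the table "logdata" are placeholders.
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect('DBI:mysql:database=mydb', 'user', 'password',
                       { RaiseError => 1 });

while (<>) {
    chomp;
    my @fields = split /\t/;    # the log is tab-delimited
    my $placeholders = join ',', ('?') x @fields;
    # Placeholders let the driver quote each value safely:
    $dbh->do("INSERT INTO logdata VALUES($placeholders)", undef, @fields);
}
```

Run it the same way: `tail -F -n 1 ./Eamorr.out | myPerlScript.pl`.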

JRFerguson