I have a Perl script that
- queries a database for a list of files to process
- processes the files
- and then exits
Upon startup, the script creates a lock file (say, script.lock), and upon exit it removes this file. A crontab entry runs the script every minute; if the lock file already exists, the new instance exits, assuming that another instance of the script is still running.
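The locking logic looks roughly like this (a simplified sketch; the path and the surrounding details are placeholders, not my actual code):

```perl
#!/usr/bin/perl
use strict;
use warnings;

my $lockfile = '/tmp/script.lock';   # hypothetical path

# If the lock file exists, assume another instance is running.
if (-e $lockfile) {
    exit 0;
}

# Create the lock file.
open my $fh, '>', $lockfile or die "Cannot create $lockfile: $!";
close $fh;

# Remove it on exit -- but only if the script gets a chance to exit
# normally; a crash or a kill -9 leaves the file behind.
END { unlink $lockfile }

# ... query the database and process the files here ...
```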
The above works fine, but I am not happy with the robustness of this approach. Specifically, if the script exits prematurely (for example, it crashes or is killed) the lock file is never removed, and no new instance will run until the stale lock file is deleted by hand.
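One alternative I have been considering (an untested sketch, with a hypothetical lock file path) is to take an exclusive flock() on the lock file instead of testing for its existence. The kernel releases the lock automatically when the process exits, even if it crashes, so a leftover file can never block future runs:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Fcntl qw(:flock);

# Open (or create) the lock file and try to take an exclusive,
# non-blocking lock on it.
open my $lock_fh, '>', '/tmp/script.lock'   # hypothetical path
    or die "Cannot open lock file: $!";

unless (flock $lock_fh, LOCK_EX | LOCK_NB) {
    # Another instance holds the lock; just exit quietly.
    exit 0;
}

# ... query the database and process the files here ...

# The lock is released automatically when $lock_fh is closed or
# when the process exits for any reason, so there is no stale
# lock file to clean up.
```

With this scheme the file itself can be left lying around; only the lock on it matters. Would this be considered the idiomatic fix, or is there a better pattern?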
I would appreciate some advice on the following:
- Is using the lock file a good approach or is there a better/more robust way to do this?
- Is using crontab for this a good idea, or would I be better off writing an endless loop with sleep()?
- Should I use the GNU 'daemon' program or the Perl Proc::Daemon module (or some other equivalent) for this?