
I have a script that writes to a file and then dumps that file to a database. I need this task to run as frequently as possible, but never with more than one instance running at the same time (or it would be writing redundant data to the same file).

Currently, at the start of the shell script I check whether a lock file exists, and if it does, I exit. At the end of the script, it deletes the file.
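A minimal sketch of the approach described above (the lock file path `/tmp/myscript.lock` is just an example):

```shell
#!/bin/sh
# Hypothetical sketch of the plain lock-file scheme.
LOCKFILE=/tmp/myscript.lock   # example path; adjust to your setup

if [ -e "$LOCKFILE" ]; then
    # Another instance appears to be running -- or, after a reboot,
    # a stale lock file was left behind and we exit forever.
    exit 1
fi

touch "$LOCKFILE"

# ... write to the file and dump it to the database ...

rm -f "$LOCKFILE"
```

This is exactly the scheme with the reboot weakness: nothing distinguishes a live lock from one orphaned by a crash or restart.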

This works 95% of the time. However, if the server is restarted (which happens semi-frequently), the file that was being written to will remain, and every time the script is called after that it will exit because the file already exists.

What would be a good way around this problem?

David542
  • What about scheduling a lock-file cleanup in crontab to run at every reboot? That is the `@reboot` condition – fedorqui Apr 12 '13 at 20:53
  • see [this question](http://stackoverflow.com/questions/185451/quick-and-dirty-way-to-ensure-only-one-instance-of-a-shell-script-is-running-at) which is pretty much your condition in a nutshell – Anya Shenanigans Apr 12 '13 at 22:17
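The question linked in the comment above usually points toward `flock(1)`, which avoids the stale-lock problem entirely: the kernel releases the lock when the process exits, and a lock never survives a reboot. A minimal sketch (the lock path `/tmp/myscript.lock` is just an example):

```shell
#!/bin/sh
# Open a file descriptor on the lock file and ask the kernel for an
# exclusive lock; -n makes flock fail immediately instead of blocking.
exec 9>/tmp/myscript.lock
if ! flock -n 9; then
    echo "another instance is already running" >&2
    exit 1
fi

# ... write to the file and dump it to the database ...

# No cleanup needed: the lock is dropped when the script exits,
# and it cannot persist across a reboot.
```

The lock file itself may remain on disk, but that is harmless; only the kernel-held lock matters.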

1 Answer


You could check whether any processes are using the file with `fuser`. It will return the PID of any process that has the file open. If there are no PIDs, it is safe to wipe the file and start again.
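A sketch of that check, assuming a lock file at `/tmp/myscript.lock` (`fuser` exits with status 0 only when some process has the file open):

```shell
#!/bin/sh
# Hypothetical fuser-based recovery from a stale lock file.
LOCKFILE=/tmp/myscript.lock   # example path; adjust to your setup

if [ -e "$LOCKFILE" ]; then
    if fuser "$LOCKFILE" >/dev/null 2>&1; then
        # Some process still has the file open: a live instance is running.
        exit 1
    fi
    # No process is using it: the lock is stale (e.g. left by a reboot).
    rm -f "$LOCKFILE"
fi

touch "$LOCKFILE"

# ... write to the file and dump it to the database ...

rm -f "$LOCKFILE"
```

Note this only works if the running instance actually keeps the lock file open (e.g. by redirecting output to it); a file that was merely `touch`ed is not "in use" as far as `fuser` is concerned.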

Tony
  • Thank you, what would be the `grep` or `fuser` command to check if the file "example.txt" is being used by user "root" ? – David542 Apr 12 '13 at 21:06
  • `fuser example.txt` – though it's always best to include the full path to the file – Tony Apr 12 '13 at 21:26