I have to create a bash script that checks whether other copies of the same script are already executing. To do that I implemented this solution:

scriptToVerify="sl_dynamic_procedure.sh_${1}";
LOCKFILE=${SL_ROOT_FOLDER}/work/$scriptToVerify
if [ -e ${LOCKFILE} ] && kill -0 `cat ${LOCKFILE}`; then
  sl_log "---------------------------Warning---------------------------"
  sl_log "$scriptToVerify already in execution"
  exit
fi
trap "rm -f ${LOCKFILE}; exit" INT TERM EXIT
echo $$ > ${LOCKFILE}

I added ${1} because my script takes a parameter. If I execute the script without the parameter (without ${1}) it works correctly. If I execute the script more than once with the parameter, sometimes it works and sometimes it doesn't. How can I fix my code?

  • Your code has a race. For alternative solutions, see: https://stackoverflow.com/questions/185451/quick-and-dirty-way-to-ensure-only-one-instance-of-a-shell-script-is-running-at (but don't use the first answer - it is wrong) – jhnc Mar 04 '19 at 18:01

1 Answer


First, do you want to allow the script to run even when another copy is already running, as long as the two have different arguments? Without knowing what the script does and what the argument is, I can't tell whether that's sensible, but in general it looks like you're buying trouble.

Second, using a lockfile is common, but subject to a race condition: the test for the lock and the creation of the lock are two separate steps, and another copy of the script can slip in between them. Much better to make the creation of the lock and the test for it a single atomic action. That is hard to do with a plain file, but really easy with a directory.

myLock=/tmp/ThisIsMySingleLockDirectoryName
lockMe()   { mkdir "$myLock" 2>&-; }
unlockMe() { rmdir "$myLock" 2>&-; }

if lockMe
then : do stuff
     unlockMe
else echo "Can't get a lock."
     exit 1
fi

This is simplistic, as it throws away the stderr, and doesn't test reasons... but you get the idea.

The point is that the creation of the directory returns an error if it already exists.
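Putting the pieces together with a per-argument lock name and the trap cleanup from the question, a minimal sketch might look like this (the /tmp path and the echoed messages are illustrative assumptions, not taken from the asker's actual script):

```shell
# Hypothetical sketch: one lock directory per script argument.
# The /tmp path and the messages are illustrative, not the asker's real names.
myLock="/tmp/sl_dynamic_procedure.lock_${1:-default}"

lockMe()   { mkdir "$myLock" 2>/dev/null; }   # atomic: fails if the directory already exists
unlockMe() { rmdir "$myLock" 2>/dev/null; }

if lockMe
then
    # Remove the lock however the script terminates.
    trap 'unlockMe' EXIT INT TERM
    echo "working with argument '${1:-default}'"
    # ... real work would go here ...
else
    echo "Can't get a lock." >&2
    exit 1
fi
```

Because mkdir both tests and creates in a single system call, two copies started at the same instant cannot both succeed; the loser takes the else branch.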

Paul Hodges