
We have a custom management command in a Django app that synchronizes data with an external service.

The command is started hourly.

Usually, the command finishes within half an hour or less, but recently we ran into a situation where the process took several hours. In the meantime, the command was started several more times in the background, causing inconsistent access to the models (our code was not designed for concurrent runs).

Is it possible to prevent Django from running the command if it is already running?

One way I can think of to solve this problem is to use a file as a mutex for the command.

But this does not seem very elegant to me, and it could cause extra trouble if the command gets interrupted and the file is not cleaned up properly.

What is the best way to approach this problem? Is there a pythonic / django-ish way to do this?
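For reference, the file-mutex idea can avoid the stale-file problem entirely by using an OS-level advisory lock instead of the file's mere existence: on Unix, `fcntl.flock` is released by the kernel when the process exits, even after a crash. A minimal sketch, assuming a Unix-like host (the path and function name are my own, not from any Django API):

```python
import fcntl

LOCK_PATH = "/tmp/sync_command.lock"  # hypothetical lock-file location

def acquire_single_instance_lock(path=LOCK_PATH):
    """Return an open, locked file handle, or None if another instance holds the lock.

    The returned handle must stay open for the lifetime of the process;
    the OS releases the lock automatically when the process exits (even
    on a crash), so a stale lock file can never block future runs.
    """
    lock_file = open(path, "w")
    try:
        # LOCK_NB makes the call fail immediately instead of blocking.
        fcntl.flock(lock_file, fcntl.LOCK_EX | fcntl.LOCK_NB)
    except BlockingIOError:
        lock_file.close()
        return None
    return lock_file
```

In the command's `handle()`, one would call this at the top and exit early if it returns `None`, keeping the handle alive until the sync is done.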

pusteblume
  • A file mutex typically holds the process id (PID) of the process that created it, so that the next process can inspect the file and, if no process with that PID exists, "ignore" the mutex (well, it creates a new one with its own PID) – Willem Van Onsem Mar 02 '19 at 20:16
  • @WillemVanOnsem thank you, I did not know that. With that, I feel safe using a file lock; but I leave the question open in case there is some other pythonic / django-ish way for this problem. – pusteblume Mar 04 '19 at 09:23
  • The answer is here: https://stackoverflow.com/questions/788411/check-to-see-if-python-script-is-running – kolypto Jan 22 '20 at 09:02
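The PID-file variant described in the comment above could be sketched as follows (Unix-only; note the check-then-write sequence is not atomic, so this is an illustration of the idea rather than a race-free implementation, and all names are hypothetical):

```python
import os

PID_FILE = "/tmp/sync_command.pid"  # hypothetical location

def pid_is_running(pid):
    """Check whether a process with the given PID exists (Unix)."""
    try:
        os.kill(pid, 0)  # signal 0 performs error checking only
    except ProcessLookupError:
        return False
    except PermissionError:
        return True  # process exists but belongs to another user
    return True

def try_acquire_pid_lock(path=PID_FILE):
    """Create a PID file, treating a file left by a dead process as stale.

    Returns True if the lock was acquired, False if another live
    process still holds it.
    """
    if os.path.exists(path):
        try:
            with open(path) as f:
                pid = int(f.read().strip())
        except ValueError:
            pid = None  # corrupt file: treat as stale
        if pid is not None and pid_is_running(pid):
            return False
    with open(path, "w") as f:
        f.write(str(os.getpid()))
    return True
```

Unlike the `flock` approach, this survives reboots and works across filesystems, at the cost of the non-atomic check.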

0 Answers