
I'm a Linux and Bash noob, so sorry if the question is too basic or some critical information is missing.

Basically, I am using the python-celery library (with a Redis broker) to process some massive JSON files chunk by chunk, in parallel.

Right now, the way I achieve this is as follows (let me know if there is a better way):

Terminal 1

$ celery worker -A celery_worker -l info -c 64  # start the celery worker service, which listens for input

Terminal 2

$ ./pythonScript.py arg1 arg2  # script that calls the service

After the task in Terminal 2 completes, I send a keyboard interrupt to Terminal 1 with Ctrl+C and the worker shuts down.

But how do I achieve the whole thing by running a single bash script in a terminal, i.e. start the celery_worker service, run the Python script, and then shut down the worker?

As a dirty workaround, I could keep the worker running 24/7 without ever shutting it down, and just run the Python script whenever I need the service. But I believe that would waste memory, and Celery workers have an annoying tendency to shut down on their own after a while.

I also need to add the whole procedure to a daily cron job.


1 Answer


If I understand you right, Terminal 2 depends on Terminal 1. What you can do is write a Bash script that first checks whether the celery worker is running. If the worker is not running, start it as a background process and continue to the next step. If it is already running, proceed to the next step directly.

That next step consists of running the Python script.

After that is done, find the worker process, get its PID (process ID), and kill it. This is not the cleanest way, but it corresponds to what you are doing right now with Ctrl+C.
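A minimal sketch of such a script, assuming the celery_worker module and pythonScript.py from your question, that the script is run from the directory containing them, and that no other Celery workers run on the machine (the pgrep/pkill pattern matching below is crude):

#!/bin/bash

# Start the worker in the background only if one is not already running.
if ! pgrep -f "celery worker" > /dev/null; then
    celery worker -A celery_worker -l info -c 64 &
    sleep 10  # arbitrary wait to let the worker connect to Redis and come online
fi

# Run the Python script that submits the tasks.
./pythonScript.py arg1 arg2

# Find the worker process and terminate it; SIGTERM asks Celery for a warm
# shutdown, i.e. it finishes its current tasks before exiting.
pkill -TERM -f "celery worker"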

For setting up cron jobs, check out this page.
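For example, assuming the wrapper script above is saved as /home/della/run_job.sh (a hypothetical path) and made executable, a crontab entry running it daily at 02:00 could look like this:

0 2 * * * /home/della/run_job.sh >> /home/della/run_job.log 2>&1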

  • Thanks, but what is a clean way to shut down the celery worker without a keyboard interrupt? I know killing sounds like an ungraceful exit, but how do I leave a clean slate when the process is waiting on task queues? – Della Jan 26 '19 at 09:11
  • The clean way to do it would be to find out whether the celery worker exposes some exit command, and use that. This looks promising: http://docs.celeryproject.org/en/latest/userguide/workers.html#stopping-the-worker There is actually nothing wrong with Ctrl+C if the celery worker is built properly. Problems can occur when the worker is in the middle of doing something and "just leaves its work undone", hence keeps resources open or half processed. Judging by your use case, I believe this is not a problem for you. – Socrates Jan 26 '19 at 18:41
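For reference, the remote shutdown described on that docs page looks roughly like this (a sketch; the -A module name is taken from the question, and the command asks every worker connected to the broker to shut down warmly):

$ celery -A celery_worker control shutdown  # broadcast a warm shutdown to all workers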