I'm a Linux and Bash noob, so apologies if the question is too basic or if some critical information is missing.
Basically, I am using the python-celery library (with a Redis broker) to process some massive JSON files chunk by chunk in parallel.
Right now, this is how I do it (let me know if there is a better way):
Terminal 1:
$ celery worker -A celery_worker -l info -c 64   # start the celery worker service, which listens for input
Terminal 2:
$ ./pythonScript.py arg1 arg2   # script that calls the service
After the task in Terminal 2 completes, I send a keyboard interrupt (Ctrl+C) to Terminal 1 and the worker shuts down.
But how do I achieve the whole thing by running a single bash script in a terminal, i.e. start the celery_worker service, run the Python script, and then shut down the worker?
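Something along these lines is what I have in mind (just a rough, untested sketch, assuming celery multi can start and stop a detached worker; the worker name worker1 and the pidfile/logfile locations are arbitrary choices of mine):

#!/usr/bin/env bash
# rough sketch: start a detached worker, run the script, then stop the worker
celery multi start worker1 -A celery_worker -l info -c 64 --pidfile=./%n.pid --logfile=./%n.log
./pythonScript.py arg1 arg2
celery multi stopwait worker1 --pidfile=./%n.pid

Is this the right approach, or is there a cleaner way to wait for the worker to be ready and to shut it down afterwards?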
As a dirty workaround, I could keep the worker running 24/7 and just use the Python script whenever I need the service. But I believe that would waste memory, and Celery workers have an annoying tendency to shut down on their own after a while.
I also need to add the whole procedure to a daily cron job.
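For reference, I imagine the cron entry would look roughly like this (assuming the wrapper script above is saved at /path/to/run_job.sh, which is a made-up path):

# hypothetical crontab entry: run the whole procedure daily at 2 AM
0 2 * * * /path/to/run_job.sh >> /path/to/run_job.log 2>&1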