I'm developing a system where I'll need to execute many Python scripts at the same time. I'm using this code to do it:
import os
from multiprocessing import Pool

processes = ('grabber.py', 'mailer.py', 'updater.py')

def run_process(process):
    # each worker runs one script as a separate child process
    os.system('python {}'.format(process))

pool = Pool(processes=3)
pool.map(run_process, processes)
This code works and solves my problem, but the script 'updater.py' is responsible for updating the other scripts, 'grabber.py' and 'mailer.py', by performing these steps:
1 - Try to download the new scripts (grabber and mailer) with updates from a remote server
2 - Make a backup of the current scripts
3 - Stop grabber.py and mailer.py
4 - Replace the scripts
5 - Re-execute them
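To make steps 2 and 4 concrete, the backup/replace part of updater.py looks roughly like this (a simplified sketch; new_versions is just a placeholder for wherever the downloaded files end up, and step 3 is the part I'm asking about):

import shutil

scripts = ('grabber.py', 'mailer.py')

def backup_and_replace(new_versions):
    # new_versions maps script name -> path of the freshly downloaded copy
    for name in scripts:
        shutil.copy(name, name + '.bak')       # 2 - back up the current script
    # 3 - stop grabber.py and mailer.py here (this is what I don't know how to do)
    for name in scripts:
        shutil.move(new_versions[name], name)  # 4 - replace with the new version
    # 5 - re-execute them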
But, thinking about my scenario: since I'm using a Pool of processes to run my scripts, what is the best way to signal the pool processes about this update process and stop mailer.py and grabber.py?
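One idea I had was to drop os.system and launch the scripts with subprocess.Popen instead, keeping the handles so they can be terminated and restarted, roughly like this (just a sketch, assuming grabber.py and mailer.py are in the working directory):

import subprocess

scripts = ('grabber.py', 'mailer.py')

# launch each script as a child process and keep its Popen handle
procs = {name: subprocess.Popen(['python', name]) for name in scripts}

# ... later, when the update happens, stop them ...
for name, proc in procs.items():
    proc.terminate()   # SIGTERM on Unix, TerminateProcess on Windows
    proc.wait()        # block until the script has exited

# after replacing the files, start them again
procs = {name: subprocess.Popen(['python', name]) for name in scripts}

But with that approach the Pool doesn't really do anything anymore, and the Popen handles would live in the parent script rather than in updater.py, which is part of what confuses me.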
Thank you!