I have a main file that launches multiple processes and one of the processes again launches multiple processes. I am having problems launching the nested set of processes.
I have the following code in one file:
# parallel_test.py
import Queue
import multiprocessing
import time
import threading
def worker(q):
    while not q.empty():
        try:
            row = q.get(False)
            print row
            time.sleep(1)
        except Queue.Empty:
            break

def main():
    print 'creating queue'
    q = multiprocessing.Queue()
    print 'enqueuing'
    for i in range(100):
        q.put(i)

    num_processes = 15
    pool = []
    for i in range(num_processes):
        print 'launching process {0}'.format(i)
        p = multiprocessing.Process(target=worker, args=(q,))
        p.start()
        pool.append(p)

    for p in pool:
        p.join()

if __name__ == '__main__':
    main()
Running this file on its own with python parallel_test.py
works fine and prints the numbers as expected. But launching it as a Process from another file causes a problem. My main file:
# main_loop_test.py
import parallel_test
from multiprocessing import Pool
import time
def main():
    targets = [parallel_test.main]
    running = True
    while running:
        try:
            p = Pool(12)
            for target in targets:
                p.apply_async(target)
            p.close()  # For some reason you need to run close() before join()
            p.join()   # Wait for all the steps to be done
            print 'All steps done'
            time.sleep(2)
        except KeyboardInterrupt as e:
            print "<<<<<<<<<<<<<<<<<<CAUGHT KEYBOARD INTERRUPT FROM USER>>>>>>>>>>>>>>>>>>>"
            running = False

if __name__ == '__main__':
    main()
parallel_test.py
seems to launch one process (which does nothing), then the function returns and main_loop_test.py
prints 'All steps done'. No numbers are ever printed. Output:
creating queue
enqueuing
launching process 0
All steps done
creating queue
enqueuing
launching process 0
All steps done
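To see what is different about the process that parallel_test.main ends up running in, one thing I can probe is the daemon flag of the Pool workers (a sketch in Python 3 syntax; report_daemon is a helper name I made up for this check):

```python
# Sketch: probe whether Pool workers are daemonic. report_daemon is
# a made-up helper; it returns the name and daemon flag of whichever
# process executes it.
import multiprocessing

def report_daemon():
    proc = multiprocessing.current_process()
    return proc.name, proc.daemon

if __name__ == '__main__':
    pool = multiprocessing.Pool(1)
    # apply() runs the function inside a pool worker process
    name, daemon = pool.apply(report_daemon)
    print('pool worker {0}: daemon = {1}'.format(name, daemon))
    pool.close()
    pool.join()
    print('main process: daemon =', multiprocessing.current_process().daemon)
```

If the worker reports daemon = True, that would explain the silent failure, since daemonic processes are not allowed to create child processes.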
What's going wrong? I get the same problem when I use a Pool
in parallel_test.py
instead of managing the processes myself. Replacing multiprocessing with threading works, though.