Have a look at this problem, where I want `func2` to run in the background. In `func2` I spawn multiple processes, and each process outputs some data that is collected at the end. I do not call `func2` directly; instead I call `func1` WITHOUT joining it. My intention is to use this code in a web application setting, where a user uploads a task to the backend: I then call `func1`, which starts `func2` without waiting for it to complete. My problem is that as soon as `func1` has returned, all the processes spawned by `func2` are dead before reaching completion. The problem is much more severe when the `range` in `func2` is large, say 1000.
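For context, the call pattern I have in mind looks roughly like this (a minimal sketch, assuming a Flask endpoint; the module name, route, and request handling are hypothetical placeholders, not my actual backend):

```python
# Hypothetical web entry point, assuming Flask; only for illustration.
from flask import Flask, request

from mymodule import Test  # hypothetical module holding the Test class below

app = Flask(__name__)

@app.route('/upload', methods=['POST'])
def upload():
    task = request.get_json()  # the uploaded task (placeholder, unused here)
    Test().func1()             # fire and forget: start func2, return immediately
    return 'accepted', 202
```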
```python
from multiprocessing import Process, Queue
import json


class Test:

    def func1(self):
        p = Process(target=self.func2)
        p.start()
        # Note: I deliberately do not call join here because I want func2 to
        # run in the background. It seems that as soon as p.start() returns,
        # all the processes spawned by func2 are dead before they complete.

    def func2(self):
        queue = Queue()
        queueList = list()
        jobList = list()
        responseList = list()
        for i in range(1000):
            p = Process(target=self.work, args=(i, queue))
            p.start()
            jobList.append(p)
            queueList.append(queue)  # the same queue, appended once per worker
        for q in queueList:
            responseList.append(q.get())  # drain the queue before joining
        for p in jobList:
            p.join()
        # Do some other stuff with the data
        signalPath = 'path_to_somewhere/testProcess.json'
        with open(signalPath, 'w') as fileHandle:
            json.dump(responseList, fileHandle)

    def work(self, i, queue):
        print(i)
        queue.put(i)


if __name__ == '__main__':
    classObj = Test()
    classObj.func1()
```
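For comparison: if I do call `join` in `func1`, as in the variant below, everything should run to completion and `testProcess.json` should appear, but then the caller blocks for the whole duration, which is exactly what I am trying to avoid (a minimal sketch; this method would replace `func1` above):

```python
# Blocking variant of func1, shown in isolation for contrast.
def func1(self):
    p = Process(target=self.func2)
    p.start()
    p.join()  # blocks until func2 (and, transitively, all its workers) finishes
```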