I have many Python functions (each one is a member function of a different class), and I wish to run x of them in parallel at any given moment (unless fewer than x remain).
In other words, I would like to have a queue of all the tasks to be performed and x subprocesses. The main process will pop tasks from the queue until it is empty and hand them to the subprocesses to perform. Each subprocess will inform the main process when it is free and get another task.
I thought of using the multiprocessing module, but I'm not sure how to know when each subprocess has finished and is ready for the next task.
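To make the setup concrete, here is roughly what I'm picturing (just a sketch with placeholder tasks, not my real classes; the done_queue is my guess at how a worker might report that it's free):

    import multiprocessing as mp

    NUM_WORKERS = 3  # "x" from the description above

    def run_task(i):
        # Placeholder for class_obj.main_func()
        return i * i

    def worker(task_queue, done_queue):
        # Pull tasks until a None sentinel arrives; after each task, tell the
        # main process that this worker is free again.
        for i in iter(task_queue.get, None):
            done_queue.put((i, run_task(i)))

    if __name__ == "__main__":
        task_queue, done_queue = mp.Queue(), mp.Queue()
        workers = [mp.Process(target=worker, args=(task_queue, done_queue))
                   for _ in range(NUM_WORKERS)]
        for w in workers:
            w.start()
        for i in range(10):          # enqueue all tasks
            task_queue.put(i)
        for _ in workers:            # one sentinel per worker
            task_queue.put(None)
        for _ in range(10):          # main process sees each completion
            print(done_queue.get())
        for w in workers:
            w.join()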
I tried to use a shared queue: fill it with class instances from the main process, and in each subprocess do something like:
    def subprocess(shared_queue):
        while not shared_queue.empty():
            class_obj = shared_queue.get()  # multiprocessing.Queue has get(), not pop()
            class_obj.main_func()
However, it turns out I can't put my complex class instances into the queue.
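I assume this is because whatever goes into a multiprocessing queue has to be picklable, and my objects apparently aren't. One workaround I've been considering (a sketch with made-up class names) is to queue picklable descriptors instead of live objects and rebuild each object inside the worker:

    def worker(task_queue):
        # Each queue item is a (class, constructor-arguments) pair; the object
        # is built in the child process, so the instance itself is never pickled.
        for cls, args in iter(task_queue.get, None):
            obj = cls(*args)
            obj.main_func()

    # main process side (MyClassA/MyClassB and their args are made-up names):
    # task_queue.put((MyClassA, (arg1, arg2)))
    # task_queue.put((MyClassB, (arg3,)))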
Edit: I don't think Pool would work, since I want to run many different functions, each of them once. The examples I've seen with Pool run one function many times with different parameters.
Edit 2: Pool will work for the original problem by passing the functions themselves as parameters, as suggested in the comments. But I still want a solution where I manage the queue myself, because later on I would like to give each task a weight and only run tasks whose combined weight stays under a threshold. So I still need to know when the subprocesses finish their tasks.
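For what it's worth, the closest I've gotten to completion notifications with Pool is apply_async with a callback (sketch with placeholder functions; the weight bookkeeping isn't shown):

    import multiprocessing as mp

    def task_a():
        return "task_a finished"

    def task_b():
        return "task_b finished"

    def on_done(result):
        # Runs in the main process each time a worker finishes a task; this is
        # where I imagine updating the running weight total and deciding
        # whether to submit the next task.
        print(result)

    if __name__ == "__main__":
        with mp.Pool(processes=2) as pool:
            for func in (task_a, task_b):
                pool.apply_async(func, callback=on_done)
            pool.close()
            pool.join()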