
Well, I'm quite new to Python and multiprocessing, and I'd like to know whether there is any way to make active processes wait for something like "all processes have finished using a given resource", then continue their work. And yes, I really do need them to wait; the main purpose is synchronization. It's not about finishing the processes and joining them, it's about waiting while they're running. Should I use something like a Condition/Event? I couldn't find anything really helpful anywhere.

It would be something like this:

import multiprocessing

def worker(args):
    #1. working
    #2. takes the resource from the manager
    #3. waits for all other processes to finish the same step above
    #4. returns to 1.

if __name__ == '__main__':
    manager = multiprocessing.Manager()
    resource = manager.something()
    pool = multiprocessing.Pool(n)
    result = pool.map(worker, args)
    pool.close()
    pool.join()

Edit: The "working" part takes much more time than the other parts, so I still benefit from multiprocessing even though access to that single resource is serial. The problem works this way: I have multiple processes running a solution finder (an evolutionary algorithm), and every "n" solutions produced, I use that resource to exchange some data between those processes and improve the solutions using that information. So I need all of them to wait before exchanging the info. It's a little hard to explain, and I'm not really here to discuss the theory; I just want to know whether there is any way to do what I described in the main question.
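What the edit describes is essentially a barrier: every process blocks at a synchronization point until all of them have arrived, then all continue. Below is a minimal sketch of that idea, assuming Python 3.3+ (where `multiprocessing.Barrier` is available); the worker body, the process count, and the results list are placeholders for illustration:

```python
import multiprocessing

def worker(barrier, results, idx):
    for generation in range(3):
        # 1./2. "working": each process computes on its own
        results[idx] = idx * 10 + generation
        # 3. block here until every process has finished this round
        barrier.wait()
        # 4. all processes are synchronized; exchange data, then loop

def run(n):
    barrier = multiprocessing.Barrier(n)
    results = multiprocessing.Manager().list([0] * n)
    procs = [multiprocessing.Process(target=worker, args=(barrier, results, i))
             for i in range(n)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    return list(results)

if __name__ == '__main__':
    print(run(4))
```

The barrier resets itself after each release, so the same object can be reused every generation. Note that this uses `multiprocessing.Process` directly rather than a `Pool`, because synchronization primitives are easiest to hand to processes at creation time.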

John Y

2 Answers


I'm not sure that I understood your question, but I think you can use a Queue. It's a good way to transmit data from one process to another. You could implement something like:

1. Process the first chunk
2. Write the results to the queue
3. Wait while the queue is full
4. Return to 1
Jimilian
  • But how do I make sure that the worker processes will use the queue in a defined order or at least wait till every other process has finished its part? – user3773312 Nov 10 '14 at 10:44
  • Do you need to transmit data from one process to another? If not, you can try first solution from http://stackoverflow.com/questions/26821856/improve-speed-of-python-script-multithreading-or-multiple-instances/26826403#26826403 – Jimilian Nov 11 '14 at 06:55
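The steps above can be sketched with a bounded `multiprocessing.Queue`: a `put()` on a full queue blocks until the consumer takes something out, which gives you the "wait" in step 3. The chunk list and the doubling step are placeholders for illustration:

```python
import multiprocessing

def producer(queue, chunks):
    for chunk in chunks:
        result = chunk * 2          # 1. process the chunk
        queue.put(result)           # 2./3. blocks while the queue is full
    queue.put(None)                 # sentinel: no more data

def run():
    # maxsize=2 makes put() block until the consumer catches up
    queue = multiprocessing.Queue(maxsize=2)
    p = multiprocessing.Process(target=producer, args=(queue, [1, 2, 3, 4]))
    p.start()
    received = []
    while True:
        item = queue.get()
        if item is None:
            break
        received.append(item)
    p.join()
    return received

if __name__ == '__main__':
    print(run())
```

Note that a bounded queue throttles producers against a consumer; it does not, by itself, make all workers pause at the same point, which is what the question ultimately asks for.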

I actually found a way to do what I wanted. As you can see in the question, the code was already using a manager across the processes. So, in simple terms, I made a shared resource that works basically like a "log": every time a process finishes its work, it writes a permission into the log. Once all the desired permissions are there, the processes continue their work (this also lets me set a specific order of access for a resource, for example). Note that this is not a Lock or a Semaphore. It probably isn't a great method, but it suits the problem's needs and doesn't delay the execution.
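A rough sketch of that idea, as I understand it: a `Manager().list` plays the role of the shared log, each process appends its "permission" under a lock, then polls until all permissions are present. The names (`log`, `run`, the sleep interval) are mine; this polling approach is essentially a hand-rolled barrier:

```python
import multiprocessing
import time

def worker(log, lock, idx, n):
    # ... do the expensive "working" step here ...
    with lock:
        log.append(idx)             # write this process's "permission"
    # poll until every process has logged its permission
    while len(log) < n:
        time.sleep(0.01)
    # all processes have arrived; safe to exchange data here

def run(n):
    manager = multiprocessing.Manager()
    log = manager.list()
    lock = manager.Lock()
    procs = [multiprocessing.Process(target=worker, args=(log, lock, i, n))
             for i in range(n)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    return sorted(log)

if __name__ == '__main__':
    print(run(4))
```

Checking which indices are already in the log (rather than just its length) is what would let you enforce a specific access order, as mentioned above.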