
I am trying to launch multiple processes to parallelize certain tasks, and I want a single global variable to be decremented by 1 each time a process executes a method X().

I looked at multiprocessing.Value, but I'm not sure if that's the only way to do it. Could someone provide some code snippets to do this?

from multiprocessing import Pool

temp = 10

def X(item):
    global temp
    print item
    temp -= 1
    return temp

items = ['a', 'b', 'c']
pool = Pool(processes=5)
pool.map(X, items)

With a global variable, each process gets its own copy of it, which defeats the purpose of sharing its value. I believe I need some sort of shared-memory mechanism, but I am not sure how to set it up. Thanks

psbits
  • [Sharing state between processes](https://docs.python.org/2/library/multiprocessing.html#sharing-state-between-processes) – chepner Sep 15 '15 at 18:02
  • I think you're looking for the Python threading support -- specifically, a semaphore. https://docs.python.org/2/library/threading.html – Prune Sep 15 '15 at 18:05
  • Not really. I want to use multiple processes but also need a shared variable. Threading would solve the shared variable problem but not the other issues for which I need multiple processes. – psbits Sep 15 '15 at 18:12

1 Answer

Move the counter variable into the main process, i.e., avoid sharing the variable between processes:

for result in pool.imap_unordered(func, args):
    counter -= 1

counter is decremented as soon as the corresponding result (func(arg)) becomes available. Here's a complete code example:

#!/usr/bin/env python
import random
import time
import multiprocessing

def func(arg):
    time.sleep(random.random())
    return arg*10

def main():
    counter = 10
    args = "abc"
    pool = multiprocessing.Pool()
    for result in pool.imap_unordered(func, args):
        counter -= 1
        print("counter=%d, result=%r" % (counter, result))

if __name__ == "__main__":
    main()

An alternative is to pass a multiprocessing.Value() object to each worker process (use the initializer and initargs parameters of Pool()).
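A minimal sketch of that alternative, assuming a POSIX system where Pool forks its workers; the init_worker helper name is mine. Each worker receives the shared Value once via initializer/initargs and stores it in a module-level global, and the Value's built-in lock makes the decrement atomic:

```python
#!/usr/bin/env python
import multiprocessing

counter = None  # set in each worker process by init_worker()

def init_worker(shared_counter):
    # Runs once in every worker; stashes the shared Value in a global
    # so X() can reach it without it being pickled per task.
    global counter
    counter = shared_counter

def X(arg):
    # Decrement the shared counter atomically using its built-in lock.
    with counter.get_lock():
        counter.value -= 1
    return arg * 10

def main():
    shared_counter = multiprocessing.Value('i', 10)  # 'i' = C signed int
    pool = multiprocessing.Pool(initializer=init_worker,
                                initargs=(shared_counter,))
    results = pool.map(X, "abc")
    pool.close()
    pool.join()
    # prints counter=7, results=['aaaaaaaaaa', 'bbbbbbbbbb', 'cccccccccc']
    print("counter=%d, results=%r" % (shared_counter.value, results))

if __name__ == "__main__":
    main()
```

Unlike the imap_unordered version above, here the counter really is shared memory, so the get_lock() guard is required: several workers may decrement it concurrently.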

jfs
  • Thanks J.F. How can I pass multiple arguments using map_unordered ? Do I need any explicit lock in this approach ? – psbits Sep 15 '15 at 18:44
  • @psbits: you don't need a mutex here; `counter` is accessed only in a single process (the main process) -- `counter` even may be a local here. [Multiple args is a different question](http://stackoverflow.com/q/5442910/4279) – jfs Sep 15 '15 at 18:50
  • Not sure, this approach doesn't seem to be working for me or if i am doing something wrong. I declared the counter in my main function and called my func which returns some string. But the counter value still remains equal to the original value. – psbits Sep 15 '15 at 20:40
  • Thanks J.F . Got it working. But I ended up using - multiprocessing.Value() object to each worker process (use initialize, initargs Pool()'s parameters) – psbits Sep 15 '15 at 21:02