
I have this code (it's a snippet from my program):

from multiprocessing import Process, Manager, cpu_count, Pool, Value, Lock

def grab_future_products(p, test):
    print("Starting process %s" % p)

if __name__ == "__main__": # Main program
    n = 4
    test = Value('i', 0)
    pool = Pool(processes=n) # pool of n worker processes

    for i in range(n):
        pool.apply_async(grab_future_products, args=(i, test))

    pool.close()
    pool.join()

If I run it with `python test.py` I get no output and no errors, just nothing. I wanted to use the variable `test` as a shared integer between all processes, so that in another process I can do something like:

if test.value == X:
    break

But interestingly, if I replace `args=(i, test))` with `args=(i, 1))`, it works as desired. So my question is: why can't I pass a `Value()` object into a process? And how can I solve this problem? Many thanks.

Peter Jung
  • Nope, still nothing. And I tried `pool.apply_async(grab_future_products(i, test))`, but it runs one process after another, not in parallel. – Peter Jung Mar 31 '16 at 19:00
  • Sorry, I deleted my previous comment by mistake...for context, I was saying : "What about `pool.apply_async(grab_future_products, (i, test))` ?" – Tym Mar 31 '16 at 19:01

1 Answer


The trick is to use a `multiprocessing.Manager`, as also mentioned in Sharing a result queue among several processes:

from multiprocessing import Manager, Pool

def grab_future_products(p, test):
    print("Starting process %s, value=%i" % (p, test.value))

if __name__ == "__main__": # Main program
    n = 4
    pool = Pool(processes=n) # pool of n worker processes
    m = Manager()
    v = m.Value('i', 0)  # proxy to a shared value; safe to pass to workers
    for i in range(n):
        res = pool.apply_async(grab_future_products, args=(i, v))
    pool.close()
    pool.join()
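
The reason the original code fails silently: a plain `multiprocessing.Value` cannot be pickled for transfer over the pool's task queue, and `apply_async` discards that error unless you call `.get()` on the returned `AsyncResult`. A minimal sketch of both points (assuming a recent Python 3; `worker` here is a stand-in for the question's function):

```python
import pickle
from multiprocessing import Pool, Value

def worker(p, test):
    return test.value

if __name__ == "__main__":
    v = Value('i', 0)

    # Pickling a plain Value raises RuntimeError ("Synchronized objects
    # should only be shared between processes through inheritance").
    try:
        pickle.dumps(v)
    except RuntimeError as e:
        print("cannot pickle Value:", e)

    # apply_async() hides the same failure; calling .get() surfaces it.
    with Pool(processes=2) as pool:
        res = pool.apply_async(worker, args=(0, v))
        try:
            res.get(timeout=5)
        except Exception as e:
            print("task failed:", type(e).__name__, e)
```

So the choice is between a `Manager().Value` proxy (picklable, works with a `Pool`) and a plain `Value` inherited at fork time (faster, but only usable with `Process` or a pool initializer).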
joosteto