
I have read the various posts about shared arrays and updating them across processes, e.g. Sharing numpy arrays in python multiprocessing pool.

For some reason, on Windows, the variable v below is not updated at all (I also tried declaring it in a global file, common to the code files, with the same result):

The printed result is 0 instead of 50*10 = 500 (i.e. the variable v is never updated):

    import time
    from multiprocessing import Process, Value, Lock
    import multiprocessfunc as mpf  # the multiprocess function loop is in there

    if __name__ == '__main__':
        v = Value('i', 0)
        lock = Lock()
        procs = [Process(target=mpf.func, args=(v, lock)) for i in range(10)]

        for p in procs: p.start()
        for p in procs: p.join()

        print(v.value)

In the multiprocessfunc.py file, I have this:

    def func(val, lock):
        for i in range(50):
            time.sleep(0.01)
            with lock:
                val.value += 1
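For reference, here is a single-file sketch of the same logic (my own rewrite, not the original code): the worker and the `__main__` guard live in one module, and the `Value`'s built-in lock replaces the separate `Lock` object.

```python
# Single-file sketch: worker and main guard are in one module,
# so there is no separate multiprocessfunc.py whose imports could differ.
import time
from multiprocessing import Process, Value

def func(val):
    for _ in range(50):
        time.sleep(0.01)
        with val.get_lock():   # a Value carries its own lock
            val.value += 1

if __name__ == '__main__':
    v = Value('i', 0)
    procs = [Process(target=func, args=(v,)) for _ in range(10)]
    for p in procs: p.start()
    for p in procs: p.join()
    print(v.value)   # expected: 500
```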

What do you think about this?

Update: if I use pool.map to return an average/sum value, it works well. It is only when passing a Value or a shared Array that the processes fail to update the shared variable.
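For comparison, a minimal `pool.map` sketch of the pattern that does work here, where each worker returns its partial result instead of mutating shared state (the function name is illustrative):

```python
# pool.map sketch: workers return values and the parent sums them,
# so no shared state or locking is involved.
from multiprocessing import Pool

def count(n):
    # each worker computes its own partial count
    return sum(1 for _ in range(n))

if __name__ == '__main__':
    with Pool(4) as pool:
        results = pool.map(count, [50] * 10)
    print(sum(results))   # 500
```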

I get the same issue when I set up a shared Array: the array is not updated.
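The shared `Array` case can be tested in isolation with a sketch like this (my own example; `fill` is an illustrative name):

```python
# Shared Array sketch: each process increments every slot once,
# using the Array's built-in lock for the read-modify-write.
from multiprocessing import Process, Array

def fill(arr):
    with arr.get_lock():
        for i in range(len(arr)):
            arr[i] += 1

if __name__ == '__main__':
    a = Array('i', 5)          # five ints, zero-initialized
    procs = [Process(target=fill, args=(a,)) for _ in range(3)]
    for p in procs: p.start()
    for p in procs: p.join()
    print(list(a))             # expected: [3, 3, 3, 3, 3]
```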

  • Did you `import time` in your multiprocessfunc.py? If not, that would explain it, but you should get a bunch of errors (unless you use an IDE which does not capture output from subprocesses). – mata Feb 06 '16 at 09:50
  • 3
    Just as suggestion: a [`Value`](https://docs.python.org/3.4/library/multiprocessing.html#multiprocessing.Value) already has it's own lock, no need to allocate your own, just use `with val.get_lock(): ...` – mata Feb 06 '16 at 09:52
  • To add to what @mata wrote, locks are not needed for commutative operations (unless you update the value in a wrong fashion). – Fanchi Feb 06 '16 at 10:23
  • The lock on the Value does not work well across processes. Locks are needed, otherwise you get wrong results. `time` has nothing to do with this; it can also be removed. –  Feb 06 '16 at 10:58
  • Works for me on Linux – Andrea Corbellini Feb 06 '16 at 16:44
  • Just to be clear: what I meant to say is that this looks to me like a Windows-specific issue, and as such you should look at Windows-specific code. For example, look at the `Arena` class defined in `multiprocessing.heap`: https://hg.python.org/releasing/3.5/file/tip/Lib/multiprocessing/heap.py – Andrea Corbellini Feb 06 '16 at 18:07
  • Thanks for confirming. Is multiprocess debugging (stepping into the subprocesses) possible? –  Feb 07 '16 at 16:03

0 Answers