I have read the various posts about shared arrays and updating them from worker processes, e.g. Sharing numpy arrays in python multiprocessing pool.
For some reason, on Windows, the following variable v is never updated (I also tried putting it in a global module, common to both code files, with the same result).
The printed result is 0 instead of 50*10 = 500, i.e. the variable v is not updated at all.
import time
from multiprocessing import Process, Value, Lock
import multiprocessfunc as mpf  # the worker loop function lives in this module

if __name__ == '__main__':
    v = Value('i', 0)  # shared integer counter, starts at 0
    lock = Lock()      # lock to serialize the increments
    procs = [Process(target=mpf.func, args=(v, lock)) for i in range(10)]
    for p in procs: p.start()
    for p in procs: p.join()
    print(v.value)     # expected 500, but it prints 0
In the multiprocessfunc file, I put this:
import time

def func(val, lock):
    # each worker increments the shared counter 50 times
    for i in range(50):
        time.sleep(0.01)
        with lock:  # serialize access to the shared value
            val.value += 1
What do you think about this?
Update: if I use pool.map to return an average/sum value, it works well. It is only when I pass a Value or a shared Array that the processes fail to update the shared variable.
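For comparison, this is roughly the shape of the pool.map version that does work for me (a minimal sketch; the worker name work and its payload are illustrative, not my exact code):

from multiprocessing import Pool

def work(n):
    # each worker returns its own count instead of mutating shared state
    return sum(1 for _ in range(n))

if __name__ == '__main__':
    with Pool(10) as pool:
        results = pool.map(work, [50] * 10)
    print(sum(results))  # prints 500 as expected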
I got the same issue when I set up a shared Array: the array is not updated either.
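To show what I mean, here is a minimal sketch of the kind of shared Array setup I tried (the worker fill and the values it writes are illustrative, not my actual code):

from multiprocessing import Process, Array

def fill(arr, idx):
    # each worker writes into its own slot of the shared array
    arr[idx] = idx * 10

if __name__ == '__main__':
    arr = Array('i', 10)  # shared array of 10 C ints, zero-initialized
    procs = [Process(target=fill, args=(arr, i)) for i in range(10)]
    for p in procs: p.start()
    for p in procs: p.join()
    print(list(arr))  # I would expect [0, 10, 20, ..., 90], but it stays all zeros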