I'd like to use multiprocessing.Process on a function that takes an ndarray-type object as an argument.
def func(array):
    do_something_to(array)
    return array
Is this possible?
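As a quick sanity check (my own minimal sketch, not the original code): multiprocessing sends each argument to the child process by pickling it, and ndarrays pickle cleanly, so passing one is possible in principle. The `func` and array below are hypothetical stand-ins:

```python
import pickle

import numpy as np

def func(array):
    # hypothetical stand-in for "do_something_to array"
    return array + 1

a = np.arange(6).reshape(2, 3)

# multiprocessing's process-start machinery serializes arguments
# with pickle; an ndarray survives the round trip intact:
restored = pickle.loads(pickle.dumps(a))

result = func(restored)
```

One caveat worth keeping in mind: because the argument is pickled, the child works on a copy, so in-place changes made in the child are not visible in the parent.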
EDIT: this is the code I tried to implement. The function noflux takes a few ndarray-type objects and modifies its arguments in place; this is the function I'd like to run on multiple processors.
for k in np.arange(0, len(t) - 1, 1):
    for j in gridd:
        i = int(j[1])
        j = int(j[0])
        # NO FLUX BOUNDARIES
        if cells[j, i] == 1:
            p1 = mp.Process(target=noflux, args=(j, i, cells, rfu, lfu, ufu, dfu))
            p2 = mp.Process(target=noflux, args=(j, i, cells, rfh, lfh, ufh, dfh))
            p1.start()
            p2.start()
            p1.join()
            p2.join()
EDIT 2: Here are the types of objects I pass to the function:
j, i are integers,
cells is an ndarray of shape (73, 87),
rfu, lfu, ufu, dfu are also (73, 87) ndarrays.
The function noflux takes these arrays and modifies them in place; it does not return anything. There's also a function defined and used inside noflux, so maybe this is what's causing the problem?
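On the nested-function guess, a hedged aside (my own check, assuming CPython's pickle): the target handed to Process is pickled by reference, i.e. by its module-qualified name, so a module-level noflux can define helper functions inside itself without trouble. Only a function that is itself defined locally fails to pickle. `outer` and `make_local` below are hypothetical:

```python
import pickle

def outer():
    def inner(x):  # nested helper, like the one inside noflux
        return x + 1
    return inner(1)

# A module-level function pickles fine (by name), even though it
# defines a nested function internally:
blob = pickle.dumps(outer)

# A *local* (nested) function itself, however, cannot be pickled:
def make_local():
    def local_fn(x):
        return x
    return local_fn

try:
    pickle.dumps(make_local())
    local_picklable = True
except (pickle.PicklingError, AttributeError):
    local_picklable = False
```

So a helper inside noflux should be harmless as long as noflux itself lives at module level and the helper is not what gets passed to Process.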
EDIT 3: Full error message received when running my code (real file path replaced with "PATH").
Traceback (most recent call last):
  File "C:\PATH", line 418, in <module>
    p1.start()
  File "C:\anaconda\lib\multiprocessing\process.py", line 112, in start
    self._popen = self._Popen(self)
  File "C:\anaconda\lib\multiprocessing\context.py", line 223, in _Popen
    return _default_context.get_context().Process._Popen(process_obj)
  File "C:\anaconda\lib\multiprocessing\context.py", line 322, in _Popen
    return Popen(process_obj)
  File "C:\anaconda\lib\multiprocessing\popen_spawn_win32.py", line 89, in __init__
    reduction.dump(process_obj, to_child)
  File "C:\anaconda\lib\multiprocessing\reduction.py", line 60, in dump
    ForkingPickler(file, protocol).dump(obj)
BrokenPipeError: [Errno 32] Broken pipe
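For context on the traceback (a sketch under my own assumptions, not a fix): the popen_spawn_win32 and reduction.dump frames show Windows' spawn start method pickling the target and its arguments to send them to the child. One cheap way to pre-check that step is to pickle the same payload by hand; `noflux_stub` and the zero arrays below are hypothetical stand-ins with the shapes from EDIT 2:

```python
import pickle

import numpy as np

def noflux_stub(j, i, cells, rfu, lfu, ufu, dfu):
    # hypothetical stand-in for the real noflux
    pass

cells = np.zeros((73, 87))
rfu = lfu = ufu = dfu = np.zeros((73, 87))

# mimic what reduction.dump does when Process.start() runs under
# the spawn start method: pickle the target and its argument tuple
blob = pickle.dumps((noflux_stub, (0, 0, cells, rfu, lfu, ufu, dfu)))
target, args = pickle.loads(blob)
```

If this hand-pickling raises, the target or its arguments are the problem; if it succeeds, the BrokenPipeError more likely means the child process died early, e.g. because the process-creating code at top level is not protected by an `if __name__ == "__main__":` guard, which the spawn start method requires on Windows.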