Background: I'm writing a program that reads video frames from multiple cameras concurrently. I'd like to have one process that performs the frame reads and a second process that writes those frames to disk. I've been trying to identify the best way (in Python 3.6) to make the frames in the "read" process available to the "write" process for saving. I've landed on shared memory as the best option.
Problem: When I allocate shared memory for both processes, changes that the parent process makes in the shared memory space are not seen by the child process.
Previous Effort: I've been trying the method suggested in Is shared readonly data copied to different processes for multiprocessing?. However, pasting that code directly into Atom and running it under Python 2.7 or Python 3.6 does not produce the result described in the linked answer (i.e. the parent process does not see the changes made by my_func in the child processes).
Note: the code from the linked answer is reproduced below.
import multiprocessing
import ctypes
import numpy as np

# Allocate a 10x10 array of doubles in shared memory and wrap it as numpy
shared_array_base = multiprocessing.Array(ctypes.c_double, 10*10)
shared_array = np.ctypeslib.as_array(shared_array_base.get_obj())
shared_array = shared_array.reshape(10, 10)

# Parallel processing
def my_func(i, def_param=shared_array):
    shared_array[i, :] = i

if __name__ == '__main__':
    pool = multiprocessing.Pool(processes=4)
    pool.map(my_func, range(10))
    print(shared_array)  # original answer used Python 2's print statement