I would like to be able to create a new multiprocessing.Value or multiprocessing.Array after a process has started, as in this example:
# coding: utf-8
import multiprocessing

shared = {
    'foo': multiprocessing.Value('i', 42),
}

def job(pipe):
    while True:
        # Wait for a key from the parent, then read the shared value
        shared_key = pipe.recv()
        print(shared[shared_key].value)

process_read_pipe, process_write_pipe = multiprocessing.Pipe(duplex=False)
process = multiprocessing.Process(
    target=job,
    args=(process_read_pipe,),
)
process.start()
process_write_pipe.send('foo')
# Create a new shared value after the process has already started
shared['bar'] = multiprocessing.Value('i', 24)
process_write_pipe.send('bar')
Output:
42
Process Process-1:
Traceback (most recent call last):
File "/usr/lib/python3.5/multiprocessing/process.py", line 249, in _bootstrap
self.run()
File "/usr/lib/python3.5/multiprocessing/process.py", line 93, in run
self._target(*self._args, **self._kwargs)
File "/home/bux/Projets/synergine2/p.py", line 12, in job
print(shared[shared_key].value)
KeyError: 'bar'
Process finished with exit code 0
The problem here is that the shared dict is copied into the process when it starts. But if I add a key to the shared dict afterwards, the process can't see it. How can the already-started process be informed about the existence of the new multiprocessing.Value('i', 24)?
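To isolate the mechanism, here is a minimal sketch (assuming Linux's default fork start method, as in the traceback above) showing that the child only sees the dict as it was when start() was called:

import multiprocessing
import time

d = {'foo': 1}

def child():
    time.sleep(0.5)          # give the parent time to mutate d
    print('child sees:', d)  # prints {'foo': 1}: the fork-time copy

p = multiprocessing.Process(target=child)
p.start()
d['bar'] = 2                 # invisible to the already-started child
p.join()
print('parent sees:', d)     # prints {'foo': 1, 'bar': 2}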
It can't be sent through the pipe, because:

Synchronized objects should only be shared between processes through inheritance
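For illustration, a minimal sketch of that failure (send() pickles its argument, which is what triggers the quoted RuntimeError):

import multiprocessing

read_pipe, write_pipe = multiprocessing.Pipe(duplex=False)
value = multiprocessing.Value('i', 24)
try:
    write_pipe.send(value)  # send() pickles the Value
except RuntimeError as exc:
    # Synchronized objects should only be shared between
    # processes through inheritance
    print(exc)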
Any ideas?