
I would like to be able to create a new multiprocessing.Value or multiprocessing.Array after a process has started, as in this example:

# coding: utf-8
import multiprocessing

# Created before the child process starts, so the child inherits it.
shared = {
    'foo': multiprocessing.Value('i', 42),
}


def job(pipe):
    while True:
        shared_key = pipe.recv()
        print(shared[shared_key].value)


process_read_pipe, process_write_pipe = multiprocessing.Pipe(duplex=False)

process = multiprocessing.Process(
    target=job,
    args=(process_read_pipe, )
)
process.start()

process_write_pipe.send('foo')

# Created after the child started: the child's copy of shared never gets this key.
shared['bar'] = multiprocessing.Value('i', 24)
process_write_pipe.send('bar')

Output:

42
Process Process-1:
Traceback (most recent call last):
  File "/usr/lib/python3.5/multiprocessing/process.py", line 249, in _bootstrap
    self.run()
  File "/usr/lib/python3.5/multiprocessing/process.py", line 93, in run
    self._target(*self._args, **self._kwargs)
  File "/home/bux/Projets/synergine2/p.py", line 12, in job
    print(shared[shared_key].value)
KeyError: 'bar'

Process finished with exit code 0

The problem here is that the shared dict is copied into the child process when it starts. If I add a key to the shared dict afterwards, the child process can't see it. How can this already-started process be informed about the existence of the new multiprocessing.Value('i', 24)?

It can't be sent through the pipe, because the documentation says:

Synchronized objects should only be shared between processes through inheritance
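Indeed, attempting to actually send the Value through the pipe fails while pickling it. A minimal sketch of that failing attempt:

import multiprocessing

read_pipe, write_pipe = multiprocessing.Pipe(duplex=False)
value = multiprocessing.Value('i', 24)
# Pickling a synchronized object outside of process start-up raises:
# RuntimeError: Synchronized objects should only be shared between
# processes through inheritance
write_pipe.send(value)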

Any ideas?

bux

1 Answer


It looks like you are assuming the shared dict is accessible by both processes, but only shared['foo'] is, because it existed when the child process was started and was inherited by it. You need to share the dictionary itself.

Here is an example: Python multiprocessing: How do I share a dict among multiple processes?
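A minimal sketch of that approach, assuming a manager.dict() proxy is passed to the child at start-up; it stores plain integers rather than multiprocessing.Value objects, since synchronized objects cannot be pickled into a manager:

# coding: utf-8
import multiprocessing


def job(pipe, shared):
    while True:
        shared_key = pipe.recv()
        print(shared[shared_key])


if __name__ == '__main__':
    manager = multiprocessing.Manager()
    # A proxy: reads and writes go through the manager process, so keys
    # added later are visible to every process holding the proxy.
    shared = manager.dict()
    shared['foo'] = 42

    read_pipe, write_pipe = multiprocessing.Pipe(duplex=False)
    process = multiprocessing.Process(target=job, args=(read_pipe, shared))
    process.start()

    write_pipe.send('foo')

    # Added after the child started, yet visible through the proxy.
    shared['bar'] = 24
    write_pipe.send('bar')

    process.join(timeout=1)
    process.terminate()

The trade-off is that every access is a round trip to the manager process, with pickling on both ends.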

Jordan
  • The answer is correct, but I have a performance issue: manager.dict() does the job, but the data are pickled. Processes are not simply informed about new shared memory blocks. – bux Aug 30 '17 at 08:10
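For the pickling concern: later Python versions (3.8+) added multiprocessing.shared_memory, which covers exactly this "inform a running process about a new shared block" case. The parent can create blocks at any time and send only their names through the pipe; the child attaches by name, so the payload itself is never pickled. A sketch of that approach (note the question targets Python 3.5, where this module is unavailable):

import multiprocessing
from multiprocessing import shared_memory


def job(pipe):
    while True:
        name = pipe.recv()
        # Attach to an existing block by name; the data itself is not copied.
        block = shared_memory.SharedMemory(name=name)
        print(int.from_bytes(block.buf[:4], 'little'))
        block.close()


if __name__ == '__main__':
    read_pipe, write_pipe = multiprocessing.Pipe(duplex=False)
    process = multiprocessing.Process(target=job, args=(read_pipe,))
    process.start()

    blocks = []
    for value in (42, 24):  # the second block is created after the child started
        block = shared_memory.SharedMemory(create=True, size=4)
        block.buf[:4] = value.to_bytes(4, 'little')
        blocks.append(block)
        write_pipe.send(block.name)

    process.join(timeout=1)
    process.terminate()
    for block in blocks:
        block.close()
        block.unlink()  # release the shared memory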