Is there a way to pass a nested dictionary to multiprocessing?
d = {'a': {'x': 1, 'y': 100},
     'b': {'x': 2, 'y': 200}}
I was hoping to start two parallel jobs, one for {'a': {'x': 1, 'y': 100}} and another for {'b': {'x': 2, 'y': 200}}, and use the following function to build a new dictionary:
def f(d):
    key, = d.keys()  # the single key of the one-entry dict, e.g. 'a'
    new_d[key]['x'] = d[key]['x'] * 2
    new_d[key]['y'] = d[key]['y'] * 2
This was my unsuccessful attempt:
import multiprocessing

def f(key, d, container):
    container[key]['x'] = d[key]['x'] * 2
    container[key]['y'] = d[key]['y'] * 2

if __name__ == '__main__':
    manager = multiprocessing.Manager()
    container = manager.dict()
    d = manager.dict()
    d['a'] = {'x': 1, 'y': 100}
    d['b'] = {'x': 2, 'y': 200}
    p1 = multiprocessing.Process(target=f, args=('a', d, container))
    p2 = multiprocessing.Process(target=f, args=('b', d, container))
    p1.start()
    p2.start()
    p1.join()
    p2.join()
This fails with KeyError: 'b'. Also, I would like to avoid having to spell out each process manually (p1, p2, and so on). Is there maybe another way?