I need to read a large file and update an imported dictionary accordingly, using multiprocessing Pool and Manager. Here is my code:
import json
from multiprocessing import Pool, Manager

manager = Manager()
d = manager.dict()

# load a file containing a large dictionary
with open('~/file.json') as fh:
    imported_dic = json.load(fh)
d.update(imported_dic)

def f(line):
    data = line.split('\t')
    uid = data[0]
    tweet = data[2].decode('utf-8')
    if sth in tweet:  # 'sth' stands in for the actual check
        d[uid] += 1

p = Pool(4)
with open('~/test_1k.txt') as source_file:
    p.map(f, source_file)
But it does not work properly. Any idea what I am doing wrong here?
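In case it helps, here is a minimal sketch of an alternative I have been considering, where each worker returns the uid of a matching line and the parent process aggregates the counts with collections.Counter instead of having every worker mutate the shared dict. Here sth is still a placeholder for the actual check, and the paths are the same as above:

import json
from collections import Counter
from multiprocessing import Pool

def g(line):
    # return the uid of a matching line, None otherwise
    data = line.split('\t')
    uid = data[0]
    tweet = data[2].decode('utf-8')
    return uid if sth in tweet else None  # 'sth' is still a placeholder

if __name__ == '__main__':
    with open('~/file.json') as fh:
        imported_dic = json.load(fh)
    p = Pool(4)
    with open('~/test_1k.txt') as source_file:
        # aggregate in the parent, so no shared state is needed
        counts = Counter(uid for uid in p.map(g, source_file) if uid is not None)
    for uid, n in counts.items():
        imported_dic[uid] = imported_dic.get(uid, 0) + n

Even if this variant works, I would still like to understand why the Manager-based version above misbehaves.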