I want to use multiprocessing.Pool to load a large dataset; here is the code I'm using:
import os
from os import listdir
from os.path import join
import pickle
import multiprocessing as mp

db_path = db_path  # placeholder for the directory that holds the pickled files
the_files = listdir(db_path)
fp_dict = {}

def loader(the_hash):
    # Load one pickled file and store it under its hash.
    global fp_dict
    the_file = join(db_path, the_hash)
    with open(the_file, 'rb') as source:
        fp_dict[the_hash] = pickle.load(source)
    print(len(fp_dict))

def parallel(the_func, the_args):
    # Fan the work out across one process per CPU core.
    global fp_dict
    pool = mp.Pool(mp.cpu_count())
    pool.map(the_func, the_args)
    print(len(fp_dict))

parallel(loader, the_files)
Interestingly, the length of fp_dict changes while the code is running, but once the processes terminate, the length of fp_dict is zero. Why? How can I modify a global variable when using multiprocessing.Pool?
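In case it helps, here is a minimal sketch of the workaround I am considering, assuming the right fix is to have each worker return its result so the parent process can build the dict from pool.map's return value (db_path is a hypothetical placeholder here):

import multiprocessing as mp
import os
import pickle

db_path = "path/to/db"  # hypothetical placeholder; substitute the real directory
the_files = os.listdir(db_path)

def loader(the_hash):
    # Runs in a worker process; returns the loaded object instead of
    # mutating a global, since each worker only sees its own copy of globals.
    with open(os.path.join(db_path, the_hash), 'rb') as source:
        return the_hash, pickle.load(source)

if __name__ == '__main__':
    with mp.Pool(mp.cpu_count()) as pool:
        # map() ships each (hash, object) pair back to the parent process
        fp_dict = dict(pool.map(loader, the_files))
    print(len(fp_dict))

Alternatively, I have seen multiprocessing.Manager().dict() suggested for a dict that worker processes can actually mutate, but I am not sure which approach is preferred here.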