I have a list of functions, each of which does a job like downloading HTML from a URL (every function is quite different, so I can't write one generic function that just accepts a URL and downloads it). I've used multiprocessing to speed the task up. Below is my code; two trimmed-down example functions follow it.
from multiprocessing import Process

def runInParallel(list_of_functions):
    # create one Process per (name, function) pair, start them all, then wait
    procs = [Process(target=fn[1]) for fn in list_of_functions]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
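To give an idea of what the individual functions look like, here are two stripped-down stand-ins (the URLs and the use of urllib2 are just placeholders; each real function hits a different site and parses it differently):

import urllib2

def f1():
    # each real function has its own URL, headers, parsing, etc.
    html = urllib2.urlopen('http://example.com/page1').read()
    return {'source': 'f1', 'html': html}

def f2():
    html = urllib2.urlopen('http://example.com/page2').read()
    return {'source': 'f2', 'html': html}

list_of_functions = [('f1', f1), ('f2', f2)]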
What I want to know is how to store the result that each function returns. Each function returns a dict that I need to parse and store in a database, and I don't want to repeat those parse-and-store steps inside every function. So what I'm after is some sort of callback that gets passed the result returned from each function. How can I achieve that?
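Roughly what I'm imagining is something along these lines (untested sketch; handle_result is a placeholder for the shared parse-and-store-in-database step, and f1/f2 are the example functions above):

from multiprocessing import Pool

def handle_result(result):
    # runs in the parent process each time a worker function returns its dict;
    # the shared parse-and-store logic would live here instead of in f1/f2
    print 'storing', result

def runInParallel(list_of_functions):
    pool = Pool(processes=3)
    for name, fn in list_of_functions:
        # fn has to be a plain module-level function so it can be pickled
        pool.apply_async(fn, callback=handle_result)
    pool.close()
    pool.join()

runInParallel(list_of_functions)

Is that the right approach, or is there a better way?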
EDIT: I tried using a Pool as below, but it throws an error. list_of_functions looks like this:
[('f1', <function f1 at 0x7f34c11c9ed8>), ('f2', <function f2 at 0x7f34c11c9f50>)]
def runInParallel(list_of_functions):
    import multiprocessing
    pool = multiprocessing.Pool(processes=3)
    x = pool.map(lambda f: f(), list_of_functions)
    print x
File "main.py", line 31, in <module>
runInParallel(all_functions)
File "main.py", line 11, in runInParallel
x = pool.map(lambda f: f(), list_of_functions)
File "/usr/lib/python2.7/multiprocessing/pool.py", line 251, in map
return self.map_async(func, iterable, chunksize).get()
File "/usr/lib/python2.7/multiprocessing/pool.py", line 558, in get
raise self._value
cPickle.PicklingError: Can't pickle <type 'function'>: attribute lookup __builtin__.function failed