I can use multiprocessing to easily set up parallel calls to "func" like this:
import multiprocessing

def func(tup):
    (a, b) = tup
    return str(a + b)

pool = multiprocessing.Pool()
tups = [(1, 2), (3, 4), (5, 6), (7, 8)]
results = pool.imap(func, tups)
print(", ".join(results))
This gives the result:
3, 7, 11, 15
The problem is that my actual "func" is more complicated than this example, so I don't want to call it with a single "tup" argument: I want multiple positional arguments and keyword arguments as well. What I'd like is something like the code below, but "*" unpacking inside a list doesn't work (and it doesn't handle keywords either):
import multiprocessing

def func(a, b):
    return str(a + b)

pool = multiprocessing.Pool()
tups = [*(1, 2), *(3, 4), *(5, 6), *(7, 8)]
results = pool.imap(func, tups)
print(", ".join(results))
So... is there a way to get all the power of Python function calls while doing parallel processing?
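For reference, the closest I've gotten is a wrapper that unpacks one (args, kwargs) pair per call, but it's boilerplate I'd have to repeat for every function (a sketch, reusing the illustrative "scale" keyword from above):

import multiprocessing

def func(a, b, scale=1):
    # Same illustrative signature as the serial sketch above.
    return str((a + b) * scale)

def wrapper(call):
    # Unpack one (args, kwargs) pair and forward it to func.
    # Must live at module level so the pool can pickle it.
    args, kwargs = call
    return func(*args, **kwargs)

pool = multiprocessing.Pool()
calls = [((1, 2), {}), ((3, 4), {"scale": 10}), ((5, 6), {}), ((7, 8), {})]
results = pool.imap(wrapper, calls)
print(", ".join(results))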