
If I have the following function:

def foo(a, b, c=2, d=6):
    return a + b + c + d

and I want to parallelize with multiprocessing.

How can I pass the following arguments 1, 2, d=10? i.e. 2 args and 1 kwarg?

I saw this post, but it does not seem to make it actually parallel. Another possibly useful example was provided here, but it is hard to untangle.

    It's all [clearly documented](https://docs.python.org/3/library/multiprocessing.html#multiprocessing.pool.Pool.apply_async). I doubt you can get a better answer than just reading the docs in this case. – wim Jan 03 '19 at 21:23

2 Answers


How about:

import multiprocessing

def foo(a, b, c=2, d=6):
    return a + b + c + d

def foo_callback(result):
    print(result)

def foo_error(error):
    raise error

if __name__ == "__main__":  # guard so spawn-based platforms don't re-run this in workers
    pool = multiprocessing.Pool()

    for (a, b, c, d) in ((1, 2, 3, 4),
                         (2, 4, 6, 8),
                         (3, 6, 9, 12)):
        pool.apply_async(
            foo,
            args=(a, b),
            kwds={"c": c, "d": d},
            callback=foo_callback,
            error_callback=foo_error
        )
    pool.close()
    pool.join()

Which prints:

10
20
30


Note that kwds must be a dict, and its keys must be strings matching foo's parameter names.
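For the exact call in the question (two positional args, 1 and 2, plus the keyword arg d=10), a minimal sketch along the same lines:

```python
import multiprocessing

def foo(a, b, c=2, d=6):
    return a + b + c + d

if __name__ == "__main__":
    with multiprocessing.Pool() as pool:
        # args covers the positional arguments, kwds the keyword argument;
        # c is omitted, so it keeps its default of 2
        result = pool.apply_async(foo, args=(1, 2), kwds={"d": 10})
        print(result.get())  # 1 + 2 + 2 + 10 = 15
```

result.get() blocks until the worker finishes and re-raises any exception from the worker, so it is an alternative to the callback/error_callback pair when you only have a few tasks.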


Using map or imap (or their async variants) is simpler if you can pack the arguments into a single tuple:

from multiprocessing import Pool

def foo(args):
    a, b, c, d = args
    return a + b + c + d # or just sum(args)

if __name__ == "__main__":
    with Pool() as pool:
        arg_tuples = ((1, 2, 3, 4), (2, 4, 6, 8), (3, 6, 9, 12))
        results = pool.map(foo, arg_tuples)

    print(results)
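If you would rather keep foo's original signature (with its defaults) instead of unpacking a tuple inside it, Pool.starmap unpacks each tuple into positional arguments for you; a sketch:

```python
from multiprocessing import Pool

def foo(a, b, c=2, d=6):
    return a + b + c + d

if __name__ == "__main__":
    with Pool() as pool:
        # starmap calls foo(1, 2, 3, 4), foo(2, 4, 6, 8), foo(3, 6, 9, 12)
        results = pool.starmap(foo, [(1, 2, 3, 4), (2, 4, 6, 8), (3, 6, 9, 12)])
    print(results)  # [10, 20, 30]
```

starmap only covers positional arguments, though; for keyword arguments you still need apply_async with kwds, or functools.partial to pre-bind them.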
