
I'm looking to do something like this:

def func1(a, b, c):
    # Does some complex things
    return processed_output

def func2(e, f, g):
    # Does something else that is also complex
    return processed_output

def parallelWrapperFunction(func1(a, b, c), func2(e, f, g)):
    # Runs everything in parallel
    return func1_output, func2_output

if __name__ == '__main__':
    # Get function inputs a, b, c, e, f, g
    parallelWrapperFunction(func1(a, b, c), func2(e, f, g))

How do I go about setting this up? I'd like to use separate cores so I can bypass the global interpreter lock. I've seen suggestions to use the multiprocessing package in Python; I just need to know how to go about doing this.
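One way to set this up is with `multiprocessing.Pool`: `apply_async` submits each function to a worker process and returns an `AsyncResult` whose `.get()` yields the function's return value. The sketch below assumes placeholder bodies for `func1`/`func2` (the question doesn't show the real computations), and `parallel_wrapper` is an illustrative name.

```python
from multiprocessing import Pool

def func1(a, b, c):
    # Placeholder for the first complex computation
    return a + b + c

def func2(e, f, g):
    # Placeholder for the second complex computation
    return e * f * g

def parallel_wrapper(args1, args2):
    # Each apply_async call runs in its own worker process,
    # so CPU-bound work bypasses the GIL.
    with Pool(processes=2) as pool:
        r1 = pool.apply_async(func1, args1)
        r2 = pool.apply_async(func2, args2)
        # .get() blocks until that worker finishes and
        # returns the function's return value.
        return r1.get(), r2.get()

if __name__ == '__main__':
    out1, out2 = parallel_wrapper((1, 2, 3), (4, 5, 6))
    print(out1, out2)  # 6 120
```

Note the `if __name__ == '__main__':` guard is required on platforms that spawn rather than fork (e.g. Windows), since each worker re-imports the main module.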

  • Possible duplicate of [Python: Executing multiple functions simultaneously](https://stackoverflow.com/questions/18864859/python-executing-multiple-functions-simultaneously) – zwer Jun 06 '17 at 17:29
  • I've tried that code actually, but I don't know how to pass arguments and get the return values back. – Stacy Garfield Jun 06 '17 at 17:34
  • You can use the `args` argument of `Process` to pass arguments to your function, and you can pass `multiprocessing.Queue` or `multiprocessing.Manager` alongside so your functions can write to them instead of returning a value. There are examples of all that available in the [official documentation](https://docs.python.org/2/library/multiprocessing.html). – zwer Jun 06 '17 at 17:41
  • Thanks zwer, I'll try to figure it out from here. – Stacy Garfield Jun 06 '17 at 17:47
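The approach zwer describes can be sketched as follows: each function receives a `multiprocessing.Queue` through the `args` argument of `Process` and puts its result on the queue instead of returning it. The function bodies here are placeholders, and tagging each result with the function's name is one way to tell the two results apart since arrival order is not guaranteed.

```python
from multiprocessing import Process, Queue

def func1(a, b, c, q):
    # Placeholder computation; the result goes on the queue
    # instead of being returned.
    q.put(('func1', a + b + c))

def func2(e, f, g, q):
    # Placeholder computation
    q.put(('func2', e * f * g))

if __name__ == '__main__':
    q = Queue()
    p1 = Process(target=func1, args=(1, 2, 3, q))
    p2 = Process(target=func2, args=(4, 5, 6, q))
    p1.start()
    p2.start()
    # Drain the queue before joining; results may arrive in
    # either order, so the name tag identifies each one.
    results = dict(q.get() for _ in range(2))
    p1.join()
    p2.join()
    print(results['func1'], results['func2'])  # 6 120
```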

0 Answers