
How can I run multiple processes asynchronously in a pool with Python's multiprocessing module, where run1-3 are processed in parallel? I am trying to pass the values (10, 2, 4), (55, 6, 8), (9, 8, 7) for run1, run2, run3 respectively.

import multiprocessing 
def Numbers(number,number2,divider):
   value = number * number2/divider
   return value
if __name__ == "__main__":

   with multiprocessing.Pool(3) as pool:               # 3 processes
        run1, run2, run3 = pool.map(Numbers, [(10,2,4),(55,6,8),(9,8,7)]) # map input & output
– tony selcuk

2 Answers


You just need to use the starmap method instead of map, which, according to the documentation:

Like map() except that the elements of the iterable are expected to be iterables that are unpacked as arguments.

Hence an iterable of [(1,2), (3, 4)] results in [func(1,2), func(3,4)].
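That unpacking behavior can be seen in isolation with a minimal example (using a throwaway add function, not part of the question, just to contrast the two methods):

```python
import multiprocessing

def add(a, b):
    return a + b

if __name__ == "__main__":
    with multiprocessing.Pool(2) as pool:
        # pool.map would call add((1, 2)) -- one tuple argument -- and raise a TypeError;
        # pool.starmap unpacks each tuple into separate arguments: add(1, 2), add(3, 4)
        print(pool.starmap(add, [(1, 2), (3, 4)]))  # [3, 7]
```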

import multiprocessing

def Numbers(number, number2, divider):
    return number * number2 / divider

if __name__ == "__main__":
    with multiprocessing.Pool(3) as pool:  # 3 processes
        # starmap unpacks each tuple into the three arguments of Numbers
        run1, run2, run3 = pool.starmap(Numbers, [(10, 2, 4), (55, 6, 8), (9, 8, 7)])
    print(run1, run2, run3)

Prints:

5.0 41.25 10.285714285714286

Note

This is the correct way to do what you want to do, but you will not find that using multiprocessing for such a trivial worker function improves performance. In fact, it will degrade performance, because of the overhead of creating the pool and of passing arguments and results from one address space to another.
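As a rough illustration of that overhead (exact timings are machine-dependent; this is only a sketch), a plain serial loop produces the same results, and the pooled version pays the full cost of starting the worker processes:

```python
import multiprocessing
import time

def Numbers(number, number2, divider):
    return number * number2 / divider

if __name__ == "__main__":
    args = [(10, 2, 4), (55, 6, 8), (9, 8, 7)]

    start = time.perf_counter()
    serial = [Numbers(*a) for a in args]  # plain loop, no processes
    serial_time = time.perf_counter() - start

    start = time.perf_counter()
    with multiprocessing.Pool(3) as pool:  # includes pool start-up cost
        parallel = pool.starmap(Numbers, args)
    pool_time = time.perf_counter() - start

    print(serial == parallel)       # both approaches return the same values
    print(serial_time, pool_time)   # pool start-up dominates for trivial work
```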

– Booboo
  • Thanks for the output. I have one more post; I would appreciate it if you could take a look at it: https://stackoverflow.com/questions/67015521/writing-data-to-multiple-csvs-with-multi-processing-pandas-python – tony selcuk Apr 09 '21 at 05:28

Python's multiprocessing library does, however, have a wrapper for piping data between a parent and child process: the Manager, which provides shared data structures such as a shared dictionary. There is a good Stack Overflow post about the topic.

Using multiprocessing, you can pass unique arguments and a shared dictionary to each process; you must ensure that each process writes to a different key in the dictionary.

An example of this in use given your example is as follows:

import multiprocessing


def worker(process_key, return_dict, compute_array):
    """Compute number * number2 / divider and store it under this process's key."""
    number, number2, divider = compute_array
    return_dict[process_key] = number * number2 / divider


if __name__ == "__main__":
    manager = multiprocessing.Manager()
    return_dict = manager.dict()  # dictionary proxied by the Manager process
    jobs = []
    compute_arrays = [[10, 2, 4], [55, 6, 8], [9, 8, 7]]
    for i, compute_array in enumerate(compute_arrays):
        p = multiprocessing.Process(target=worker,
                                    args=(i, return_dict, compute_array))
        jobs.append(p)
        p.start()

    for proc in jobs:
        proc.join()
    print(return_dict)

Edit: The information from Booboo is much more precise. I originally recommended threading, which I have removed, as it is certainly not the right utility here due to Python's GIL.

  • Yours is a solution in search of the right problem, but this isn't it. And when you have a worker function that is 100% CPU-bound as this one is, threading is *never* the correct approach unless the function is being implemented by, for example, a call to a C language function that releases the Global Interpreter Lock (GIL). There is a *much* simpler solution to the OP's dilemma. The OP's main issue is not in getting return values, for which a pool is the easiest approach and is already being used. The problem is one of how to pass arguments simply. (... more) – Booboo Apr 08 '21 at 14:00
  • The OP could have just manually unpacked them from the list without having to give up using a pool, changing the function signature of the worker function and then having to resort to using a managed dict to get the results as your solution imposes. But as I said, there is even a simpler solution than manually unpacking the arguments from the list. – Booboo Apr 08 '21 at 14:08
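The "manually unpacking" alternative Booboo describes in these comments can be sketched as follows (this restructuring is my illustration, not code from either answer): keep pool.map, but have the worker accept one tuple and unpack it itself.

```python
import multiprocessing

def Numbers(args):
    number, number2, divider = args  # unpack the single tuple argument ourselves
    return number * number2 / divider

if __name__ == "__main__":
    with multiprocessing.Pool(3) as pool:
        # map passes each tuple as one argument, so no starmap is needed here
        run1, run2, run3 = pool.map(Numbers, [(10, 2, 4), (55, 6, 8), (9, 8, 7)])
    print(run1, run2, run3)  # 5.0 41.25 10.285714285714286
```

This avoids the managed dict entirely; as the comments note, starmap is simpler still because it leaves the worker's signature unchanged.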