
I have a function that I run with multiprocessing, which takes a lot of input arguments, all but one of which are the same across the processes. Ideally, my code would look something like this:

import multiprocessing

def Function(A, B, C, D):
  # do something with A, B, C and D
  return Result

if __name__ == '__main__':
  Workers = multiprocessing.cpu_count()
  pool = multiprocessing.Pool(processes=Workers)
  Output = pool.map(Function, A, B, C, D)  # not a valid pool.map call; shown to illustrate the intent

Only A changes from process to process; B, C and D are global variables of different sizes and shapes. I currently have to evaluate these variables within the function:

import multiprocessing

def Function(A):
  # B, C and D are rebuilt on every call, which is wasted work
  B = Code...
  C = Code...
  D = Code...
  # do something with A, B, C and D
  return Result

if __name__ == '__main__':
  Workers = multiprocessing.cpu_count()
  pool = multiprocessing.Pool(processes=Workers)
  Output = pool.map(Function, A)

This means that my code could in theory be a lot faster if I could evaluate these variables once, globally, and then pass them to pool.map(). Is there any way to do this?

  • Use starmap or functools.partial. https://stackoverflow.com/questions/5442910/python-multiprocessing-pool-map-for-multiple-arguments – Eric Truett May 10 '20 at 17:48
  • so why not "evaluate the variables globally"? – shx2 May 10 '20 at 17:50
  • How would I pass the variables to this? A is, say, an array of numbers, of which each process will use a different entry, while B, C and D are single numbers. @EricTruett – Randomgenerator May 10 '20 at 17:52
  • Randomgenerator: Use [`functools.partial`](https://docs.python.org/3/library/functools.html#functools.partial) to create a function with the arguments and pass that to `pool.map()` as the `func` argument. – martineau May 10 '20 at 18:06
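To make the comments above concrete, here is a minimal sketch of both suggestions, assuming B, C and D can be computed once in the parent process. The values and the placeholder body are illustrative, not from the question:

import multiprocessing
from functools import partial
from itertools import repeat

def Function(A, B, C, D):
  # placeholder body: do something with A, B, C and D
  return A * B + C + D

if __name__ == '__main__':
  A_values = [1, 2, 3, 4]  # per-process inputs (illustrative)
  B, C, D = 10, 20, 30     # shared inputs, evaluated once in the parent

  Workers = multiprocessing.cpu_count()
  with multiprocessing.Pool(processes=Workers) as pool:
    # partial freezes B, C and D, leaving a one-argument callable
    # that pool.map can apply to each element of A_values
    Output = pool.map(partial(Function, B=B, C=C, D=D), A_values)

    # equivalently, starmap unpacks argument tuples; repeat() supplies
    # the same B, C and D alongside each A (zip stops at the shortest iterable)
    Output2 = pool.starmap(Function, zip(A_values, repeat(B), repeat(C), repeat(D)))

  print(Output, Output2)

Either way, the shared values are computed once and shipped to the workers, instead of being rebuilt inside every call to Function.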

0 Answers