
I'm trying to find a simple example that clearly shows a single task being divided for multi-processing.

Frankly, many of the examples I've found are overly sophisticated, which makes the flow harder to follow and experiment with.

Would anyone care to share a simple, minimal example?

Vinay Sheshadri

1 Answer


Your basic example is this:

>>> import multiprocessing as mp
>>> from math import sqrt
>>> worker_pool = mp.Pool()
>>> jobs = [0, 1, 4, 9, 16, 25] 
>>>
>>> # calculate jobs in blocking batch parallel
>>> results = worker_pool.map(sqrt, jobs)
>>> results
[0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
>>>
>>> # calculate jobs in asynchronous parallel
>>> results = worker_pool.map_async(sqrt, jobs)
>>> results.get()
[0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
>>>
>>> # calculate jobs in parallel with an unordered iterator
>>> results = worker_pool.imap_unordered(sqrt, jobs)
>>> list(results)  # NOTE: results may return out of order
[0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
>>>
>>> # a single blocking job on another process
>>> worker_pool.apply(sqrt, [9])
3.0
>>> # a single asynchronous job on another process
>>> y = worker_pool.apply_async(sqrt, [9])
>>> y.get()
3.0
>>> # the same interface exists for threads
>>> import multiprocessing.dummy
>>> thread_pool = mp.dummy.Pool()
>>> thread_pool.map(sqrt, jobs)
[0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
>>>
>>> # finishing up, you should shut down your pools
>>> worker_pool.close()
>>> worker_pool.join()
>>> thread_pool.close()
>>> thread_pool.join()
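
If you move this out of the interpreter and into a script, the worker function has to be importable, and on platforms that spawn worker processes (e.g. Windows) the pool must be created under an if __name__ == '__main__' guard. Here is a minimal sketch, using a hypothetical square helper and the pool's context-manager form:

import multiprocessing as mp

def square(x):
    # a trivial, picklable worker function (hypothetical, for illustration)
    return x * x

if __name__ == '__main__':
    # the guard is required on platforms that spawn worker processes (e.g. Windows)
    with mp.Pool() as worker_pool:
        print(worker_pool.map(square, range(6)))  # [0, 1, 4, 9, 16, 25]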

Examples get more involved once you want something beyond simple batch parallelism; a sketch of one such pattern follows.
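
For instance, if the tasks are heterogeneous or you want to react to each result as it arrives, one common pattern is to submit jobs individually with apply_async and a callback. This is a rough sketch of that pattern, reusing the sqrt jobs from above; the structure here is an assumption, not part of the original answer:

import multiprocessing as mp
from math import sqrt

if __name__ == '__main__':
    jobs = [0, 1, 4, 9, 16, 25]
    results = []
    with mp.Pool() as worker_pool:
        # submit each job individually; the callback appends each result
        # in the parent process as it arrives (so order may differ)
        handles = [worker_pool.apply_async(sqrt, [x], callback=results.append)
                   for x in jobs]
        for handle in handles:
            handle.wait()  # block until every submitted job has finished
    print(results)  # the same values as the batch calls, possibly reordered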

Mike McKerns