
I need to call an API around 1,000 times to run a specific function. The problem is that each API call takes around 1 second, so running them synchronously is a non-starter. Running them asynchronously works, but my Mac is capped at running 100 processes via `multiprocessing.Pool(100)`.

from multiprocessing import Pool

pool = Pool()  # defaults to the number of CPUs on the machine
results = pool.map(multi_run_wrapper, list_args)

So it's nothing complicated by any means, but this falls apart when len(list_args) > 100.

Does anyone have a solution? I need something that works asynchronously, runs in parallel, and is time-efficient.

  • Don't start more processes than you have theoretical cores. I doubt your Mac has 100 theoretical cores. Set your pool size as `multiprocessing.cpu_count()-1` – roganjosh Sep 05 '17 at 22:10
  • It sounds like you probably want multi-threads, not multi-processes. More discussion [here](https://stackoverflow.com/questions/3044580/multiprocessing-vs-threading-python). – Josh Karpel Sep 05 '17 at 22:11
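The thread-based approach suggested in the comments could look something like the sketch below. Threads suit I/O-bound work such as API calls because workers spend most of their time waiting on the network. Here `fetch` is a hypothetical stand-in for the real API wrapper, and `list_args` is dummy data:

```python
from concurrent.futures import ThreadPoolExecutor

def fetch(arg):
    # Hypothetical stand-in for the real ~1-second API call;
    # replace the body with the actual request logic.
    return arg * 2

list_args = list(range(1000))

# 100 threads are cheap for I/O-bound work: while one thread waits on
# the network, others proceed, so 1000 calls finish in roughly
# 1000 / 100 = 10 batches of ~1 second each.
with ThreadPoolExecutor(max_workers=100) as executor:
    # executor.map preserves the order of list_args in the results.
    results = list(executor.map(fetch, list_args))
```

Unlike a process pool, this avoids the per-process startup and pickling overhead, and a worker count well above the core count is fine because the threads are blocked on I/O rather than competing for CPU.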

0 Answers