
I need to run the same script with many different sets of hyperparameters. I have far more sets than CPUs, so I want to do it in parallel, but I want it to schedule one run per CPU. Is there a simple way to do this in Python?

Sam Lerman
  • can you please provide more information about your problem (code) and what you've tried? – jkr Jan 12 '21 at 23:30
  • Sure, I rephrased the question here: https://stackoverflow.com/questions/65694724/how-to-sweep-many-hyperparameter-sets-in-parallel-in-python – Sam Lerman Jan 13 '21 at 02:14

1 Answer


Yes. You can use Python's multiprocessing module; in particular, multiprocessing.Pool, which by default starts one worker process per CPU.

Assuming the work you want to do is encapsulated in a function called my_function and the parameters are in my_iterable, you can do something like:

from multiprocessing import Pool

if __name__ == "__main__":  # guard required on platforms that use the "spawn" start method
    with Pool() as processing_pool:  # Pool() defaults to os.cpu_count() workers
        processing_pool.map(my_function, my_iterable)
Jabrove