I need to run the same script with many different sets of hyperparameters. I have far more sets than CPUs, so I want to do it in parallel, but I want it to schedule one run per CPU. Is there a simple way to do this in Python?
- Can you please provide more information about your problem (code) and what you've tried? – jkr Jan 12 '21 at 23:30
- Sure, I rephrased the question here: https://stackoverflow.com/questions/65694724/how-to-sweep-many-hyperparameter-sets-in-parallel-in-python – Sam Lerman Jan 13 '21 at 02:14
1 Answer
Yes. You can use Python's multiprocessing module; in particular, you can use multiprocessing.Pool. Assuming the work you want to do is encapsulated in a function called my_function and the parameter sets are in my_iterable, you can do something like:
from multiprocessing import Pool

# Pool() starts one worker process per CPU by default (os.cpu_count()).
with Pool() as processing_pool:
    processing_pool.map(my_function, my_iterable)
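
Since Pool() defaults to one worker per CPU, this already gives the one-run-per-CPU scheduling the question asks for. A minimal runnable sketch of the same idea, where train is a toy stand-in for the real work and the explicit processes argument just spells out the default:

import os
from multiprocessing import Pool

def train(params):
    # Toy stand-in for one training run; params is one hyperparameter set.
    return params["lr"] * 2

if __name__ == "__main__":  # guard needed where workers are spawned (e.g. Windows, macOS)
    grid = [{"lr": lr} for lr in (0.1, 0.01, 0.001)]
    # processes=os.cpu_count() matches the default worker count.
    with Pool(processes=os.cpu_count()) as pool:
        results = pool.map(train, grid)
    print(results)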

Jabrove
- But I need to run it 1000 times, not 40. How would I do that without overburdening the CPUs? (See the queuing sketch after these comments.) – Sam Lerman Jan 13 '21 at 01:23
- Also, my file isn't a function; it's a Python file with parsed arguments. Can I pass a file into multiprocessing? (See the subprocess sketch after these comments.) – Sam Lerman Jan 13 '21 at 01:27
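
On the first comment: Pool handles this automatically. It keeps a fixed number of worker processes and feeds queued tasks to them as workers finish, so mapping over 1000 parameter sets never runs more than the worker count at once. A minimal sketch, assuming a hypothetical 40-CPU machine:

from multiprocessing import Pool

def work(i):
    # Stand-in for one hyperparameter run.
    return i * i

if __name__ == "__main__":
    # 40 workers (hypothetical CPU count); the remaining 960 tasks wait in the queue.
    with Pool(processes=40) as pool:
        results = pool.map(work, range(1000))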
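
On the second comment: you can't pass a file to Pool directly, but a common workaround (not from the answer above) is to wrap the script invocation in a small function that launches it with subprocess. A sketch under that assumption, where sweep.py and its --lr flag are hypothetical placeholders for your actual script and arguments:

import subprocess
from multiprocessing import Pool

def run_script(params):
    # sweep.py and --lr are placeholders; substitute your script and its flags.
    subprocess.run(
        ["python", "sweep.py", "--lr", str(params["lr"])],
        check=True,  # raise CalledProcessError if the run fails
    )

if __name__ == "__main__":
    grid = [{"lr": lr} for lr in (0.1, 0.01, 0.001)]
    with Pool() as pool:  # one worker per CPU by default
        pool.map(run_script, grid)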