
I'm working on a shared sandbox server running Debian 3.16, and I don't want to use all the CPU. So I've tried to limit the number of processes by setting `processes=10`. But it doesn't work, and the script still uses all of the available CPU.

import functools
import multiprocessing

# function, kwargs and product_set_list are defined earlier in the script
partial_run = functools.partial(function, **kwargs)

pool = multiprocessing.Pool(processes=10, maxtasksperchild=1)
res = pool.map(partial_run, product_set_list)
pool.close()
pool.join()

I think it's because the worker function uses numpy, whose BLAS backend runs its own threads outside the GIL, so each of the 10 worker processes spawns additional threads. How can I limit the CPU usage?

  • `import mkl; mkl.set_num_threads(1)` might force numpy processes to stick to one thread only. – Hielke Walinga Jul 27 '18 at 10:11
  • @HielkeWalinga Does this have to occur before or after `import numpy` or does it not matter? – FlyingTeller Jul 27 '18 at 10:12
  • Good question. This answer provides more details and probably has a more reliable method: https://stackoverflow.com/a/48665619/8477066 If using that method it most likely does not matter. – Hielke Walinga Jul 27 '18 at 10:15
  • My version of numpy doesn't use MKL but OpenBLAS. I limited the number of CPUs used by setting the environment variable OPENBLAS_NUM_THREADS to 1 – yellow_leprechaun Jul 27 '18 at 12:07
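
Putting the comments together, here is a minimal sketch of the OpenBLAS route that worked: set the BLAS thread-count environment variables before numpy is imported, so each of the 10 pool workers stays single-threaded. The worker `function`, `kwargs` and `product_set_list` below are hypothetical placeholders for the real ones from the question.

import os

# These must be set before numpy (and its BLAS backend) is first imported,
# otherwise the BLAS thread pool is already sized to all cores.
os.environ["OPENBLAS_NUM_THREADS"] = "1"  # OpenBLAS builds of numpy
os.environ["MKL_NUM_THREADS"] = "1"       # MKL builds of numpy
os.environ["OMP_NUM_THREADS"] = "1"       # OpenMP-based backends

import functools
import multiprocessing

import numpy as np

def function(x, scale=1.0):
    # Hypothetical numpy-heavy worker standing in for the real one.
    return np.linalg.norm(scale * np.asarray(x))

if __name__ == "__main__":
    kwargs = {"scale": 2.0}                    # placeholder keyword arguments
    product_set_list = [[1, 2, 3], [4, 5, 6]]  # placeholder inputs
    partial_run = functools.partial(function, **kwargs)

    # Each worker now runs single-threaded BLAS, so total CPU use
    # stays close to the 10 processes requested here.
    pool = multiprocessing.Pool(processes=10, maxtasksperchild=1)
    res = pool.map(partial_run, product_set_list)
    pool.close()
    pool.join()

The `mkl.set_num_threads(1)` call from the first comment is the MKL-specific equivalent; the environment-variable route covers both backends as long as it runs before the first numpy import.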

0 Answers