
I understand that using subprocess is the preferred way to invoke external commands.

But what if I want to run multiple commands in parallel while limiting the number of processes spawned?

import os
import subprocess

# MAYABATCHPATH, output_lineEdit and melFile are defined elsewhere in my tool
for i in os.listdir(output_lineEdit):  # loop over every file in the folder
    if i.split(".")[1] == "ma":  # keep only the .ma files
        mapath = i
        cmd = '"{mayaBatchPath}" -batch -file "{maPath}" -script "{melFile}" "{plugins}"'.format(
                mayaBatchPath=MAYABATCHPATH,
                melFile=melFile,  # .mel script to run
                maPath=output_lineEdit + "\\" + mapath,  # path to the source Maya file
                plugins="-noAutoloadPlugins",
        )
        # print(output_lineEdit + "\\" + mapath)
        p = subprocess.Popen(cmd, shell=True)

If I run this directly, it launches 18 mayabatch.exe processes at the same time, because there are 18 files in my folder. What I want is to control how many run concurrently: for example, run only two at a time, and start the next one whenever one finishes.
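
In other words, the behaviour I want is roughly this naive batching (a sketch only, where cmds stands in for the command strings built above):

# launch two, wait for both to finish, then launch the next two
for j in range(0, len(cmds), 2):
    procs = [subprocess.Popen(c, shell=True) for c in cmds[j:j + 2]]
    for p in procs:
        p.wait()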

xh c
  • You can create a multiprocessing Pool with a finite size. Use subprocess.run for synchronous processing. See: https://docs.python.org/3/library/multiprocessing.html – DarkKnight Sep 08 '22 at 08:32
  • Thanks, but I don't know how to use it. I don't want to set a fixed size, because ".ma" or ".mb" files launched with mayabatch.exe need different amounts of memory, so I just need to limit the subprocesses. – xh c Sep 08 '22 at 08:45
  • My reference to a "finite size" was to the subprocess pool size and not anything to do with memory size – DarkKnight Sep 08 '22 at 08:55
  • the "finite size" is limit number of subprocess? – xh c Sep 08 '22 at 09:24
  • Yes - that's what I mean. Take a look at my answer – DarkKnight Sep 08 '22 at 09:29
  • Thank you for your help. I have a new question: how do I modify my code? I tried replacing it with: – xh c Sep 08 '22 at 10:19
  • ```lang-py
    def run(v):
        for i in os.listdir(output_lineEdit):  # use "for" to list all files
            if i.split(".")[1] == "ma":  # use "if" to pick the ".ma" files
                mapath = i
                cmd = '"{mayaBatchPath}" -batch -file "{maPath}" -script "{melFile}" "{plugins}"'.format(
                        mayaBatchPath=MAYABATCHPATH,  # mayabatch.exe path
                        melFile=melFile,  # .mel files
                        maPath=output_lineEdit+"\\"+mapath,  # Maya source files
                        plugins="-noAutoloadPlugins",
                )
                subprocess.run(cmd, shell=True)
    ``` – xh c Sep 08 '22 at 10:20
  • This is indecipherable. I suggest you write a new question – DarkKnight Sep 08 '22 at 10:28
  • I wrote a new question, can you help me? https://stackoverflow.com/questions/73647739/python-problems-with-using-multiple-processes-to-process-files – xh c Sep 08 '22 at 10:43
  • Related: [How to limit the number of concurrent processes using subprocess module in asyncio python](https://stackoverflow.com/q/46031558/8746648) (a sketch of that asyncio approach follows these comments) – asynts Sep 08 '22 at 10:57
  • Does this answer your question? [Restrict number of subprocess.Popen](https://stackoverflow.com/questions/60283704/restrict-number-of-subprocess-popen) – asynts Sep 08 '22 at 10:58
  • Thanks everyone, I solved the problem. Special thanks to Vlad. – xh c Sep 08 '22 at 11:03
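
For reference, the asyncio question linked in the comments takes a different route: an asyncio.Semaphore caps how many subprocesses may run at once. A minimal sketch of that approach, assuming the commands have already been collected into a list (the example commands and the limit of 2 are placeholders, not from the original code):

import asyncio

async def run_limited(cmd, sem):
    # the semaphore lets at most `limit` coroutines past this point at once
    async with sem:
        proc = await asyncio.create_subprocess_shell(cmd)
        await proc.wait()

async def main(cmds, limit=2):
    sem = asyncio.Semaphore(limit)
    # every command is scheduled immediately, but the semaphore throttles execution
    await asyncio.gather(*(run_limited(c, sem) for c in cmds))

if __name__ == '__main__':
    asyncio.run(main(['echo one', 'echo two', 'echo three']))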

1 Answer


By using subprocess.run and a multiprocessing Pool you can easily manage the number of concurrent subprocesses.

For example:

import subprocess
from multiprocessing import Pool

def run(v):
    print(v)
    # the worker blocks here until its subprocess finishes,
    # so each pool slot holds exactly one live subprocess
    subprocess.run('echo Hello; sleep 2; echo Done', shell=True)

def main():
    # a pool of 5 workers means at most 5 subprocesses at once
    with Pool(5) as pool:
        pool.map(run, range(10))

if __name__ == '__main__':
    main()

In this trivial example we call the subprocess 10 times, but the pool size is only 5, so there will never be more than 5 concurrent subprocesses.

Unless there is a reason why multithreading would not be appropriate, you can get the same effect more cheaply with a thread pool:

from concurrent.futures import ThreadPoolExecutor
import subprocess

def run(v):
    print(v)
    subprocess.run('echo Hello; sleep 2; echo Done', shell=True)

# at most 5 worker threads, hence at most 5 concurrent subprocesses
with ThreadPoolExecutor(5) as tpe:
    tpe.map(run, range(10))
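
Applied to the question's loop, the same idea looks roughly like this (a sketch, assuming MAYABATCHPATH, output_lineEdit and melFile are defined as in the question; the pool size of 2 matches the "two at a time" requirement):

import os
import subprocess
from concurrent.futures import ThreadPoolExecutor

def run(mapath):
    cmd = '"{mayaBatchPath}" -batch -file "{maPath}" -script "{melFile}" "{plugins}"'.format(
            mayaBatchPath=MAYABATCHPATH,
            melFile=melFile,
            maPath=output_lineEdit + "\\" + mapath,
            plugins="-noAutoloadPlugins",
    )
    # subprocess.run blocks, so the worker thread stays busy until mayabatch exits
    subprocess.run(cmd, shell=True)

ma_files = [f for f in os.listdir(output_lineEdit) if f.split(".")[1] == "ma"]
with ThreadPoolExecutor(2) as tpe:  # at most two mayabatch.exe at a time
    tpe.map(run, ma_files)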
DarkKnight
  • I think a multiprocessing pool is a bit overkill, as you will be spawning 5 extra Python interpreters; you can use a ThreadPool for the exact same effect without using up tens of MBs of RAM for the extra Python processes. – Ahmed AEK Sep 08 '22 at 08:48
  • @AhmedAEK I'm lucky enough to use a "proper" operating system on a high-capacity computer and tend to forget about people who try to run complex, high-intensity code on their laptops. The general principle of my answer still applies. – DarkKnight Sep 08 '22 at 08:54