
I need to run subprocess commands in a for loop in parallel, without one interrupting another. I have more than 100 shell commands; some finish quickly and some take a long time, and I don't want to wait for the long-running ones. Below is an example, where cmds is a list of commands:

import subprocess

for cmd in cmds:
    push = subprocess.Popen(cmd, shell=True, stdout=subprocess.PIPE)
    push.wait()                      # blocks until this command finishes
    print(push.communicate()[0])

2 Answers


Use a process pool, specify how many processes should run in parallel and let the pool handle the job scheduling:

import subprocess
from multiprocessing import Pool

def run_command(cmd):
    push = subprocess.Popen(cmd, shell=True, stdout=subprocess.PIPE)
    return push.communicate()[0]  # communicate() waits for the process and returns its output

pool = Pool(processes=8)
results = pool.map(run_command, cmds)

for result in results:
    print(result)
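
If you would rather see each command's output as soon as it finishes instead of waiting for the whole batch, a minimal variation is to use imap_unordered, which yields results in completion order. This sketch assumes the same run_command helper and cmds list as above:

import subprocess
from multiprocessing import Pool

# A minimal sketch, assuming run_command and cmds are defined as above.
# imap_unordered yields results as soon as each worker finishes, so short
# commands are printed without waiting for the long-running ones.
with Pool(processes=8) as pool:
    for output in pool.imap_unordered(run_command, cmds):
        print(output)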

With the following approach we can run multiple commands asynchronously:

import subprocess
import asyncio

def background(f):
    # Submit the decorated function to the default thread pool executor,
    # so calling it returns immediately instead of blocking.
    def wrapped(*args, **kwargs):
        return asyncio.get_event_loop().run_in_executor(None, f, *args, **kwargs)

    return wrapped

@background
def func1():
    cmd = 'echo "func1 1"; sleep 5; echo "func1 2"; sleep 5; echo "func1 3"'
    subprocess.call(cmd, shell=True)

@background
def func2():
    cmd = 'echo "func2 1"; sleep 5; echo "func2 2"; sleep 5; echo "func2 3"'
    subprocess.call(cmd, shell=True)


func1()
func2()

print('loop finished')  # printed immediately; func1 and func2 keep running in background threads
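
For the original list of commands, asyncio can also manage the subprocesses directly instead of going through a decorator and a thread pool. This is a minimal sketch using asyncio's own subprocess support (assuming cmds is the list from the question), not the author's decorator approach:

import asyncio

# A minimal sketch, assuming cmds is the list of shell commands from the question.
async def run(cmd):
    # Start the command without blocking, then wait for it and collect its output.
    proc = await asyncio.create_subprocess_shell(cmd, stdout=asyncio.subprocess.PIPE)
    stdout, _ = await proc.communicate()
    print(f'{cmd!r} finished:')
    print(stdout.decode())

async def main():
    # Launch all commands concurrently and wait for them to finish.
    await asyncio.gather(*(run(cmd) for cmd in cmds))

asyncio.run(main())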