I want to spawn multiple subprocesses and run them in parallel. I have a function which looks mostly like this:
import shlex
import subprocess
import sys

def stream_command(command):
    proc = subprocess.Popen(shlex.split(command), stdout=subprocess.PIPE,
                            stderr=subprocess.PIPE)
    while proc.poll() is None:
        line = proc.stdout.readline()
        sys.stdout.write('[%s]: %s' % (command, line))
    return proc.poll()
I can then run multiple commands in parallel (roughly) with this:
from threading import Thread

def stream_commands(commands):
    threads = []
    for command in commands:
        # Pass `command` as an argument; a bare lambda would close over the
        # loop variable and every thread could end up running the last command.
        thread = Thread(target=stream_command, args=(command,))
        thread.start()
        threads.append(thread)
    # Busy-wait until every worker thread has finished.
    while True:
        if any(t.is_alive() for t in threads):
            continue
        else:
            break
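(For concreteness, I invoke it with something like the following; the commands themselves are just placeholder examples.)

stream_commands([
    'ping -c 4 example.com',
    'ls -la /tmp',
])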
The issue, however, is that in my stream_command function I am blocking on a call to proc.stdout.readline(). That means a couple of things: first, if the process never writes to stdout, that call will hang forever (even if the subprocess terminates, for example). Second, I can't respond separately to the process's stdout and stderr (I would have to do a blocking read on one and then the other, which would be very unlikely to work). What I would like to do is something akin to what I would write in node.js:
def stream_command(command):
    def on_stdout(line):
        sys.stdout.write('[%s]: %s' % (command, line))

    def on_stderr(line):
        sys.stdout.write('[%s (STDERR)]: %s' % (command, line))

    proc = asyncprocess.Popen(shlex.split(command),
                              on_stdout=on_stdout,
                              on_stderr=on_stderr)
    return proc.wait()
Where, of course, asyncprocess is some fictitious process module that lets me start subprocesses and pass handler functions for stdout and stderr.

So, is there anything akin to the asyncprocess module I have above, or, failing that, is there any simple way to respond asynchronously to the events of a subprocess in Python?
By the way, I should note that I'm using Python 2.7. There seems to be some support for this in Python 3 via the asyncio library, but unfortunately that doesn't work here, AFAIK.
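For what it's worth, the closest pure-2.7 workaround I can picture is spawning one reader thread per pipe and calling the handlers from those threads, roughly like the sketch below (pump is just a made-up helper name, and only the standard library is used). It works, but it feels like a manual reimplementation of the event-driven API I'm hoping already exists somewhere.

import shlex
import subprocess
import sys
from threading import Thread

def stream_command(command):
    def on_stdout(line):
        sys.stdout.write('[%s]: %s' % (command, line))

    def on_stderr(line):
        sys.stdout.write('[%s (STDERR)]: %s' % (command, line))

    def pump(pipe, handler):
        # readline() returns '' only once the pipe is closed (process exit),
        # so this loop ends on its own; reading each pipe in its own thread
        # avoids blocking one stream while waiting on the other.
        for line in iter(pipe.readline, ''):
            handler(line)
        pipe.close()

    proc = subprocess.Popen(shlex.split(command),
                            stdout=subprocess.PIPE,
                            stderr=subprocess.PIPE)
    readers = [Thread(target=pump, args=(proc.stdout, on_stdout)),
               Thread(target=pump, args=(proc.stderr, on_stderr))]
    for t in readers:
        t.start()
    for t in readers:
        t.join()
    return proc.wait()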