I want to start a subprocess in Python and output the subprocess's stdout and stderr in real time. With two or more pipes this can get difficult due to the blocking nature of read(),
as discussed here.
As shown in this answer, it is possible, though:
import select
import subprocess

process = subprocess.Popen(["find", "/usr"],
                           stdout=subprocess.PIPE,
                           stderr=subprocess.PIPE)
fds = [process.stdout.fileno(),
       process.stderr.fileno()]
while True:
    ret = select.select(fds, [], [])
    for fd in ret[0]:
        if fd == process.stdout.fileno():
            print(process.stdout.readline())
        if fd == process.stderr.fileno():
            print(process.stderr.readline())
And this is nearly what I'm looking for, if only select()
weren't so damn slow. Compared to reading without select() (after process termination), the output is slowed down by a factor of 50 or more (tested on find
).
If I try to use select() but always read the whole buffer, e.g. by using readlines(),
I run into the first problem again: reads block, so reading one pipe keeps me from reading the other.
I could start a reader thread for each pipe, but this would be ugly and complicated, since I'd have to make sure the threads terminate correctly.
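For completeness, the thread-per-pipe version I have in mind looks roughly like this. It's only a sketch: the reader function, the queue, and the None sentinel used to signal EOF are my own choices, and I use a small Python child here as a stand-in for find:

```python
import queue
import subprocess
import sys
import threading

def reader(pipe, q):
    # Forward each line from one pipe into a shared queue;
    # a None sentinel marks EOF for that pipe.
    for line in iter(pipe.readline, b""):
        q.put(line)
    pipe.close()
    q.put(None)

# A child that writes to both streams (stand-in for `find`).
process = subprocess.Popen(
    [sys.executable, "-c",
     "import sys; print('to stdout'); print('to stderr', file=sys.stderr)"],
    stdout=subprocess.PIPE, stderr=subprocess.PIPE)

q = queue.Queue()
threads = [threading.Thread(target=reader, args=(pipe, q))
           for pipe in (process.stdout, process.stderr)]
for t in threads:
    t.start()

lines = []
eofs = 0
while eofs < len(threads):   # drain until both readers signaled EOF
    item = q.get()
    if item is None:
        eofs += 1
    else:
        lines.append(item)
        print(item)

for t in threads:
    t.join()
process.wait()
```

It works, but it's exactly the bookkeeping (two threads, a queue, sentinels, joins) that I'd like to avoid.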
So my question is: do you know an elegant way to read from more than one pipe in the same thread? Maybe there is something like read_available_lines()
that checks how much is buffered, or something similar.
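To illustrate the kind of helper I imagine (read_available is a name I made up; this assumes POSIX, where select() accepts pipe file descriptors): select() with os.read() instead of readline(), since os.read() returns at most one buffer's worth of whatever is currently available and never waits for a newline:

```python
import os
import select
import subprocess
import sys

def read_available(fds, timeout=None, bufsize=65536):
    # Hypothetical helper: wait until at least one fd is readable,
    # then return {fd: bytes} for each ready fd. os.read() grabs
    # only what is buffered and does not block waiting for a line.
    ready, _, _ = select.select(fds, [], [], timeout)
    return {fd: os.read(fd, bufsize) for fd in ready}

# A child that writes to both streams (stand-in for `find`).
process = subprocess.Popen(
    [sys.executable, "-c",
     "import sys; sys.stdout.write('out data'); sys.stderr.write('err data')"],
    stdout=subprocess.PIPE, stderr=subprocess.PIPE)

fds = [process.stdout.fileno(), process.stderr.fileno()]
collected = {fd: b"" for fd in fds}
open_fds = set(fds)
while open_fds:
    for fd, chunk in read_available(list(open_fds)).items():
        if chunk:
            collected[fd] += chunk
        else:                  # an empty read means this pipe hit EOF
            open_fds.discard(fd)
process.wait()
```

The catch is that this yields raw chunks, not lines, so I'd still have to split and re-buffer partial lines myself, which is why I'm asking whether something readier-made exists.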