I'm using Python to run some test cases for a C++ app I'm working on. When I run the app in a terminal I see regular output right up until an `assert()`. With the following Python code wrapping the call, I miss a pile of stdout before the assert (yes, `flush` is not called, but why the difference between Python and bash?).
    import subprocess as sp

    cmd = ["myapp"]
    proc = sp.Popen(cmd, stdout=sp.PIPE)
    # runs in a thread while the main thread calls proc.wait(timeout)
    for line in iter(proc.stdout.readline, b''):
        print(line.decode('ascii'))
The only fix I've found is to prepend `cmd = ['stdbuf', '-o0'] + cmd` (catching stdout...). Is there a Python way to do whatever magic `stdbuf` does?
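For reference, the stdbuf workaround wrapped up as a helper might look like this (a sketch: `stdbuf` requires GNU coreutils, and `myapp` stands in for the actual binary):

```python
import subprocess as sp

def run_unbuffered(cmd):
    """Yield decoded stdout lines from cmd as they arrive, with the
    child's stdout forced to unbuffered via stdbuf (GNU coreutils)."""
    proc = sp.Popen(["stdbuf", "-o0"] + cmd, stdout=sp.PIPE)
    try:
        for line in iter(proc.stdout.readline, b""):
            yield line.decode("ascii")
    finally:
        proc.stdout.close()
        proc.wait()

# Usage with the hypothetical app under test:
# for line in run_unbuffered(["myapp"]):
#     print(line, end="")
```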
- I'm already using `readline` (catching stdout..., read subprocess...).
- Using `bufsize=0` or `1` doesn't help (catching stdout...).
- I found it interesting that you can re-open a file descriptor without buffering, `unbuffered = os.fdopen(proc.stdout.fileno(), 'rb', 0)`, but this didn't help either (unbuffered stdout in python).
- Much of the other buffering advice I found was about Python's own output (e.g. disable output buffering), not about reading stdout from a subprocess.
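The re-opened descriptor attempt can be shown end-to-end; it only changes buffering on the reading side, which would explain why it made no difference (a sketch: `printf` stands in for the crashing app, and `closefd=False` is added so the descriptor isn't closed twice):

```python
import os
import subprocess as sp

# Stand-in for the crashing C++ app: printf just emits two lines.
proc = sp.Popen(["printf", "one\ntwo\n"], stdout=sp.PIPE)

# Re-open the pipe's fd with buffering=0. This only changes how Python
# *reads* from the pipe; the child process still decides how its own
# stdout is buffered, which is presumably why it didn't help.
# closefd=False avoids closing the fd twice (proc.stdout also owns it).
unbuffered = os.fdopen(proc.stdout.fileno(), "rb", 0, closefd=False)
lines = [raw.decode("ascii") for raw in iter(unbuffered.readline, b"")]
proc.wait()
print("".join(lines), end="")
```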
I noted that `os.fdopen(proc.stdout.fileno(), 'r', 0)` (without the `b`) produced `ValueError: can't have unbuffered text I/O`, which makes me wonder whether `universal_newlines=True`/`text=True` implicitly enables buffering, and maybe even whether `readline()` is forced into it. But I'd have hoped that would happen within Python, not in the pipe that gets destroyed when the subprocess crashes.
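That error is easy to reproduce on any file descriptor, without a subprocess at all (a sketch using `os.pipe()`): unbuffered I/O is only permitted in binary mode, so text mode with `buffering=0` is rejected up front.

```python
import os

# Reproduce the error from the question: buffering=0 is only allowed
# for binary ('rb') streams, so text mode 'r' raises ValueError.
r, w = os.pipe()
try:
    os.fdopen(r, "r", 0)
except ValueError as exc:
    err = str(exc)
os.close(w)
print(err)
```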