178

I am trying to write a wrapper script for a command-line program (svnadmin verify) that will display a nice progress indicator for the operation. This requires me to see each line of output from the wrapped program as soon as it is produced.

I figured that I'd just execute the program using subprocess.Popen, use stdout=PIPE, then read each line as it came in and act on it accordingly. However, when I ran the following code, the output appeared to be buffered somewhere, causing it to arrive in two chunks: lines 1 through 332, then lines 333 through 439 (the last line of output):

from subprocess import Popen, PIPE, STDOUT

p = Popen('svnadmin verify /var/svn/repos/config', stdout = PIPE, 
        stderr = STDOUT, shell = True)
for line in p.stdout:
    print line.replace('\n', '')

After looking at the documentation on subprocess a little, I discovered the bufsize parameter to Popen, so I tried setting bufsize to 1 (line buffered) and 0 (unbuffered), but neither value seemed to change the way the lines were being delivered.

At this point I was grasping at straws, so I wrote the following output loop:

while True:
    try:
        print p.stdout.next().replace('\n', '')
    except StopIteration:
        break

but got the same result.

Is it possible to get 'real-time' output from a program executed using subprocess? Is there some other option in Python that is forward-compatible (not exec*)?

Mad Physicist
Chris Lieb
  • Have you tried omitting the `stdout=PIPE` so the subprocess writes directly to your console, bypassing the parent process? – S.Lott Apr 29 '09 at 17:01
  • The thing is that I want to read the output. If it is output directly to the console, how could I do that? Also, I don't want the user to see the output from the wrapped program, just my output. – Chris Lieb Apr 29 '09 at 17:07
  • Then why a "real-time" display? I don't get the use case. – S.Lott Apr 29 '09 at 17:17
  • Don't use shell=True. It needlessly invokes your shell. Use p = Popen(['svnadmin', 'verify', '/var/svn/repos/config'], stdout=PIPE, stderr=STDOUT) instead – nosklo Apr 30 '09 at 12:19
  • @S.Lott Basically, svnadmin verify prints a line of output for every revision that is verified. I wanted to make a nice progress indicator that wouldn't cause excessive amounts of output. Kind of like wget, for example – Chris Lieb May 01 '09 at 00:26
  • @nosklo I tried omitting shell=True when I was working with this, but it would never execute without it. I even used the full path to svnadmin in case PATH wasn't set when I used shell=False, but that didn't fix it either. – Chris Lieb May 01 '09 at 00:27
  • You need to split the command yourself in order to be able to omit the `shell=True` but that's a trivial modification. @nosklo's comment shows you how; pass the first argument as a list of tokens, not a single string. – tripleee Aug 15 '16 at 11:55
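As the last comment describes, `shlex.split` can do the tokenizing for you (the `svnadmin` command here is just the question's example, so the `Popen` call is left commented out):

```python
import shlex
from subprocess import Popen, PIPE, STDOUT

# Tokenize the command string so shell=True (and the extra shell process)
# is unnecessary
args = shlex.split('svnadmin verify /var/svn/repos/config')
print(args)  # ['svnadmin', 'verify', '/var/svn/repos/config']

# p = Popen(args, stdout=PIPE, stderr=STDOUT)  # no shell involved
```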

24 Answers

92

I tried this, and for some reason while the code

for line in p.stdout:
  ...

buffers aggressively, the variant

while True:
  line = p.stdout.readline()
  if not line: break
  ...

does not. Apparently this is a known bug: http://bugs.python.org/issue3907 (The issue is now "Closed" as of Aug 29, 2018)
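For reference, a complete runnable version of the non-buffering variant, with a small Python child standing in for `svnadmin` (an assumption: any line-oriented command behaves the same way):

```python
import sys
from subprocess import Popen, PIPE, STDOUT

# Stand-in child process that prints a few lines
child = [sys.executable, '-u', '-c',
         "for i in range(3): print('line %d' % i)"]

p = Popen(child, stdout=PIPE, stderr=STDOUT)
lines = []
while True:
    line = p.stdout.readline()
    if not line:          # b'' means the pipe closed
        break
    lines.append(line.decode().rstrip())
    print(lines[-1])      # each line is available as soon as the child emits it
p.wait()
```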

Penguin
Dave
  • This is not the only mess in the old Python IO implementations. This is why Py2.6 and Py3k ended up with a completely new IO library. – Tim Lin Apr 30 '09 at 02:38
  • This code will break if the subprocess returns an empty line. A better solution would be to use `while p.poll() is None` instead of `while True`, and remove the `if not line` – exhuma Dec 22 '09 at 09:59
  • @exhuma: it works fine. readline returns "\n" on an empty line, which does not evaluate as true. It only returns an empty string when the pipe closes, which will be when the subprocess terminates. – Alice Purcell Apr 09 '10 at 12:24
  • @Dave For future ref: print utf-8 lines in py2+ with `print(line.decode('utf-8').rstrip())`. – Jonathan Komar Jun 20 '16 at 16:05
  • Also, for a truly real-time read of the process's output you need to tell Python that you do NOT want any buffering: set the environment variable `PYTHONUNBUFFERED=1`. This is especially useful for outputs which are infinite – George Pligoropoulos May 01 '18 at 22:24
  • What if the subprocess output an empty line in the middle? – Benjamin Du Aug 29 '19 at 01:04
41

By setting the buffer size to 1, you essentially force the process to not buffer the output.

p = subprocess.Popen(cmd, stdout=subprocess.PIPE, bufsize=1)
for line in iter(p.stdout.readline, b''):
    print line,
p.stdout.close()
p.wait()
shrewmouse
Corey Goldberg
  • What is this b'' about? – ManuelSchneid3r Jan 24 '19 at 09:49
  • @ManuelSchneid3r `iter(callable, sentinel)` creates an iterator that calls `callable` until it returns `sentinel`. If you call `p.stdout.readline` repeatedly, you will see that once there is nothing left to read it returns `b''`, which is therefore the appropriate sentinel to use here. – Soap Sep 07 '20 at 17:52
  • For me, the `p.stdout.close()` is needed to avoid "unclosed file io.TextIOWrapper" resource warnings in PyCharm – Dan Nolan Oct 04 '22 at 07:14
  • Do I have to write that for loop every time I want to execute a command? – David G Feb 09 '23 at 00:41
  • WARNING: this does not set the buffer size to 1 (byte). It sets the buffering mode to line-based. Also, it's against the specs to use that without `text=True` or `universal_newlines=True` – Hubert Grzeskowiak Aug 02 '23 at 06:34
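Following the warning in the last comment, here is a Python 3 sketch of this answer's loop with `text=True`, so that `bufsize=1` really means line buffering and the sentinel is `''` rather than `b''` (the child command is just a stand-in):

```python
import subprocess
import sys

# Stand-in command; bufsize=1 (line buffering) is only valid in text mode
cmd = [sys.executable, '-c', "print('a'); print('b')"]
p = subprocess.Popen(cmd, stdout=subprocess.PIPE, bufsize=1, text=True)
lines = []
for line in iter(p.stdout.readline, ''):  # sentinel is '' in text mode
    lines.append(line)
    print(line, end='')
p.stdout.close()
p.wait()
```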
29

You can direct the subprocess's output to the parent's streams. Simplified example:

import subprocess
import sys

subprocess.run(['ls'], stderr=sys.stderr, stdout=sys.stdout)
Aidan Feldman
  • Does this allow you to also get the contents after the fact in `.communicate()`? Or are the contents lost to the parent stderr/stdout streams? – theferrit32 Apr 18 '19 at 18:55
  • Nope, no `communicate()` method on the returned [`CompletedProcess`](https://docs.python.org/3.7/library/subprocess.html#subprocess.CompletedProcess). Also, `capture_output` is mutually exclusive with `stdout` and `stderr`. – Aidan Feldman Apr 19 '19 at 03:03
  • This isn't "real-time", which is the whole point of this question. This waits until `ls` has finished running, and doesn't give you access to its output. (Also, the `stdout` and `stderr` keyword arguments are superfluous - you are simply specifying the default values explicitly.) – tripleee Oct 15 '20 at 08:31
29

You can try this:

import subprocess
import sys

process = subprocess.Popen(
    cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE
)

while True:
    out = process.stdout.read(1)
    if out == '' and process.poll() != None:
        break
    if out != '':
        sys.stdout.write(out)
        sys.stdout.flush()

If you use readline instead of read, there will be some cases where the input message is not printed. Try it with a command that requires inline input and see for yourself.

Nadia Alramli
  • Yes, using readline() will stop printing (even with calling sys.stdout.flush()) – Mark Ma Sep 05 '13 at 04:04
  • Is this supposed to hang indefinitely? I would wish a given solution to also include boilerplate code for exiting the loop when the initial subprocess is done. Sorry, but no matter how many times I look into it, subprocess etc. is something I just can't ever get to work. – ThorSummoner May 31 '14 at 23:31
  • Why test for '' when in Python we can just use if not out? – Greg Bell Apr 23 '15 at 20:54
  • this is the best solution for long-running jobs, but it should use is not None rather than != None. You should not use != with None. – Cari Jul 29 '15 at 15:30
  • Is stderr also displayed by this? – Pieter Vogelaar Jun 18 '18 at 15:30
  • nice answer! However, I'd point out that `process.poll() != None` is not pythonic... please consider using `process.poll() is not None` – Mike Pennington Feb 27 '22 at 11:39
  • @PieterVogelaar To also print stderr, pass ```stderr=subprocess.STDOUT```. – Thanasis Mattas Apr 26 '22 at 10:03
  • must `out=b''` in Python3 – michael jie Jun 29 '22 at 10:42
  • @ThanasisMattas Unfortunatelly you cannot do this, as you'll risk getting a deadlock. That's why communicate is suggested to be used by the docs, but it prohibits you from reading the output in real time... – ashrasmun Aug 25 '22 at 11:53
  • Is there a way of using this while preserving the stderr and leading it to the parent process' `sys.stderr`? – Hubert Grzeskowiak Jul 28 '23 at 03:29
  • Also, this approach seems to potentially cost a lot, since we are polling the child process and flushing the output after every single character. Am I missing something? – Hubert Grzeskowiak Jul 28 '23 at 03:29
22

In Python 3.x the process might appear to hang because the output is a bytes object instead of a string. Make sure you decode it into a string.

Starting from Python 3.6 you can do it using the parameter encoding in Popen Constructor. The complete example:

process = subprocess.Popen(
    'my_command',
    stdout=subprocess.PIPE,
    stderr=subprocess.STDOUT,
    shell=True,
    encoding='utf-8',
    errors='replace'
)

while True:
    realtime_output = process.stdout.readline()

    if realtime_output == '' and process.poll() is not None:
        break

    if realtime_output:
        print(realtime_output.strip(), flush=True)

Note that this code redirects stderr to stdout and handles output errors.

pavelicii
13

Real-time output issue resolved: I encountered a similar issue while capturing real-time output from a C program. I added fflush(stdout); in my C code, and it worked for me. Here is the code.

C program:

#include <stdio.h>
#include <unistd.h>

int main(void)
{
    int count = 1;
    while (1)
    {
        printf(" Count  %d\n", count++);
        fflush(stdout);
        sleep(1);
    }
    return 0;
}

Python program:

#!/usr/bin/python

import os, sys
import subprocess


procExe = subprocess.Popen(".//count", shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE, universal_newlines=True)

while procExe.poll() is None:
    line = procExe.stdout.readline()
    print("Print:" + line)

Output:

Print: Count  1
Print: Count  2
Print: Count  3
tripleee
sairam
  • This was the only thing which actually helped. I used the same code (```fflush(stdout)```) in C++. Thanks! – Johann Hagerer May 15 '18 at 09:33
  • I was having the same problem with a Python script calling another Python script as a subprocess. On the subprocess's prints, "flush" was necessary (print("hello", flush=True) in Python 3). Also, lots of examples out there are still (2020) Python 2; this is Python 3, so +1 – smajtkst Mar 20 '20 at 16:42
  • for python3+, change `line = procExe.stdout.readline()` to `line = procExe.stderr.readline()` – robo-monk Dec 28 '20 at 00:57
  • This worked great! had to edit to this for it to work `procExe = subprocess.Popen(cmd, stdout=subprocess.PIPE, universal_newlines=True)` – Arye P. Mar 18 '23 at 17:59
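As the comments note, the same fix applies when the child is a Python script: flush after each print. A minimal sketch with a Python child standing in for the C program:

```python
import subprocess
import sys

# The child flushes after every print, so the parent sees each line immediately
child_code = "for i in range(3): print('Count', i + 1, flush=True)"
proc = subprocess.Popen([sys.executable, '-c', child_code],
                        stdout=subprocess.PIPE, universal_newlines=True)
captured = []
for line in proc.stdout:
    captured.append(line)
    print('Print: ' + line, end='')
proc.wait()
```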
8

The Streaming subprocess stdin and stdout with asyncio in Python blog post by Kevin McCarthy shows how to do it with asyncio:

import asyncio
from asyncio.subprocess import PIPE
from asyncio import create_subprocess_exec


async def _read_stream(stream, callback):
    while True:
        line = await stream.readline()
        if line:
            callback(line)
        else:
            break


async def run(command):
    process = await create_subprocess_exec(
        *command, stdout=PIPE, stderr=PIPE
    )

    await asyncio.wait(
        [
            _read_stream(
                process.stdout,
                lambda x: print(
                    "STDOUT: {}".format(x.decode("UTF8"))
                ),
            ),
            _read_stream(
                process.stderr,
                lambda x: print(
                    "STDERR: {}".format(x.decode("UTF8"))
                ),
            ),
        ]
    )

    await process.wait()


async def main():
    await run(["docker", "build", "-t", "my-docker-image:latest", "."])


if __name__ == "__main__":
    loop = asyncio.get_event_loop()
    loop.run_until_complete(main())
Pablo
  • Hi @Jeef can you point out the fix so I can update the answer? – Pablo Jun 11 '19 at 10:47
  • Hi, that worked for me but I had to add the following to get rid of some error messages: `import nest_asyncio; nest_asyncio.apply()` and to use a shell command, i.e. `process = await create_subprocess_shell(command, stdout=PIPE, stderr=PIPE)` instead of `process = await create_subprocess_exec(...)`. Cheers! – user319436 Sep 26 '19 at 03:36
  • May as well just use a socket and save yourself the trouble. – ajsp Apr 23 '22 at 10:26
5

Found this "plug-and-play" function here. Worked like a charm!

import subprocess

def myrun(cmd):
    """from
    http://blog.kagesenshi.org/2008/02/teeing-python-subprocesspopen-output.html
    """
    p = subprocess.Popen(cmd, shell=True, stdout=subprocess.PIPE,
                         stderr=subprocess.STDOUT)
    stdout = []
    while True:
        line = p.stdout.readline()
        stdout.append(line)
        print line,
        if line == '' and p.poll() != None:
            break
    return ''.join(stdout)
Mr_and_Mrs_D
Deena
  • The addition of `stderr=subprocess.STDOUT` actually helps a lot in capturing streaming data. I am upvoting it. – khan May 10 '17 at 00:59
  • The main beef here seems to come from the [accepted answer](https://stackoverflow.com/a/803421/874188) – tripleee Jan 10 '18 at 10:17
5

Depending on the use case, you might also want to disable the buffering in the subprocess itself.

If the subprocess will be a Python process, you could do this before the call:

os.environ["PYTHONUNBUFFERED"] = "1"

Or alternatively pass this in the env argument to Popen.
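A sketch of the `env` variant (the Python child here is just a stand-in for the real subprocess):

```python
import os
import subprocess
import sys

# Copy the environment rather than mutating os.environ in place
env = dict(os.environ, PYTHONUNBUFFERED='1')

# A Python child started with this env writes to stdout unbuffered
p = subprocess.Popen([sys.executable, '-c', "print('hello')"],
                     stdout=subprocess.PIPE, env=env, text=True)
first_line = p.stdout.readline()
print(first_line, end='')
p.wait()
```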

Otherwise, if you are on Linux/Unix, you can use the stdbuf tool. E.g. like:

cmd = ["stdbuf", "-oL"] + cmd

See also here about stdbuf or other options.

(See also here for the same answer.)

Albert
3

I used this solution to get real-time output from a subprocess. This loop will stop as soon as the process completes, eliminating the need for a break statement or a possible infinite loop.

import subprocess
import sys

sub_process = subprocess.Popen(my_command, close_fds=True, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)

while sub_process.poll() is None:
    out = sub_process.stdout.read(1)
    sys.stdout.write(out)
    sys.stdout.flush()
  • is it possible that this will exit the loop without the stdout buffer being empty? – jayjay Jul 21 '14 at 09:16
  • I have looked a lot for a suitable answer that didn't hang upon completion! I found this as a solution by adding `if out=='': break` after `out = sub_process...` – Sos Jul 23 '19 at 14:17
3

This is the basic skeleton that I always use for this. It makes it easy to implement timeouts and is able to deal with inevitable hanging processes.

import subprocess
import threading
import Queue

def t_read_stdout(process, queue):
    """Read from stdout"""

    for output in iter(process.stdout.readline, b''):
        queue.put(output)

    return

process = subprocess.Popen(['dir'],
                           stdout=subprocess.PIPE,
                           stderr=subprocess.STDOUT,
                           bufsize=1,
                           cwd='C:\\',
                           shell=True)

queue = Queue.Queue()
t_stdout = threading.Thread(target=t_read_stdout, args=(process, queue))
t_stdout.daemon = True
t_stdout.start()

while process.poll() is None or not queue.empty():
    try:
        output = queue.get(timeout=.5)

    except Queue.Empty:
        continue

    if not output:
        continue

    print(output),

t_stdout.join()
Badslacks
3

I ran into the same problem awhile back. My solution was to ditch iterating for the read method, which will return immediately even if your subprocess isn't finished executing, etc.
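A sketch of what that might look like, using `read1` (which returns whatever is currently available, up to the given size) as a stand-in for the plain `read` the answer mentions; the child command is hypothetical:

```python
import sys
from subprocess import Popen, PIPE

# Stand-in child; read1(n) returns as soon as any output is available
# (at most n bytes), and b'' once the pipe closes
p = Popen([sys.executable, '-c', "print('chunked output')"], stdout=PIPE)
data = b''
while True:
    chunk = p.stdout.read1(64)
    if not chunk:
        break
    data += chunk
    sys.stdout.write(chunk.decode())
    sys.stdout.flush()
p.wait()
```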

Eli Courtwright
2

You may use an iterator over each byte in the output of the subprocess. This allows inline update (lines ending with '\r' overwrite previous output line) from the subprocess:

import sys
from subprocess import PIPE, Popen

command = ["my_command", "-my_arg"]

# Open a pipe to the subprocess
proc = Popen(command, stdout=PIPE, stderr=PIPE)

# Read each byte of the subprocess's output as it arrives
while proc.poll() is None:
    for c in iter(lambda: proc.stdout.read(1) if proc.poll() is None else b'', b''):
        sys.stdout.write(c.decode('ascii'))
sys.stdout.flush()

if proc.returncode != 0:
    raise Exception("The subprocess did not terminate correctly.")
rhyno183
2

If you just want to forward the subprocess's log output to the console in real time, the code below will work for both Python 2 and Python 3:

 p = subprocess.Popen(cmd,
                         shell=True,
                         cwd=work_dir,
                         bufsize=1,
                         stdin=subprocess.PIPE,
                         stderr=sys.stderr,
                         stdout=sys.stdout)
timger
  • This is an unholy mix of unnecessary complications. Just don't specify anything for `stderr` and `stdout` if you don't want to change where they are being sent. `cwd=work_dir` and `shell=True` seem out of place here, and `bufsize=1` seems vaguely dubious, especially without any explanation. – tripleee May 31 '21 at 06:13
1

Complete solution:

import contextlib
import subprocess

# Unix, Windows and old Macintosh end-of-line
newlines = ['\n', '\r\n', '\r']
def unbuffered(proc, stream='stdout'):
    stream = getattr(proc, stream)
    with contextlib.closing(stream):
        while True:
            out = []
            last = stream.read(1)
            # Don't loop forever
            if last == '' and proc.poll() is not None:
                break
            while last not in newlines:
                # Don't loop forever
                if last == '' and proc.poll() is not None:
                    break
                out.append(last)
                last = stream.read(1)
            out = ''.join(out)
            yield out

def example():
    cmd = ['ls', '-l', '/']
    proc = subprocess.Popen(
        cmd,
        stdout=subprocess.PIPE,
        stderr=subprocess.STDOUT,
        # Make all end-of-lines '\n'
        universal_newlines=True,
    )
    for line in unbuffered(proc):
        print line

example()
Andres Restrepo
  • Since you're using `universal_newlines=True` on the `Popen()` call, you probably don't need to put your own handling of them in, too -- that's the whole point of the option. – martineau Aug 20 '13 at 12:15
  • it seems unnecessarily complicated. It doesn't solve buffering issues. See [links in my answer](http://stackoverflow.com/a/17698359/4279). – jfs Oct 16 '14 at 20:05
  • This is the only way I could get rsync progress output in realtime(--outbuf=L) ! thanks – Mohammadhzp Aug 25 '15 at 15:29
1

Using pexpect with non-blocking readlines will resolve this problem. It stems from the fact that pipes are buffered, and so your app's output is getting buffered by the pipe, therefore you can't get to that output until the buffer fills or the process dies.

tripleee
Gabe
1

Here is what worked for me:

import subprocess
import sys

def run_cmd_print_output_to_console_and_log_to_file(cmd, log_file_path):
    make_file_if_not_exist(log_file_path)
    logfile = open(log_file_path, 'w')

    proc=subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, shell = True)
    for line in proc.stdout:
        sys.stdout.write(line.decode("utf-8") )
        print(line.decode("utf-8").strip(), file=logfile, flush=True)
    proc.wait()

    logfile.close()
perry_the_python
1

These are all great examples, but I've found they either (a) handle partial lines (e.g. "Are you sure (Y/n):") but are really slow, or (b) are quick but hang on partial lines.

I've worked on the following which:

  • provides real-time output for both stdout and stderr to their respective streams
  • is extremely fast as it works with stream buffering
  • allows for using timeouts as it never blocks on read()
  • efficiently saves stdout and stderr independently
  • handles text encoding (though easily adaptable to binary streams)
  • works on Python 3.6+

import os
import subprocess
import sys
import selectors
import io

def run_command(command: str) -> (int, str):

    proc = subprocess.Popen(
        command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE
    )

    sel = selectors.DefaultSelector()
    for fobj in [ proc.stdout, proc.stderr ]:
        os.set_blocking(fobj.fileno(), False)
        sel.register(fobj, selectors.EVENT_READ)

    out=io.StringIO()
    err=io.StringIO()

    # loop until all descriptors removed
    while len(sel.get_map()) > 0:
        events = sel.select()
        if len(events) == 0:
            # timeout or signal, kill to prevent wait hanging
            proc.terminate()
            break
        for key, _ in events:
            # read all available data
            buf = key.fileobj.read().decode(errors='ignore')
            if buf == '':
                sel.unregister(key.fileobj)
            elif key.fileobj == proc.stdout:
                sys.stdout.write(buf)
                sys.stdout.flush()
                out.write(buf)
            elif key.fileobj == proc.stderr:
                sys.stderr.write(buf)
                sys.stderr.flush()
                err.write(buf)

    sel.close()
    proc.wait()
    if proc.returncode != 0:
        return (proc.returncode, err.getvalue())
    return (0, out.getvalue())

I didn't include the timeout logic (as the subject is real-time output), but it's simple to add them to select()/wait() and no longer worry about infinite hangs.

I timed `cat '25MB-file'` and, compared to the `.read(1)` solutions, this is roughly 300 times faster.
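The timeout logic alluded to above can be wired into `sel.select()`, which accepts a timeout in seconds. A minimal sketch under the same approach (POSIX only, since `os.set_blocking` is used on pipes; the Python child is a stand-in for the real command):

```python
import os
import selectors
import subprocess
import sys

proc = subprocess.Popen([sys.executable, '-c', "print('tick')"],
                        stdout=subprocess.PIPE)
sel = selectors.DefaultSelector()
os.set_blocking(proc.stdout.fileno(), False)
sel.register(proc.stdout, selectors.EVENT_READ)

collected = b''
while len(sel.get_map()) > 0:
    # An empty event list means nothing became readable within 5 seconds:
    # treat that as a timeout and stop the child
    events = sel.select(timeout=5)
    if not events:
        proc.terminate()
        break
    for key, _ in events:
        buf = key.fileobj.read()
        if not buf:                      # EOF: the pipe closed
            sel.unregister(key.fileobj)
        else:
            collected += buf
            sys.stdout.write(buf.decode())
sel.close()
proc.wait()
```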

0

(This solution has been tested with Python 2.7.15)
You just need to sys.stdout.flush() after each line read/write:

while proc.poll() is None:
    line = proc.stdout.readline()
    sys.stdout.write(line)
    # or print(line.strip()), you still need to force the flush.
    sys.stdout.flush()
dan
0

A few answers suggest Python 3.x or Python 2.x; the code below will work for both:

p = subprocess.Popen(cmd, shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
stdout = []
while True:
    line = p.stdout.readline()
    if not isinstance(line, str):
        line = line.decode('utf-8')
    stdout.append(line)
    print(line)
    if line == '' and p.poll() is not None:
        break
Djai
0
def run_command(command):
    process = subprocess.Popen(shlex.split(command), stdout=subprocess.PIPE,
                               universal_newlines=True)
    while True:
        output = process.stdout.readline()
        if output == '' and process.poll() is not None:
            break
        if output:
            print(output.strip())
    rc = process.poll()
    return rc
0

Yet another answer! I had the following requirements:

  • Run some command and print the output to stdout as though the user ran it
  • Display to the user any prompts from the command. E.g. pip uninstall numpy will prompt with ... Proceed (Y/n)? (which does not end in a newline)
  • Capture the output (that the user saw) as a string

This worked for me (only tested in Python 3.10 on Windows):

def run(*args: list[str]) -> str:
    proc = subprocess.Popen(
        *args,
        text=True,
        stdout=subprocess.PIPE,
        stderr=subprocess.STDOUT,
    )

    result = ""

    while proc.poll() is None:
        output = proc.stdout.read(1)

        if output:
            sys.stdout.write(output)
            sys.stdout.flush()
            result += output

    return result
David Gilbertson
0

Here is my solution:

process = subprocess.Popen(command, stdout=subprocess.PIPE, stderr=subprocess.PIPE)

error_output = ""

while True:

    # The empty string is important to fulfill the exit condition (see below)
    stdout_line = ""
    if process.stdout:
        stdout = process.stdout.readline()
        if stdout:
            stdout_line = stdout.decode("utf-8")
            log.debug(stdout_line)

    # The empty string is important to fulfill the exit condition (see below)
    stderr_line = ""
    if process.stderr:
        stderr = process.stderr.readline()
        if stderr:
            stderr_line = stderr.decode("utf-8")
            error_output += stderr_line
            log.debug(stderr_line)

    # It might be the case that the process is finished but reading the
    # output is not finished. This is why we check both conditions:
    # Condition for readline:
    #   https://docs.python.org/3.6/tutorial/inputoutput.html#methods-of-file-objects
    # Condition for poll:
    #   https://docs.python.org/3/library/subprocess.html#subprocess.Popen.poll
    if stdout_line == "" and stderr_line == "" and process.poll() is not None:
        break

if process.returncode != 0:
    raise Exception(error_output)
schirrmacher
0

Late answer, but the following works for Python 3:

import subprocess
import sys

process = subprocess.Popen(cmd, shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)

while True:
    out = process.stdout.read(1)
    if process.poll() is not None:
        break
    if out:
        sys.stdout.buffer.write(out)
        sys.stdout.flush()
Pedro Lobito