660

Is output buffering enabled by default in Python's interpreter for sys.stdout?

If the answer is positive, what are all the ways to disable it?

Suggestions so far:

  1. Use the -u command line switch
  2. Wrap sys.stdout in an object that flushes after every write
  3. Set PYTHONUNBUFFERED env var
  4. sys.stdout = os.fdopen(sys.stdout.fileno(), 'w', 0)

Is there any other way to set some global flag in sys/sys.stdout programmatically during execution?


If you just want to flush after a specific write using print, see How can I flush the output of the print function?.

Karl Knechtel
Eli Bendersky

16 Answers

514

From Magnus Lycka answer on a mailing list:

You can skip buffering for a whole Python process using python -u or by setting the environment variable PYTHONUNBUFFERED.
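For example (a sketch; both invocations assume a `python3` executable on PATH):

```shell
# Either invocation runs the interpreter with unbuffered stdout/stderr:
python3 -u -c 'print("unbuffered")'
PYTHONUNBUFFERED=1 python3 -c 'print("unbuffered")'
```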

You could also replace sys.stdout with some other stream-like wrapper which does a flush after every call.

class Unbuffered(object):
    def __init__(self, stream):
        self.stream = stream
    def write(self, data):
        self.stream.write(data)
        self.stream.flush()
    def writelines(self, datas):
        self.stream.writelines(datas)
        self.stream.flush()
    def __getattr__(self, attr):
        # Delegate everything else to the wrapped stream
        return getattr(self.stream, attr)

import sys
sys.stdout = Unbuffered(sys.stdout)
print('Hello')
Trevor Boyd Smith
Seb
    Original sys.stdout is still available as sys.__stdout__. Just in case you need it =) – Antti Rasinen Sep 20 '08 at 09:26
  • This the solution that I used when I ran into problems with print statements being buffered. Worked like a charm. – Ryan Sep 22 '08 at 03:19
  • 52
    `#!/usr/bin/env python -u` doesn't work!! see [here](http://stackoverflow.com/q/3306518/674039) – wim Dec 10 '12 at 00:11
  • 7
    `__getattr__` just to avoid inheritance?! – Vladimir Keleshev Apr 24 '13 at 07:33
  • 34
    Some notes to save some headaches: As I noticed, output buffering works differently depending on if the output goes to a tty or another process/pipe. If it goes to a tty, then it is flushed after each *\n*, but in a pipe it is buffered. In the latter case you can make use of these flushing solutions. In Cpython (not in pypy!!!): If you iterate over the input with **for line in sys.stdin:** ... then the for loop will collect a number of lines before the body of the loop is run. This will behave like buffering, though it's rather batching. Instead, do **while true: line = sys.stdin.readline()** – tzp Jun 10 '13 at 12:35
  • 2
    So, guys, what are the consequences of disabling output buffering? When would you not want to? – Will Sep 02 '13 at 00:47
  • @Will That's a whole other question but one major benefit to buffering is performance - writing to a console is not particularly fast, so batching the writes reduces overhead. – Basic Sep 15 '13 at 04:44
  • 5
    @tzp: you could use `iter()` instead of the `while` loop: `for line in iter(pipe.readline, ''):`. You don't need it on Python 3 where `for line in pipe:` yields as soon as possible. – jfs Nov 29 '13 at 17:11
  • Theoretically if within the main script a module was being loaded after stderr has been set, and this loaded module was importing stderr, should it be redirecting to the newly set stderr from the original script? –  May 01 '14 at 07:50
  • 4
    @tzp : The differing behaviour is particularly infuriating when you are using `python myscript.py | tee logfile.txt` - the purpose being to see what you're doing while also logging it! – tehwalrus Sep 18 '14 at 08:52
  • Passing -u on the commandline does not help for me, adding `flush=True` to the print calls works though (Python 3.5 - Windows). – Zitrax Sep 22 '16 at 14:00
  • The following was omitted in the copy/paste from the original post: "I don't think it will work in `IDLE`, since `sys.stdout` is already replaced with some funny object there which doesn't like to be flushed. (This could be considered a bug in `IDLE` though.)" – akhan Dec 21 '16 at 09:23
  • 3
    @Halst No. Using `__getattr__` allows it to work with *any* stream, not just whatever particular class you intend to use. – jpmc26 Jan 02 '17 at 23:41
  • For more info on streams buffering, give this a read: https://eklitzke.org/stdout-buffering It mentions the behavior that tzp explains above. – Nico Villanueva Oct 12 '20 at 15:47
  • This is an extremely useful solution if one hosts a CGI Python script on IIS! Thanks! Along with `responseBufferLimit="0"` in `web.config`, this piece of code removes all other buffering artifacts from the script's output – Sergey Nudnov Mar 11 '21 at 19:53
222

I would rather put my answer in How to flush output of print function? or in Python's print function that flushes the buffer when it's called?, but since they were marked as duplicates of this one (which I do not agree with), I'll answer it here.

Since Python 3.3, print() supports the keyword argument "flush" (see documentation):

print('Hello World!', flush=True)
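For example, a progress loop whose output stays visible even when stdout goes to a pipe (a minimal sketch):

```python
import time

# Each print is pushed out immediately, so progress is visible
# even when stdout is redirected to a pipe or log file.
for i in range(3):
    print("step", i, flush=True)
    time.sleep(0.1)
```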
Cristóvão D. Sousa
90
# reopen stdout file descriptor with write mode
# and 0 as the buffer size (unbuffered)
import io, os, sys
try:
    # Python 3, open as binary, then wrap in a TextIOWrapper with write-through.
    sys.stdout = io.TextIOWrapper(open(sys.stdout.fileno(), 'wb', 0), write_through=True)
    # If flushing on newlines is sufficient, as of 3.7 you can instead just call:
    # sys.stdout.reconfigure(line_buffering=True)
except TypeError:
    # Python 2
    sys.stdout = os.fdopen(sys.stdout.fileno(), 'w', 0)

Credits: "Sebastian", somewhere on the Python mailing list.

Russell Davis
Federico A. Ramponi
  • In Python3 you can just override the name of the print function with a flushing one. Its a dirty trick though! – meawoppl Jan 22 '14 at 18:50
  • 22
    @meawoppl: you could pass`flush=True` parameter to `print()` function since Python 3.3. – jfs Aug 25 '15 at 09:23
  • Editing response to show response is not valid in recent version of python – Mike Dec 10 '18 at 23:51
  • both `os.fdopen(sys.stdout.fileno(), 'wb', 0)` (note the `b` for binary) and `flush=True` work for me in 3.6.4. However, if you're using *subprocess* to start another script, make sure you've specified `python3`, if you have multiple instances of python installed. – not2qubit Dec 13 '18 at 14:36
  • 1
    @not2qubit: if you use `os.fdopen(sys.stdout.fileno(), 'wb', 0)` you end up with a binary file object, not a `TextIO` stream. You'd have to add a `TextIOWrapper` to the mix (making sure to enable `write_through` to eliminate all buffers, or use `line_buffering=True` to only flush on newlines). – Martijn Pieters Nov 11 '19 at 11:55
  • 10
    If flushing on newlines is sufficient, as of Python 3.7 you can simply call `sys.stdout.reconfigure(line_buffering=True)` – Russell Davis Apr 17 '20 at 07:38
  • @RussellDavis I did this, but it seems like it doesn't stick for subsequent print()s. I still have to use `print(...,flush=True)` on Py3.8. Any ideas? – not2qubit Dec 25 '20 at 00:37
73

Yes, it is.

You can disable it on the commandline with the "-u" switch.

Alternatively, you could call .flush() on sys.stdout on every write (or wrap it with an object that does this automatically)
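A minimal sketch of the explicit-flush approach:

```python
import sys

sys.stdout.write("working...")  # no newline, so it may sit in the buffer
sys.stdout.flush()              # force the text out immediately
```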

Brian
39

This relates to Cristóvão D. Sousa's answer, but I couldn't comment yet.

A straightforward way of using the flush keyword argument of Python 3 in order to always have unbuffered output is:

import functools
print = functools.partial(print, flush=True)

Afterwards, print will always flush the output directly (unless flush=False is given).

Note (a): this answers the question only partially, since it doesn't redirect all the output. But I guess print is the most common way of producing output on stdout/stderr in Python, so these two lines probably cover most of the use cases.

Note (b): it only works in the module/script where you define it. This can be good when writing a module, as it doesn't mess with sys.stdout.

Python 2's print doesn't provide the flush argument, but you can emulate a Python 3-style print function as described at https://stackoverflow.com/a/27991478/3734258 .

Tim
  • 2
    Except that there is no `flush` kwarg in python2. – o11c May 05 '17 at 05:19
  • @o11c , yes you're right. I was sure I tested it but somehow I was seemingly confused (: I modified my answer, hope it's fine now. Thanks! – Tim May 12 '17 at 10:41
15

The following works in Python 2.6, 2.7, and 3.2:

import os
import sys
buf_arg = 0
if sys.version_info[0] == 3:
    os.environ['PYTHONUNBUFFERED'] = '1'
    buf_arg = 1
sys.stdout = os.fdopen(sys.stdout.fileno(), 'a+', buf_arg)
sys.stderr = os.fdopen(sys.stderr.fileno(), 'a+', buf_arg)
Gummbum
15
import gc, os, subprocess, sys

def disable_stdout_buffering():
    # Appending to gc.garbage is a way to stop an object from being
    # destroyed.  If the old sys.stdout is ever collected, it will
    # close() stdout, which is not good.
    gc.garbage.append(sys.stdout)
    sys.stdout = os.fdopen(sys.stdout.fileno(), 'w', 0)

# Then this will give output in the correct order:
disable_stdout_buffering()
print "hello"
subprocess.call(["echo", "bye"])

Without saving the old sys.stdout, disable_stdout_buffering() isn't idempotent, and multiple calls will result in an error like this:

Traceback (most recent call last):
  File "test/buffering.py", line 17, in <module>
    print "hello"
IOError: [Errno 9] Bad file descriptor
close failed: [Errno 9] Bad file descriptor

Another possibility is:

def disable_stdout_buffering():
    fileno = sys.stdout.fileno()
    temp_fd = os.dup(fileno)
    sys.stdout.close()
    os.dup2(temp_fd, fileno)
    os.close(temp_fd)
    sys.stdout = os.fdopen(fileno, "w", 0)

(Appending to gc.garbage is not such a good idea because it's where unfreeable cycles get put, and you might want to check for those.)

Mark Seaborn
  • 3
    If the old `stdout` still lives on `sys.__stdout__` as some have suggested, the garbage thing won't be necessary, right? It's a cool trick though. – Thomas Ahle Feb 28 '14 at 10:17
  • 2
    As with @Federico's answer, this will not work with Python 3, as it will throw the exception `ValueError: can't have unbuffered text I/O` when calling `print()`. – gbmhunter Jul 18 '18 at 16:57
  • Your "another possibility" seems at first like the most robust solution, but unfortunately it suffers a race condition in the case that another thread calls open() after your sys.stdout.close() and before your os.dup2(temp_fd, fileno). I found this out when I tried using your technique under ThreadSanitizer, which does exactly that. The failure is made louder by the fact that dup2() fails with EBUSY when it races with open() like that; see https://stackoverflow.com/questions/23440216/race-condition-when-using-dup2 – Don Hatch Oct 30 '18 at 07:01
14

In Python 3, you can monkey-patch the print function to always send flush=True:

_orig_print = print

def print(*args, **kwargs):
    kwargs.setdefault('flush', True)  # an explicit flush=False still wins
    _orig_print(*args, **kwargs)

As pointed out in a comment, you can simplify this by binding the flush parameter to a value, via functools.partial:

print = functools.partial(print, flush=True)
Oliver
  • 3
    Just wondering, but wouldn't that be a perfect use case for `functools.partial`? – 0xC0000022L Jun 24 '19 at 11:09
  • Thanks @0xC0000022L, this makes it look better! `print = functools.partial(print, flush=True)` works fine for me. – MarSoft Aug 13 '19 at 12:04
  • @0xC0000022L indeed, I have updated the post to show that option, thanks for pointing that out – Oliver Aug 13 '19 at 14:57
  • 4
    If you want that to apply everywhere, `import builtins; builtins.print = partial(print, flush=True)` – Perkins Oct 29 '19 at 01:52
  • Oddly, this approach worked when nothing else did for Python 3.x, and I am wondering why the other documented approaches (use -u flag) do not work. – truedat101 Jul 06 '21 at 16:22
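The global variant suggested in the comments can be sketched as follows (Python 3.3+; note that at the moment the partial is created, print still refers to the original builtin):

```python
import builtins
import functools

# Rebind the builtin so that print flushes in every module.
builtins.print = functools.partial(print, flush=True)
print("always flushed")
```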
13

Yes, it is enabled by default. You can disable it by using the -u option on the command line when calling python.

Nathan
8

You can also run Python with the stdbuf utility:

stdbuf -oL python <script>

dyomas
  • 3
    Line buffering (as `-oL` enables) is still buffering -- see f/e https://stackoverflow.com/questions/58416853/why-end-make-the-output-disappear, asking why `end=''` makes output no longer be immediately displayed. – Charles Duffy Oct 16 '19 at 15:47
  • True, but line buffering is the default (with a tty) so does it make sense to write code assuming output is totally unbuffered — maybe better to explicitly `print(..., end='', flush=True)` where that's improtant? OTOH, when several programs write to same output concurrently, the trade-off tends to shift from seeing immediate progress to reducing output mixups, and line buffering becomes attractive. So maybe it _is_ better to not write explicit `flush` and control buffering externally? – Beni Cherniavsky-Paskin May 11 '20 at 09:35
  • I think, no. Process itself should decide, when and why it calls `flush`. External buffering control is compelled workaround here – dyomas May 13 '20 at 07:27
4

One way to get unbuffered output would be to use sys.stderr instead of sys.stdout, or to simply call sys.stdout.flush() to explicitly force a write to occur.

You could easily redirect everything printed by doing:

import sys; sys.stdout = sys.stderr
print "Hello World!"

Or to redirect just for a particular print statement:

print >>sys.stderr, "Hello World!"

To reset stdout you can just do:

sys.stdout = sys.__stdout__
efotinis
stderr
  • 1
    This might get very confusing when you then later try to capture the output using standard redirection, and find you are capturing nothing! p.s. your __stdout__ is being bolded and stuff. – freespace Sep 20 '08 at 10:00
  • 2
    One big caution about selectively printing to stderr is that this causes the lines to appear out of place, so unless you also have timestamp this could get very confusing. – haridsv Oct 30 '11 at 18:13
4

You can create an unbuffered file and assign this file to sys.stdout.

import sys
myFile = open("a.log", "w", 0)
sys.stdout = myFile

You can't magically change the system-supplied stdout, since it's supplied to your Python program by the OS.
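On Python 3 a text-mode file can't be opened fully unbuffered; a line-buffered sketch of the same idea (reusing the answer's a.log filename) would be:

```python
import sys

log = open("a.log", "w", buffering=1)  # 1 = line buffering (text mode)
old_stdout = sys.stdout
sys.stdout = log
print("redirected and flushed at each newline")
sys.stdout = old_stdout  # restore the original stream
log.close()
```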

S.Lott
4

You can also use fcntl to change the file flags on the fly.

import fcntl, os

# `fd` is the file object whose flags you want to change
fl = fcntl.fcntl(fd.fileno(), fcntl.F_GETFL)
fl |= os.O_SYNC  # or os.O_DSYNC (if you don't care about file timestamp updates)
fcntl.fcntl(fd.fileno(), fcntl.F_SETFL, fl)
jimx
  • 1
    There's a windows equivalent: http://stackoverflow.com/questions/881696/unbuffered-stdout-in-python-as-in-python-u-from-within-the-program/881751#881751 – Tobu Jan 23 '11 at 01:41
  • 15
    O_SYNC has nothing at all to do with userspace-level buffering that this question is asking about. – apenwarr Apr 25 '12 at 07:21
4

It is possible to override just the write method of sys.stdout with one that calls flush. A suggested implementation is below.

def write_flush(args, w=stdout.write):
    w(args)
    stdout.flush()

The default value of the w argument keeps a reference to the original write method. After write_flush is defined, the original write can be overridden:

stdout.write = write_flush

The code assumes that stdout has been imported with from sys import stdout.

Vasily E.
3

A variant that works without crashing (at least on win32; Python 2.7, IPython 0.12) when called multiple times:

def DisOutBuffering():
    if sys.stdout.name == '<stdout>':
        sys.stdout = os.fdopen(sys.stdout.fileno(), 'w', 0)

    if sys.stderr.name == '<stderr>':
        sys.stderr = os.fdopen(sys.stderr.fileno(), 'w', 0)
Laimis
3

(I've posted a comment, but it got lost somehow. So, again:)

  1. As I noticed, CPython (at least on Linux) behaves differently depending on where the output goes. If it goes to a tty, then the output is flushed after each '\n'.
    If it goes to a pipe/process, then it is buffered and you can use the flush()-based solutions or the -u option recommended above.

  2. Slightly related to output buffering:
    If you iterate over the lines in the input with

    for line in sys.stdin:
        ...

then the for implementation in CPython will collect the input for a while and then execute the loop body for a batch of input lines. If your script writes output for each input line, this might look like output buffering, but it's actually batching, so none of the flush() techniques will help. Interestingly, you don't have this behaviour in PyPy. To avoid this, you can use

while True:
    line = sys.stdin.readline()
    ...
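As noted in the comments, the read-ahead can also be avoided without a while loop by using iter(); a sketch as a helper function:

```python
import sys

def echo_lines(inp, out):
    # iter(inp.readline, '') calls readline() until it returns '',
    # avoiding the read-ahead batching of `for line in inp` on Python 2.
    for line in iter(inp.readline, ''):
        out.write(line)
        out.flush()

# usage: echo_lines(sys.stdin, sys.stdout)
```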

tzp
  • [here's your comment](http://stackoverflow.com/questions/107705/python-output-buffering/107717#comment24604506_107717). It might be a bug on older Python versions. Could you provide example code? Something like [`for line in sys.stdin`](http://ideone.com/TzHwlX) vs. [`for line in iter(sys.stdin.readline, "")`](http://ideone.com/mMxn09) – jfs Jun 19 '13 at 15:40
  • for line in sys.stdin: print("Line: " +line); sys.stdout.flush() – tzp Jun 21 '13 at 12:19
  • it looks like [the read-ahead bug](https://bugs.python.org/issue3907). It should only happen on Python 2 and if stdin is a pipe. The code in my previous comment demonstrates the issue (`for line in sys.stdin` provides a delayed response) – jfs Aug 25 '15 at 09:21