16

I was wondering if it is possible to tell bash that all calls to echo or printf should be followed by a call to fflush() on stdout or stderr, respectively?

A quick and dirty solution would be to write my own printf implementation that did this and use it in lieu of either built-in, but it occurred to me that I might not need to.

I'm writing several build scripts that run at once; for debugging purposes I really need to see the messages they write in order.

Tim Post
  • 33,371
  • 15
  • 110
  • 174
  • Not that I know of. I think that BASH is completely line-based, so they will flush on a line-by-line basis but no more. Is there a need to not flush on newlines? – Robert Massaioli Feb 26 '11 at 22:48
  • @Robert - Yes, I have to eliminate timing bugs as a culprit, thus I need to make sure every write to stderr / stdout causes the stream to be flushed. – Tim Post Feb 26 '11 at 22:59
  • Grasping at straws: I wonder if setting something using `stty` would help. Or using the `expect` script called `unbuffer`. Or setting Bash's `PS4` to include `$(date "+%s.%N")` (although there's a bit of overhead there) and using `set -x` and sorting the output. Perhaps something using `trap 'foo' DEBUG`. Does [this](http://www.pixelbeat.org/programming/stdio_buffering/) provide any useful information? Or something [here](http://stackoverflow.com/questions/1507674/how-to-add-timestamp-to-stderr-redirection)? – Dennis Williamson Feb 27 '11 at 01:30
  • @Dennis - I appreciate the stab. The problem is, I'm collecting the dump from several background processes in one place and need to be sure that I'm seeing ordering issues vs lags due to buffering. – Tim Post Feb 28 '11 at 03:41
  • @Dennis - Additionally, I did look to see if `stty` offered something that might help (or might expose something that would), no luck. – Tim Post Feb 28 '11 at 03:44
  • Did you work out a solution to this? I need to see a logfile as it's generated by a background process. – asheeshr Oct 07 '13 at 06:51
  • 1
    @AsheeshR Yes, see the accepted answer. – Tim Post Oct 07 '13 at 07:18
  • For files: http://stackoverflow.com/questions/1429951/force-flushing-of-output-to-a-file-while-bash-script-is-still-running – Ciro Santilli OurBigBook.com Apr 14 '16 at 09:23

3 Answers

15

If commands use stdio and are connected to a terminal, they'll be flushed per line. Otherwise you'll need to use something like stdbuf on the commands in a pipeline: http://www.pixelbeat.org/programming/stdio_buffering/

tl;dr: instead of `printf ...` in the script, try `stdbuf -o0 printf ...` (unbuffered output) or `stdbuf -oL printf ...` (line-buffered output).
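
A minimal sketch of the same idea applied to a whole script rather than to individual commands (the script and log file names here are only placeholders):

    # run the build with line-buffered stdout (stdbuf is part of GNU coreutils),
    # so each line reaches the pipe as soon as it is written
    stdbuf -oL ./build.sh 2>&1 | tee build.log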

max630
  • 8,762
  • 3
  • 30
  • 55
pixelbeat
  • 30,615
  • 9
  • 51
  • 60
  • I was thinking of just implementing stdbuf independently; I'm dealing with a portability issue with systems that have an older version of coreutils (or, more simply, an `xxprintf` that just flushes the given stream / fd) – Tim Post Feb 28 '11 at 10:45
  • It seems like the only way to do this is with stdbuf, implementing it yourself if it is not present. – Tim Post Mar 15 '11 at 03:57
3

If you force the file to be read, it seems to cause the buffer to flush. These work for me.

Either read the data into a useless variable:

    x=$(<$logfile)

Or do a UUOC:

    cat $logfile > /dev/null
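
A sketch of how that might be used while a background job is still writing, following this answer's idea (the one-second interval and the `$logfile` path are illustrative, not part of the answer):

    # repeatedly force a read of the log so buffered output appears promptly,
    # while following it elsewhere with: tail -f "$logfile"
    while sleep 1; do cat "$logfile" > /dev/null; done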
Dan Hale
  • 91
  • 2
1

Maybe "stty raw" can help with some other tricks for end-of-lines handling. AFAIK "raw" mode turns off line based buffering, at least when used for serial port ("stty raw < /dev/ttyS0").