
I would like to run several commands, and capture all output to a logfile. I also want to print any errors to the screen (or optionally mail the output to someone).

Here's an example. The following will run three commands and write all output (STDOUT and STDERR) into a single logfile.

{ command1 && command2 && command3 ; } > logfile.log 2>&1

Here is what I want to do with the output of these commands:

  • STDERR and STDOUT for all commands go to a logfile, in case I need it later; I usually won't look in here unless there are problems.
  • Print STDERR to the screen (or optionally, pipe it to /bin/mail), so that any error stands out and doesn't get ignored.
  • It would be nice if the return codes were still usable, so that I could do some error handling. Maybe I want to send email if there was an error, like this:

    { command1 && command2 && command3 ; } > logfile.log 2>&1 || mailx -s "There was an error" stefanl@example.org

The problem I run into is that STDERR loses context during I/O redirection: a '2>&1' merges STDERR into STDOUT, and if I instead use '2> error.log' the errors go to the file but never reach the screen.
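For example (a minimal demonstration; `ls /nonexistent` just stands in for a failing command):

    # stderr alone is easy to capture...
    ls /nonexistent 2> error.log          # the error lands in error.log, not on screen

    # ...but after 2>&1 both streams are one, and can no longer be separated
    ls /nonexistent > logfile.log 2>&1    # the error lands in logfile.log too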

Here are a couple of juicier examples. Let's pretend that I am running some familiar build commands, but I don't want the entire build to stop just because of one error, so I use the '--keep-going' flag.

{ ./configure && make --keep-going && make install ; } > build.log 2>&1

Or, here's a simple (and perhaps sloppy) build-and-deploy script, which will keep going in the event of an error.

{ ./configure && make --keep-going && make install && rsync -av /foo devhost:/foo ; } > build-and-deploy.log 2>&1

I think what I want involves some sort of Bash I/O Redirection, but I can't figure this out.

Stefan Lasiewski

5 Answers

(./doit >> log) 2>&1 | tee -a log

This takes stdout and appends it to the log file.

The stderr then gets converted to stdout, which is piped to tee; tee appends it to the log (if you have Bash 4, you can replace `2>&1 |` with `|&`) and sends it on to its own stdout, which will either appear on the tty or can be piped to another command.

I used append mode for both so that, regardless of the order in which the shell redirection and tee open the file, you won't blow away the original. That said, stdout and stderr may still be interleaved in an unexpected order.
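To also satisfy the asker's third requirement (a usable exit status for error handling), here is a minimal sketch building on this pattern. It assumes bash and its `pipefail` option, without which the `||` would test only tee's exit status; `command1`..`command3` and the mailx line are placeholders from the question:

    #!/bin/bash
    # pipefail: the pipeline fails if any part of it fails, not just tee
    set -o pipefail

    { { command1 && command2 && command3 ; } >> logfile.log ; } 2>&1 \
        | tee -a logfile.log \
        || mailx -s "There was an error" stefanl@example.org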

R Samuel Klatchko
  • Your code works, but as soon as I run multiple commands like `( ./doit && ./doit2 >> log ) 2>&1 | tee -a log`, then stdout and stderr for both doit and doit2 are printed to the screen. – Stefan Lasiewski May 21 '10 at 00:37
  • This seems to work if I do something like this: `( { ./doit && ./stdout.sh ;} >> log) 2>&1 | tee -a log` – Stefan Lasiewski May 21 '10 at 00:38
  • 1
    @StefanLasiewski - yes, file redirection only applies to the command immediately before it. Your workaround to group the two command into a single subshell is the easiest fix; another uglier fix would be to redirect stdout for each command `( ./doit >> log && ./doit2 >> log ) 2>&1 | tee -a log` – R Samuel Klatchko May 21 '10 at 00:48
  • Shockingly, something like this seems to work also: `{ { date && ./doit && ./stdout.sh ; } >> log ; } 2>&1 | tee -a log` – Stefan Lasiewski May 21 '10 at 22:17
  • 3
    This does not preserve the ordering of the output. In my simple test case, `log` contained all lines that were printed to stdout first, then those printed to stderr; while the program printed them alternately. – Marian Feb 01 '15 at 20:24
  • Your code gives `Ambiguous output redirect` in csh. But your `Bash 4` suggestion worked for me in csh (tcsh 6.17.00): `( ./doit.csh >> log ) |& tee -a log`. Using only `>` instead of `>>`, as in `( ./doit.csh > log ) |& tee -a log`, also worked fine. – Eduardo Reis May 29 '16 at 03:41

If your system has /dev/fd/* nodes, you can do it like this:

( exec 5>logfile.txt ; { command1 && command2 && command3 ;} 2>&1 >&5 | tee /dev/fd/5 )

This opens file descriptor 5 on your logfile, then executes the commands with standard error redirected to standard out and standard out redirected to fd 5. The pipe therefore carries only the original stderr, which tee duplicates to /dev/fd/5 (the logfile) and to the screen.
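Spelled out with comments (the same command reformatted as a sketch; `command1`..`command3` are the question's placeholders):

    (
      # fd 5 stays pointed at the logfile for the rest of the subshell
      exec 5>logfile.txt

      # 2>&1 sends stderr into the pipe; >&5 then moves stdout to fd 5
      # (the logfile), so tee receives only the original stderr and
      # duplicates it to /dev/fd/5 (the logfile) and to the screen
      { command1 && command2 && command3 ; } 2>&1 >&5 | tee /dev/fd/5
    )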

Geoff Reedy
  • When I execute your command, my logfile only contains the error messages that I see on screen. stdout is lost. – tangens May 20 '10 at 06:40
  • This works also. However, I'm not sure I understand why. But I tried a command like this, and it works: `( exec 5>logfile.txt ; { date && ./doit && ./stdout.sh ;} 2>&1 >&5 | tee /dev/fd/5 )` – Stefan Lasiewski May 21 '10 at 22:35

Here is how to run one or more commands, capturing stdout and stderr to a logfile in the order in which they are generated, while displaying only stderr on whatever terminal screen you like. This works in bash on Linux, and probably in most other environments. I will use an example to show how it's done.

Preliminaries:

Open two windows (shells, tmux sessions, whatever).

I will demonstrate with some test files, so create the test files:

touch /tmp/foo /tmp/foo1 /tmp/foo2

In window1:

mkfifo /tmp/fifo

0</tmp/fifo cat - >/tmp/logfile

Then, in window2:

(ls -l /tmp/foo /tmp/nofile /tmp/foo1 /tmp/nofile /tmp/nofile; echo successful test; ls /tmp/nofile1111) 2>&1 1>/tmp/fifo | tee /tmp/fifo 1>/dev/pts/2

Replace /dev/pts/2 with whatever tty you want the stderr displayed on (running `tty` in that window will print its device).

The reason for the mix of successful and unsuccessful commands in the subshell is simply to generate a mingled stream of output and error messages, so that you can verify their ordering in the log file. Once you understand how it works, replace the `ls` and `echo` commands with scripts or commands of your choosing.

With this method, the ordering of output and error is preserved, the syntax is simple and clean, and there is only a single reference to the output file. Plus, there is flexibility in putting the extra copy of stderr wherever you want.
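If you'd rather not keep a second window open, the same idea works in a single shell with a background reader; a sketch under the same assumptions (bash, a writable /tmp, `command1`/`command2` as placeholders):

    mkfifo /tmp/fifo
    cat </tmp/fifo >/tmp/logfile &   # background reader drains the FIFO into the log

    # stdout goes straight to the FIFO; stderr goes to tee, which copies it
    # to the FIFO (and thus the logfile) and to the current terminal
    ( command1 && command2 ) 2>&1 1>/tmp/fifo | tee /tmp/fifo

    wait   # let the background reader finish flushing the logfile
    rm /tmp/fifo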

Michael Martinez
  • This is essentially the same as R Samuel Klatchko's reply above. It doesn't preserve ordering either. Try it with some simple script à la `echo out 1; echo err 2 >&2; echo out 3; echo err 4 >&2` instead of your ls stuff and you'll see. I assume your example works because while ls is starting up, tee gets some CPU cycles, but that's just pure luck; with a real-world program this wouldn't work. – Marian Feb 01 '15 at 20:39
  • I'll test again tomorrow, but as I recall I already did some stringent tests first time around, including echo and other commands, with consistent results. – Michael Martinez Feb 02 '15 at 02:46

Add this at the beginning of your script:

#!/bin/bash
set -e
outfile=logfile

# stdout: appended to the logfile
exec > >(cat >> "$outfile")
# stderr: tee appends it to the logfile and also passes it through to the console
exec 2> >(tee -a "$outfile" >&2)

# write your code here

STDOUT and STDERR will be written to $outfile; only STDERR will be seen on the console.
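For example, with those two `exec` lines in place (an illustrative test, not part of the original answer):

    echo "this line is recorded in the logfile only"
    echo "this line is recorded in the logfile and shown on the console" >&2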

reto

Try:

command 2>&1 | tee output.txt

Additionally, you can direct stdout and stderr to different places:

command > stdout.txt >& stderr.txt

command > stdout.txt |& program_for_stderr

So some combination of the above should work for you -- e.g. you could save stdout to a file, and send stderr both to a file and to another program (with tee).
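For the record, forms that actually keep the streams apart (a sketch assuming bash; not part of the original answer, and see the comments below for why the lines above don't work):

    # stdout and stderr to different files
    command > stdout.txt 2> stderr.txt

    # stdout to a file; stderr to both a file and the screen
    command > stdout.txt 2> >(tee stderr.txt >&2)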

Ether
  • 1
    Your first line is OK for sending stdout and stderr to both the terminal and the file, but the latter two lines won't work for sending them to different places. The `>stdout.txt` winds up having no effect because when you use `>&` or `|&` it redirects _both_ streams. – David Z May 20 '10 at 05:56
  • The first and third commands won't work, because stderr is redirected to stdout by `2>&1`. This is what I mean by "STDERR loses context during I/O redirection" above. The Bash Reference Manual says `|&` "is shorthand for `2>&1 |`". The second command doesn't print stderr to the screen. – Stefan Lasiewski May 21 '10 at 23:30