
I am trying to redirect all output from a command-line program to a file. I am using Bash. Some of the output is directed to the file, but some still appears in the terminal and is not stored in the file.

Similar symptoms are described here:

Redirect all output to file

However I have tried the proposed solution (capture stderr) without success:

<cmd> <args> > stdout.txt 2> stderr.txt

The file stderr.txt is created but is empty.

A possible clue is that the command-line programme is a client communicating with a server on the same machine. It may be that some of the output is coming from the server.

Is there a way to capture all the output from the terminal, irrespective of its origin?

EDIT:

I've confirmed that the missing output is generated by the server. When I run the command in a separate terminal, some output appears in both terminals, and I can pipe all the output from the command's terminal to a file. This raises the question of how to capture the server's output, but that's a separate question.

jww
Stefan
  • How would the server write to your terminal without going through the client? Can you be more specific about what the command is? What you have there should be working fine. – Carl Norum May 30 '13 at 17:06
  • If the program directly writes to e.g. `/dev/tty`, instead of one of the standard output streams, there's no (simple) way to capture that. There's also the possibility that it might be duplicating the stdout/stderr file descriptors to another file descriptor and writing there, which you could capture (e.g. `... 3> somefile`), but you would have to know what file descriptor is being used... – twalberg Aug 01 '14 at 16:45
  • The actual problem of missing/partial output may due to output buffer. see this answer: https://stackoverflow.com/a/70209496/8428146 – Y00 Nov 17 '22 at 05:51
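If the missing output really is coming through a duplicated file descriptor, as twalberg suggests above, it can be redirected explicitly once you know the descriptor number. A minimal sketch, simulating a program that writes to the hypothetical descriptor 3:

```shell
# Simulate a program that writes directly to file descriptor 3.
# The trailing 3> redirection opens fd3.txt as descriptor 3 for the
# command, so the write lands in the file instead of on the terminal.
sh -c 'echo "to fd 3" >&3' 3> fd3.txt
cat fd3.txt
```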

6 Answers


You can use this syntax to redirect all output, stderr and stdout, to allout.txt:

<cmd> <args> > allout.txt 2>&1
developer
  • This naive approach will only capture *standard out* and *standard error*. There can be others, for instance password prompts. For this reason the selected answer is more correct. I won't downvote you though :) – robert Dec 10 '14 at 17:02
  • I think it's important to add that this is all related to the broad topic of [I/O Redirection in BASH](http://www.tldp.org/LDP/abs/html/io-redirection.html) – Jake88 Dec 15 '14 at 20:10
  • For this solution the order of redirection is also important: doing the `2>&1` first does not work, since stderr is then duplicated onto the terminal before stdout is redirected. – a1an Mar 05 '18 at 15:30
  • This might be the "naive approach" but the accepted answer missed critical stderr output when I tried it on Ubuntu. This answer worked for me so it gets my naive upvote. – KoZm0kNoT Sep 02 '22 at 19:31
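a1an's point about ordering can be verified with a throwaway command; this sketch uses `sh -c` to emit one line on each stream:

```shell
# Correct order: stdout is pointed at the file first, then 2>&1
# duplicates stderr onto that same (already redirected) target.
sh -c 'echo out; echo err >&2' > both.txt 2>&1

# Wrong order: 2>&1 points stderr at stdout's *current* target, the
# terminal, before > moves stdout, so "err" still hits the screen.
sh -c 'echo out; echo err >&2' 2>&1 > stdout_only.txt
```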

Though not POSIX, bash 4 has the &> operator:

command &> alloutput.txt
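A quick sanity check that `&>` matches the portable form, assuming bash 4 or later (the file name is a placeholder):

```shell
# &> sends stdout and stderr to the same file (truncating it first):
bash -c '{ echo out; echo err >&2; } &> alloutput.txt'
# Equivalent portable spelling:
bash -c '{ echo out; echo err >&2; } > alloutput.txt 2>&1'
# bash also has an appending variant, &>>:
bash -c 'echo more &>> alloutput.txt'
```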

Patrick Pijnappel

If the server is started on the same terminal, then it's the server's stderr that is presumably being written to the terminal and which you are not capturing.

The best way to capture everything would be to run:

script output.txt

before starting up either the server or the client. This will launch a new shell with all terminal output redirected to output.txt as well as to the terminal. Then start the server from within that new shell, and then the client. Everything that you see on the screen (both your input and the output of everything writing to the terminal from within that shell) will be written to the file.

When you are done, type "exit" to exit the shell run by the script command.

Brian Campbell
  • @yanbellavance Usually, the way to indicate that is to simply upvote the answer that you prefer. Downvoting is only supposed to be used for answers that are not helpful at all. Furthermore, for this question, linuxcdeveloper's answer (linuxcdeveloper is the person who answered, Urda just edited the answer) did not actually work, as Stefan's problem was that the output was coming from two different processes; using "script" is the best way to capture all of the output for a session, no matter which process it comes from. – Brian Campbell Mar 20 '14 at 21:56
  • Hear, hear, "script" is the best way to answer OP's specific question. It will literally capture *everything*! – robert Dec 10 '14 at 17:02
  • or `script -c yourcommand output.txt` – wolfrevo Nov 01 '22 at 12:36
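A non-interactive variant of the same idea, assuming the util-linux `script` (the BSD/macOS version takes different options): `-c` records a single command's session instead of an interactive shell, so anything any process prints to that terminal ends up in the file:

```shell
# Record everything printed to the session's terminal, no matter
# which process wrote it; -q suppresses script's start/end banners.
script -qc 'sh -c "echo from-client; echo from-server >&2"' output.txt
grep from-server output.txt   # stderr reached the terminal, so it was captured
```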

I had trouble with a crashing program (*cough* PHP *cough*). Upon a crash, the shell the program ran in reports the crash reason, e.g. Segmentation fault (core dumped), and that message comes from the shell itself rather than from the program's own output streams.

To avoid this output going unlogged, the command can be run in a subshell that will capture and redirect this kind of output:

sh -c 'your_command' > your_stdout.log 2> your_stderr.err
# or
sh -c 'your_command' > your_stdout.log 2>&1
ThorSummoner

You can execute a subshell and redirect all output while still putting the process in the background:

( ./script.sh blah > ~/log/blah.log 2>&1 ) &
echo $! > ~/pids/blah.pid
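The saved PID can later be used to stop or reap the job. A sketch of that follow-up, with placeholder local file names instead of the `~/log` and `~/pids` paths above:

```shell
# Run a long job in a backgrounded subshell, logging all its output.
( sleep 30 > blah.log 2>&1 ) &
echo $! > blah.pid          # remember the subshell's PID

# Later: stop the job and reap it using the recorded PID.
kill "$(cat blah.pid)"
wait "$(cat blah.pid)" 2>/dev/null || true   # exit status reflects the kill
```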
chovy

The proper answer, for capturing the output into a file on a remote machine, is here: http://scratching.psybermonkey.net/2011/02/ssh-how-to-pipe-output-from-local-to.html

your_command | ssh username@server "cat > filename.txt"
Morlock