2

I am running a command which will (very likely) output text to both stderr and stdout. I want to save both stderr and stdout to the same file, but I only want stderr printed to the terminal.

How can I get this to work? I've tried `mycommand 1>&2 | tee file.txt >/dev/null` but that doesn't print anything to the terminal.

leetbacoon
    The thing that's tricky about this is retaining accurate ordering. Just doing the redirection itself is easy, and I'm pretty sure we already have it covered. – Charles Duffy Sep 02 '20 at 23:24
  • (...and by "tricky", I mean that if you need perfect ordering you need to use syscall-level tracing to reconstruct what order the writes _would have_ happened in had they not been redirected through pipelines of different lengths with no synchronization guarantees). – Charles Duffy Sep 02 '20 at 23:25
  • @CharlesDuffy Is it maybe possible with `while` and `read` to read the input line by line and determine which should be printed and which shouldn't? – leetbacoon Sep 02 '20 at 23:28
  • Each `read` operation reads from only one file descriptor. Even if you were using code written in a different language with a `select()` call or other means to listen on multiple FDs at once, though, you still wouldn't have a guarantee that your `select()` call would receive content in the same order that the underlying program wrote it. – Charles Duffy Sep 02 '20 at 23:30
  • (Similarly, each `write()` syscall writes _to_ only one file descriptor, which is the source of the problem. When stdout and stderr are two different copies of the same file descriptor, there exists an absolute ordering between writes; but as soon as you want them processed separately, they can't be copies of the same kernelspace object anymore, and ordering becomes undefined). – Charles Duffy Sep 02 '20 at 23:32
  • Actually, a question: Does the content the program you're running writes to stdout and stderr differentiate itself in a way that lets you know what's output and what's informational based on inspecting the content? In _that_ case, yes, a `while read` loop that routes things will work fine, since you can have both stdout and stderr pointing to a single file descriptor with that loop on the other end. – Charles Duffy Sep 02 '20 at 23:41
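
A minimal sketch of the content-based routing loop described in the comment above, assuming, purely for illustration, that informational lines can be recognized by an `ERROR:` or `WARN:` prefix (those prefixes are not from the thread); merging both streams onto one descriptor before the loop keeps their relative order:

mycommand 2>&1 | while IFS= read -r line; do
  printf '%s\n' "$line" >>file.txt                # every line goes to the log file
  case $line in
    ERROR:*|WARN:*) printf '%s\n' "$line" >&2 ;;  # diagnostic lines also go to the terminal
  esac
done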

1 Answer

4

If You Don't Need Perfect Ordering

Using two separate copies of `tee`, both writing to the same file in append mode but only the stdout copy subsequently discarding its output to `/dev/null`, will get you where you need to be:

mycommand \
  2> >(tee -a file.txt >&2) \
   > >(tee -a file.txt >/dev/null)
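
A side note, not something the answer spells out: the `-a` flag matters here because both `tee` processes have the same file open at once; append mode makes every write land at the current end of the file instead of the two processes overwriting each other from offset 0. It also means the file grows across runs, so truncate it first (for example with `: > file.txt`) if you want a fresh log each time.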

If You Do Need Perfect Ordering

See Separately redirecting and recombining stderr/stdout without losing ordering

Charles Duffy
  • Quite a shame, to say the least. I do need ordering in my case. Thank you regardless for your help. – leetbacoon Sep 02 '20 at 23:33
  • If the underlying program you're running is written in Java, log4j is your friend; if it's in Python, the `logging` standard library module, etc. If the program being run does the multiplexing itself, it can be responsible for retaining order internally, by having both content from log-stream-A and content from log-stream-B written to the same file descriptor whenever it's configured to be copied to the same file (in addition to other file descriptors for any alternate sinks). – Charles Duffy Sep 02 '20 at 23:36
  • ...but yes, that _is_ a shame; means that the simple/clean UNIX model often isn't enough. – Charles Duffy Sep 02 '20 at 23:37
  • I'm writing (well, just finishing up) a bash script, so I guess I was doomed from the start without knowing it :P – leetbacoon Sep 02 '20 at 23:40
  • If it's _your_ script, you have control over how it does logging. Make all your informational logs go through a function that can be configured (maybe w/ an environment variable, but your call on the details) to do two separate writes to different destinations, and you're good. – Charles Duffy Sep 02 '20 at 23:43
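
Following that last suggestion, a minimal sketch of such a logging function for a bash script; the function name `log`, the `LOGFILE` default, and the `LOG_QUIET` knob are illustrative choices, not anything prescribed in the thread:

LOGFILE=${LOGFILE:-file.txt}

log() {
  # Every informational message is written to the log file...
  printf '%s\n' "$*" >>"$LOGFILE"
  # ...and also echoed to the terminal on stderr, unless LOG_QUIET is set.
  # Because the script performs both writes itself, their relative order is preserved.
  if [ -z "${LOG_QUIET:-}" ]; then
    printf '%s\n' "$*" >&2
  fi
}

log "starting up"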