
I have been reading through some similar questions (such as How to set a variable to the output from a command in Bash?), but the accepted answers don't seem to work for me. I wasn't sure whether I ought to derail someone else's question or post my own near-duplicate, so apologies if I chose wrong here.

I wish to get the output and exit status of a number of commands in a script I am putting together. Here is an example of what I have been using:

cmd_output=$(rm $file)
exit_status=$?
if [ "${exit_status}" -eq 0 ]
then
    log "Successfully removed the original" ${TAB_LEVEL}
else
    fail "Failed to remove the original, the output was: \n ${cmd_output}"
fi

The log and fail functions are:

# Usage: fail "Failure message"
function fail {
    echo "FATAL ERROR: $1" >> "${LOG_DIR}/${LOG_FILE}"
    exit 1
}

# Usage: log "Log message" 3    Where the tab-level is 3.
function log {
    if (("${2}" > 0))
    then
        eval "printf '    %.0s' {1..$2}" >> "${LOG_DIR}/${LOG_FILE}"
    fi
    echo "$1" >> "${LOG_DIR}/${LOG_FILE}"
    return 0
}

In the example above I use the $(cmd) format, but I have also tried using backticks.

In my log file, all I see when there is a failure is:

FATAL ERROR: Failed to remove the original, the output was: \n

Also, the output of the failed commands still ends up on screen as usual. Is there a common reason that my cmd_output variables would remain empty?

Chris O'Kelly

3 Answers

You have to capture the standard error output stream as well:

cmd_output=$(rm "$file" 2>&1)

Every program has three default streams, identified by numbered file descriptors:

0. Standard input (where the program normally reads from)
1. Standard output (where the program normally writes to)
2. Standard error (where the program normally writes error messages)

So to capture the error messages, we must redirect standard error (stderr) into standard output (stdout), which is then captured by the $(...) expression.
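
For instance, a quick sketch of the difference (no_such_file is just a placeholder name that is assumed not to exist):

out=$(rm no_such_file)          # out stays empty; rm's error text goes straight to the terminal
out=$(rm no_such_file 2>&1)     # out now contains rm's error text
echo "status: $?, output: ${out}"   # $? is still the exit status of the assignment above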

Redirection uses the > operator. Immediately before it you give the file descriptor to redirect (the default is 1, i.e. stdout), and after it you give the target, normally a file name. If you write an ampersand (&) after it instead, the target is another file descriptor. So in this example, 2>&1 redirects file descriptor 2 (stderr) into file descriptor 1 (stdout).

You can also redirect input with the < operator; in that case the default file descriptor is 0 (stdin).
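
A few illustrative redirections (some_cmd and the file names are just placeholders):

some_cmd >  out.log           # stdout (fd 1, the default for >) goes to out.log
some_cmd 2> err.log           # stderr (fd 2) goes to err.log
some_cmd >  all.log 2>&1      # stdout goes to all.log, then stderr is pointed at stdout
some_cmd <  input.txt         # stdin (fd 0, the default for <) is read from input.txt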

One other observation: it is good practice to put your $file variable in double quotes, in case its value contains whitespace.
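
For example (just an illustration, not part of your script):

file="my report.txt"
rm $file      # without quotes this word-splits into two arguments: my and report.txt
rm "$file"    # with quotes rm receives the single name: my report.txt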

Hope this helps a little =)

  • Hey Janito, Thanks very much for your answer! I am aware of the three standard streams and redirection operators, but the only reason I have used them so far is to either send everything from a command to /dev/null or to a log file. I didn't even consider the different streams here. The $file variable was actually originally in quotes, I took it out of quotes when I was testing the error catching functions of my script, to force an error. Again, thanks very much! – Chris O'Kelly Oct 05 '12 at 02:25
  • I was wondering why my script has not worked. `2>&1` has done the trick for me. Thanks – Michael Dec 16 '20 at 13:12

*nix commands generally have two output streams: standard output (stdout) and standard error (stderr).

FOO=$(...) only captures stdout, and leaves stderr unhindered.

If you want to capture stderr with this syntax, you need to append 2>&1 to your command so that stderr is merged into stdout (e.g. rm $file 2>&1).
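
A minimal sketch of the pattern (some_cmd is a placeholder):

FOO=$(some_cmd 2>&1)   # FOO now holds both stdout and stderr of some_cmd
status=$?              # some_cmd's exit status, read before running anything else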

antak
  • Hi antak, just wanted to say thanks a bunch for your answer. As Janito's is a little more detailed I am leaving it as the accepted answer, but I very much appreciate your help :). – Chris O'Kelly Oct 05 '12 at 02:27

Since your fail function is just exiting, it would be a lot easier to simply do:

set -e  # Abort on failure
exec 2>> "${LOG_DIR}/${LOG_FILE}"  # Append all errors to LOG_FILE
if cmd_output=$(rm $file)
then
    log "Successfully removed the original" ${TAB_LEVEL}
fi

The only difference between this and the original code is that it does not print the text FATAL ERROR:. Since brevity is a virtue, it would probably be better to skip the log function entirely. Report errors loudly; succeed silently.

William Pursell
  • Thanks William. In this case I do not want to succeed silently, though if I were writing a package for distribution or most other purposes I would agree. This is in-house software though, and just as important as it actually moving files is that it keeps very clear details as to what's been touched (a lot of what is being moved is sensitive or critical data). I also need the FATAL ERROR prefix - the log file is run through by the machine the data is copied to looking for that or a few other line prefixes. Thanks again though! – Chris O'Kelly Oct 07 '12 at 22:13