
I'm trying to create a shell script (e.g., test.sh) that executes multiple commands and redirects each command's stdout and stderr to its own log file.

However, if a command fails, I want the script to abort after writing stderr to the log file, rather than continuing with the remaining commands.

Will the following work, or do I need to do something different?

    command1 > log1.txt 2>&1
    command2 > log2.txt 2>&1
    command3 > log3.txt 2>&1
  • Note that the order of operations for `foo >out` is that it *first* opens `out`, and *then* runs `foo` (with its output streaming to the previously-opened file as it's written). Thus, you don't need to worry about an immediate exit preventing the file from being written. – Charles Duffy Jan 06 '19 at 03:01

1 Answer


Make sure that each command returns a non-zero exit code on failure; then you can use that to bail out of the containing script, e.g.:

    if command1 > log1.txt 2>&1; then
      if command2 > log2.txt 2>&1; then
        command3 > log3.txt 2>&1
      fi
    fi

Or, more tersely:

    command1 > log1.txt 2>&1 &&
    command2 > log2.txt 2>&1 &&
    command3 > log3.txt 2>&1
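Another common option is `set -e`, which makes the shell abort as soon as any command fails. A minimal runnable sketch (the `echo`/`false` commands are placeholders standing in for the real `command1`/`command2`/`command3`):

```shell
# Write the demo script; "false" simulates a failing command2.
cat > demo.sh <<'EOF'
#!/bin/sh
set -e                         # abort on the first failing command
echo "one" > log1.txt 2>&1     # stands in for command1 (succeeds)
false      > log2.txt 2>&1     # stands in for command2 (fails) - script stops here
echo "three" > log3.txt 2>&1   # command3 never runs, log3.txt is never created
EOF

sh demo.sh
echo "demo.sh exited with status $?"   # non-zero: the script aborted at the failure
```

Note that redirection still happens before the command runs, so the failing command's stderr lands in its log file before the script exits.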