
Consider the case where I have a very long bash script with several commands. Is there a simple way to check the exit status of all of them, so that if any one fails I can show which command failed and its return code?

I mean, I don't want to add an explicit check after each one of them, like the following:

my_command
status=$?
if [ $status -ne 0 ]; then
    # error case
    echo "error while executing my_command, return code: $status"
    exit 1
fi

– rkachach

3 Answers


You can do trap "cmd" ERR, which invokes cmd when a command fails. However, this solution has a couple of drawbacks; for example, it does not catch a failure inside a pipe.
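For illustration, a minimal bash sketch of an ERR trap that reports the failing command and its exit status (my_command1 and my_command2 are placeholder names):

#!/bin/bash

# $BASH_COMMAND holds the command that triggered the trap, $? its exit status.
trap 'echo "command failed: $BASH_COMMAND (exit $?)" >&2; exit 1' ERR

my_command1

# not reached if my_command1 fails
my_command2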

In a nutshell, you are better off doing your error management properly, on a case-by-case basis.

– Hellmar Becker

One can test the value of $?, or simply put set -e at the top of the script, which will cause it to exit on the first command that fails.

#!/bin/sh

# -e: exit on the first failing command; -x: print each command as it runs
set -xe

my_command1

# never makes it here if my_command1 fails
my_command2
  • The way I program is to use `set -xe` for everything, exiting upon the first error and reporting all commands to the user. I updated the post to reflect this. –  Oct 07 '15 at 13:09
  • If you want a specific command not to trigger an exit under `set -xe`, use: `bad_command || true` –  Oct 07 '15 at 13:10
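A small sketch of that `|| true` pattern (optional_step and required_step are placeholder command names):

#!/bin/sh

set -xe

# a failure of this command is tolerated; the script keeps going
optional_step || true

# a failure here still aborts the script because of set -e
required_step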

You can write a wrapper function that runs the command and reports any failure:

# Runs the given command and reports a failure together with its exit status.
# Note: the name "test" shadows the shell builtin of the same name.
function test {
    "$@"
    local status=$?
    if [ $status -ne 0 ]; then
        echo "error with $1 (exit status $status)" >&2
    fi
    return $status
}

test command1
test command2

– Nullpointer
  • this seems copied from https://stackoverflow.com/a/5195741/862225, no? If true, it would be nice to add a reference – Kostis Feb 11 '21 at 19:20