
I have a shell script with multiple commands, as below:

cmd-1
cmd-2
cmd-3
....
cmd-n

I want the shell script execution to continue even if there is a failure in the middle, e.g. in cmd-3 or cmd-7. To achieve this I used `set +e`. This allows me to continue the execution, but I am unable to capture the exit status of the failed command (since the exit status of the script is always based on the last command). Is there any way to set the exit status of the complete script based on the last failed command?
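For illustration, a minimal sketch of the behavior described above, using `false` and `true` as stand-ins for the real commands:

```shell
#!/bin/sh
set +e   # keep going after a failure (this is also the default)
false    # fails mid-script with status 1, but execution continues
true     # the last command succeeds...
# ...so the script as a whole exits 0 and the earlier failure is lost
```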

user1602397
  • Possible duplicate of [Checking Bash exit status of several commands efficiently](http://stackoverflow.com/questions/5195607/checking-bash-exit-status-of-several-commands-efficiently) – Joe Oct 26 '16 at 10:27
  • check this out: http://stackoverflow.com/questions/5195607/checking-bash-exit-status-of-several-commands-efficiently – aliasav Oct 26 '16 at 10:28
  • Thanks Joe & aliasav for the above links. However, `set -e` and `&&` could not solve the purpose, since they terminate the script whenever an error occurs. I want to continue till the last command even in case of failure, but I want the exit status of the complete script to be based on the exit status of the last command that failed. Similar to `set -o pipefail` when commands are piped; in my case the commands are not piped. – user1602397 Oct 26 '16 at 10:58
  • I'm not sure I see the point of signaling that an unknown process failed, especially when it wasn't serious enough to prevent the remaining commands from running. – chepner Oct 26 '16 at 11:23

1 Answer


You can achieve it like this. It is not the cleanest way, but it is quite easy:

exit_code=0
cmd-1 || exit_code=$?
cmd-2 || exit_code=$?
cmd-3 || exit_code=$?
....
cmd-n || exit_code=$?
exit $exit_code

The cleaner way would be to split your code into functions and check the result of your commands there.

Hardy Rust