
I have a bash script that runs a bunch of commands, including an aws s3api list-objects call. Recently we noticed that stderr contains an error, but the script does not exit with a non-zero code. I hoped that set -e would handle any command failures, but it doesn't seem to do the trick.

Script looks like this:

#!/bin/bash

set -e
# do stuff
aws s3api list-objects --bucket xyz --prefix xyx --output text --query >> files.txt
# do stuff

Error in stderr:
An error occurred (SlowDown) when calling the ListObjects operation (reached max retries: 4): Please reduce your request rate.

Objective: I want the bash script to fail and exit when it encounters a problem with the aws cli commands. I can add an explicit check on $? after each command, but I'm wondering if there is a better way to do this.
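
For reference, the explicit per-command check I have in mind looks roughly like this (a sketch; the --query value below is just a placeholder, since the real one isn't shown above):

if ! aws s3api list-objects --bucket xyz --prefix xyx --output text \
    --query 'Contents[].Key' >> files.txt; then
  # Hypothetical handling; any non-zero exit from the aws cli lands here.
  echo "aws cli command failed" >&2
  exit 1
fi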

edocx
  • As I remember, the aws cli doesn't fail with a non-zero exit code; it returns the error as a string. So you would need to check the result of the aws command as a string (see the sketch after these comments). At least that's what I did in my project. – elbik Dec 17 '19 at 23:19
  • Check http://mywiki.wooledge.org/BashFAQ/105 and [Raise error in a Bash script](https://stackoverflow.com/q/30078281/6862601). – codeforester Dec 18 '19 at 00:06
  • Probably, aws cli isn't exiting non-zero in this case. – codeforester Dec 18 '19 at 00:13
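
A sketch of the string check elbik describes (the variable name and the error-matching pattern are assumptions, not from the thread):

# Capture stdout and stderr together; '|| true' keeps set -e from exiting
# here in case the aws cli does return non-zero after all.
output=$(aws s3api list-objects --bucket xyz --prefix xyx --output text 2>&1) || true
if printf '%s\n' "$output" | grep -q 'An error occurred'; then
  echo "aws cli reported an error: $output" >&2
  exit 1
fi
printf '%s\n' "$output" >> files.txt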

1 Answer


For me, this did the trick:

set -e -o pipefail

The link from @codeforester's comment says:

set -o pipefail is a workaround: it makes a pipeline return the exit code of the rightmost command that failed, instead of the exit status of the last command.
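
A minimal illustration (the sort stage is hypothetical, added just to form a pipeline):

set -e -o pipefail
# Without pipefail, this pipeline exits with sort's status (almost always 0),
# so a SlowDown error from the aws call would be swallowed and set -e would
# never trigger. With pipefail, the aws failure propagates and the script exits.
aws s3api list-objects --bucket xyz --output text | sort >> files.txt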
OzrenTkalcecKrznaric