
I have a bash script that executes multiple times in a single job. If the script fails, I want to exit the job and mark the job as failed.

plan.sh

#!/bin/bash
terraform init 
terraform plan -out=${PLAN_FILE_NAME}

gitlab job

plan:
  stage: plan
  before_script:
    - chmod +x ${CI_PROJECT_DIR}/gitlab/deploy/plan/plan.sh
    - export PATH="$PATH:${CI_PROJECT_DIR}/gitlab/deploy/plan"
  script:
    - cd ${FOLDER1}
    - plan.sh

    - cd ${FOLDER2}
    - plan.sh

    - cd ${FOLDER3}
    - plan.sh
  artifacts:
    paths:
      - "${FOLDER1}/${PLAN_FILE_NAME}"
      - "${FOLDER2}/${PLAN_FILE_NAME}"
      - "${FOLDER3}/${PLAN_FILE_NAME}"

Currently, if terraform init or terraform plan fails for a folder, the script keeps executing for the next folder, and the job is ultimately marked as successful.
I want the job to exit and be marked as failed when there is an error during terraform init or terraform plan.

What would be a better way to handle this?

  • I think gitlab pipelines get marked as failed as soon as the script stage exits with a non-zero exit code. Why not do something like `if ! terraform init; then exit 1; fi` etc etc – Tyler Stoney Feb 01 '23 at 22:21
  • 1
    Basically GitLab already does what you’re asking, but your shell script isn’t exiting and returning a nonzero exit code when an error is encountered on one of the lines – Michael Delgado Feb 01 '23 at 22:30
  • So, in very brief, change `#!/bin/bash` to `#!/bin/bash -e` or add `set -e` as the next line of the script. (The latter is somewhat more robust, in that it also works if you explicitly run the script with `bash plan.sh` instead of the otherwise required `bash -e plan.sh`. But you are simply running `plan.sh` so this point is moot here.) – tripleee Feb 02 '23 at 06:10
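Putting the comments above together, a minimal sketch of the fixed plan.sh (assuming the terraform binary and the PLAN_FILE_NAME variable are provided by the job environment, as in the question):

```shell
#!/bin/bash
# Abort on the first failing command (-e), treat unset variables as
# errors (-u), and propagate failures through pipelines (pipefail).
set -euo pipefail

terraform init
terraform plan -out="${PLAN_FILE_NAME}"
```

With this, a failure in `terraform init` skips `terraform plan`, the script exits with a non-zero code, GitLab stops the `script:` section at the failing line, and the job is marked as failed.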

0 Answers