
I have three scripts:

  • Script1 - Deletes leftover files and copies new files for processing
  • Script2 - Executes the COBOL program for the newly copied files
  • Script3 - Executes another COBOL program for comparison and creates a zip of the required output files

Is there a way to execute them in this order: if Script1 completes successfully, execute Script2; if Script2 completes successfully, execute Script3; when Script3 completes, send an email that the task is complete. If a failure happens during the execution of any script, send an email naming the script that failed.

As of now, I am executing the scripts manually in sequence and was looking for a way to automate the execution.

Here is the code that I tried, but the execution sequence gets mixed up: for example, script3 executed before script2. I am sure that I am missing other checks as well.

#!/bin/sh
#
nohup ./fgaudext1.sh >/data/shell/fgaud/log/nohup_fgaudext1.txt 2>&1 &
pid_1=$(pidof fgaudext1.sh)
wait ${pid_1}
nohup ./fgaudext2.sh >/data/shell/fgaud/log/nohup_fgaudext2.txt 2>&1 &
#
pid_2=$(pidof fgaudext2.sh)
wait ${pid_2}
nohup ./fgaudext3.sh >/data/shell/fgaud/log/nohup_fgaudext3.txt 2>&1 &
#
pid_3=$(pidof fgaudext3.sh)
wait ${pid_3}
#
echo "Recon complete for AUDIT files"

  • Does this answer your question? [Bash: Exit and cleanup on error](https://stackoverflow.com/questions/36335186/bash-exit-and-cleanup-on-error) – Aserre Jul 05 '23 at 13:03
  • just use `./script1.sh; ./script2.sh; ./script3.sh` to execute the scripts sequentially, and use the `trap` keyword along with `set -e` to send the email – Aserre Jul 05 '23 at 13:04
  • You have first to define what a _successful execution_ actually is. If the 3 scripts are written in a reasonable way, they should signal failure via an exit code, and you can simply do a `script1.sh && script2.sh && script3.sh &`. It does not make sense to execute the first two scripts in the background, since the script has to terminate in order to let you know whether or not it was successful. – user1934428 Jul 05 '23 at 13:30
  • Doesn't semicolon (;) execute the next script even if previous script failed? – Rahul Dubey Jul 05 '23 at 13:52
  • @user1934428 - A successful execution will return 0. Also, I need to execute the scripts in background so that the terminal can be used for other tasks. I need to execute all the processes in background only – Rahul Dubey Jul 05 '23 at 13:59
  • During `wait`, your terminal is occupied, so this script doesn't achieve your aim. You can do that by putting the whole pipeline in the background as user1934428 suggests. – Toby Speight Jul 05 '23 at 14:05
  • If you want to use the terminal for other things while the programs are running, or you want them to continue when the user logs out, consider using `tmux` or `screen`. See [How to prevent a background process from being stopped after closing SSH client in Linux](https://stackoverflow.com/q/285015/4154375), [Getting ssh to execute a command in the background on target machine](https://stackoverflow.com/q/29142/4154375), and [How to make a program continue to run after log out from ssh?](https://stackoverflow.com/q/954302/4154375). – pjh Jul 05 '23 at 19:04
  • Nothing speaks against running the whole **script** in the background. In your approach, you executed each command individually in the background. Also, please remove the _bash_ tag. bash does not seem involved in your problem. – user1934428 Jul 06 '23 at 05:20
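The `&&` chain that user1934428's comment describes can be sketched as follows. The single trailing `&` backgrounds the whole chain as one unit (not each script individually), and `&&` stops at the first script that exits non-zero. The combined log file name here is an assumption:

```shell
# Run the whole chain in the background as one unit.
# && stops at the first script that exits with a non-zero status.
( ./fgaudext1.sh && ./fgaudext2.sh && ./fgaudext3.sh ) \
    >/data/shell/fgaud/log/chain.txt 2>&1 &
```

The terminal stays free while the subshell runs, and `wait $!` (or checking the log) tells you how far the chain got.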

1 Answer


First, if this is bash, use the proper shebang. Then you can exit with the number of the script that failed.

#!/bin/bash
# Run the three scripts in order; if script n fails, exit with status n.
for n in {1..3}; do
  /path/to/fgaudext${n}.sh >/data/shell/fgaud/log/nohup_fgaudext${n}.txt 2>&1 || exit $n
done
echo 'Send email'
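To also cover the email requirement and keep the terminal free, a loop like the one above can be wrapped in a function and the whole thing backgrounded once. This is only a sketch: `mailx` and the recipient address are assumptions (substitute whatever mailer your system uses), and the log directory from the question is made overridable via `LOGDIR`:

```shell
#!/bin/bash
# Sketch of a driver that chains the three scripts and emails on success
# or failure. mailx and the recipient address are assumptions.
LOGDIR="${LOGDIR:-/data/shell/fgaud/log}"

notify() {
  # Mail the message if mailx is available; otherwise just print it,
  # so the sketch still runs on systems without a configured mailer.
  if command -v mailx >/dev/null 2>&1; then
    echo "$2" | mailx -s "$1" user@example.com
  else
    echo "$1: $2"
  fi
}

run_recon() {
  local n
  for n in 1 2 3; do
    if ! "./fgaudext${n}.sh" > "${LOGDIR}/nohup_fgaudext${n}.txt" 2>&1; then
      notify "AUDIT recon FAILED" "fgaudext${n}.sh failed; see its log"
      return "$n"
    fi
  done
  notify "AUDIT recon complete" "Recon complete for AUDIT files"
  return 0
}

# Background the whole chain once, e.g.:
#   nohup ./driver.sh >/dev/null 2>&1 &
```

The failed script's number becomes the return status, so a caller (or the email subject) can tell exactly where the chain stopped.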
Diego Torres Milano