I have a bash script, run_job.sh, and another script, send_email.sh. I want all of run_job.sh's output to go to a log file, and only the errors (if any) to be passed to send_email.sh. send_email.sh expects two arguments: the email subject and the email body. The error output should be sent as the email body.

Here's my code:

run_job.sh


{
  printf "\n----- PROCESS STARTED %s -----\n" "$(date +%Y-%m-%d\ %H:%M:%S)"

  COMMAND="cd $JOB_FOLDER_PATH && $EXEC_COMMAND"
  printf "Executing command: %s\n\n" "$COMMAND"
  eval "$COMMAND"
  COMMAND_EXIT_CODE=$?

  printf "\n----- PROCESS ENDED %s -----\n\n" "$(date +%Y-%m-%d\ %H:%M:%S)"
} &>> "$OUT_FILE" 2> ERROR_OUTPUT # this doesn't work

if [ $COMMAND_EXIT_CODE -ne 0 ]; then
    bash ./send_email.sh "Job Failed" $ERROR_OUTPUT
fi

send_email.sh

MAIL_SUBJECT=$1
MAIL_BODY=$2
echo "$MAIL_BODY" | \
mail -a "From: Kaizen Service <service@kaizen.co.uk>" \
     -a "Subject: $MAIL_SUBJECT" \
     "$(grep -o '^[^#]*' $MAILING_LIST_FILE | tr '\n' ',')"

Tamir Hen
  • Does this answer your question? https://stackoverflow.com/a/818265/11286032 – Marcelo Paco Mar 14 '23 at 17:35
  • No. These are all redirections to files... – Tamir Hen Mar 14 '23 at 17:40
  • please update the question with details on what you mean by *`this doesn't work`* ... error? no output? wrong output? process hangs? something else? – markp-fuso Mar 14 '23 at 17:45
  • When you say "ALL" the output, do you actually mean "the output *and also the errors*"? If so, say that. The word "output" only means stdout, it _doesn't include_ stderr. – Charles Duffy Mar 14 '23 at 17:50
  • Note that when you do `&>>`, you're redirecting both stdout and stderr, but then when you use `2>` after it you're redirecting stderr, _overriding the redirection of stderr previously done by `&>>`_ so only the stdout part of that prior redirection remains in place. – Charles Duffy Mar 14 '23 at 17:51
  • The way to think about this is that redirections are applied in order, left-to-right, and it's the net result of performing _all_ of them that sets out which file descriptors your program (or, in this case, block of code) has in place when it's started. – Charles Duffy Mar 14 '23 at 17:54
  • You could send the error output to a file and read the input for `send_email.sh` from this file. maybe related: https://stackoverflow.com/q/2559076/10622916 – Bodo Mar 14 '23 at 17:55
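
The left-to-right ordering described in the comments can be seen with a minimal, hypothetical demonstration (the file names here are illustrative, not from the original script):

```shell
#!/usr/bin/env bash
# Redirections are applied left to right: &>> first points both stdout
# and stderr at both.txt, then 2> re-points stderr at err.txt, so only
# stdout still goes to both.txt.
rm -f both.txt err.txt   # start clean so the append is predictable
{
  echo "to stdout"
  echo "to stderr" >&2
} &>> both.txt 2> err.txt

cat both.txt   # contains only "to stdout"
cat err.txt    # contains only "to stderr"
```

This is exactly why the `2> ERROR_OUTPUT` in the question's script "undoes" the stderr half of the preceding `&>>`.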

1 Answer

If I understood your question correctly, there is no simple solution for what you want (redirecting the error output into a bash variable instead of a file, and then using that variable to call another script). One easy way around it is to redirect the errors to a temporary file:

} 1>> "$OUT_FILE" 2> /tmp/Error 

Then read the error back from the temporary file:

ERROR_OUTPUT=$(</tmp/Error)

and then use it in your call:

bash ./send_email.sh "Job Failed" "$ERROR_OUTPUT"

or read the error file (/tmp/Error) directly inside the send_email.sh script

See this thread.
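
Putting the pieces together, here is a sketch of what the corrected run_job.sh could look like. The `mktemp`/`trap` cleanup and the placeholder defaults are my additions for illustration, not part of the original script:

```shell
#!/usr/bin/env bash
# Sketch only: JOB_FOLDER_PATH, EXEC_COMMAND and OUT_FILE would normally
# be set elsewhere; the defaults below are placeholders for illustration.
JOB_FOLDER_PATH=${JOB_FOLDER_PATH:-.}
EXEC_COMMAND=${EXEC_COMMAND:-'echo hello'}
OUT_FILE=${OUT_FILE:-job.log}

ERROR_FILE=$(mktemp)                 # safer than a fixed /tmp/Error path
trap 'rm -f "$ERROR_FILE"' EXIT      # remove the temp file when done

{
  printf '\n----- PROCESS STARTED %s -----\n' "$(date '+%Y-%m-%d %H:%M:%S')"
  printf 'Executing command: %s\n\n' "$EXEC_COMMAND"
  ( cd "$JOB_FOLDER_PATH" && eval "$EXEC_COMMAND" )
  COMMAND_EXIT_CODE=$?
  printf '\n----- PROCESS ENDED %s -----\n\n' "$(date '+%Y-%m-%d %H:%M:%S')"
} 1>> "$OUT_FILE" 2> "$ERROR_FILE"   # stdout to the log, stderr to the temp file

if [ "$COMMAND_EXIT_CODE" -ne 0 ]; then
  bash ./send_email.sh "Job Failed" "$(<"$ERROR_FILE")"
fi
```

Note the quoting around `"$(<"$ERROR_FILE")"`: without it, a multi-word error message would be split into several arguments to send_email.sh.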

j23
  • There _are_ direct solutions, but they're bad -- by "bad", meaning that they lose ordering guarantees. (Normally, when both stdout and stderr are copies of the same file descriptor, doing a write to one then the other will cause the output to be stored in that order in the file; as soon as you do something like what the OP is asking for here, you need a tool like `tee` in play in one file descriptor but not the other; they're no longer copies, so ordering guarantees no longer apply). – Charles Duffy Mar 14 '23 at 17:51
  • @CharlesDuffy thank you for the comment. I have updated the answer. Direct solution seems to be messy long one liner. – j23 Mar 14 '23 at 17:53
  • @j23 thank you for the solution. I thought about it but wasn't sure if this approach smells. Thought there was an obvious solution that I missed but probably not. I'll just use this approach – Tamir Hen Mar 14 '23 at 21:12
  • @TamirHen also assumed it and came across the above SO thread to realise that there is no simple solution. Hope the thread and answer helped. – j23 Mar 15 '23 at 09:44
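
For completeness, the kind of direct (but ordering-losing) approach the comments allude to can be sketched with a command substitution, capturing stderr in a variable while stdout still goes to the log. The variable and file names are illustrative:

```shell
#!/usr/bin/env bash
# Sketch: inside the command substitution, 2>&1 dups stderr onto the
# substitution's pipe, then 1>> re-points stdout at the log file. So
# stderr lands in the variable and stdout in the log. As noted in the
# comments, relative ordering of stdout and stderr is no longer
# guaranteed once the two streams go to different places.
OUT_FILE=${OUT_FILE:-job.log}
ERROR_OUTPUT=$(
  {
    echo "normal output"
    echo "an error" >&2
  } 2>&1 1>> "$OUT_FILE"
)
echo "captured: $ERROR_OUTPUT"
```

The temporary-file approach in the answer is usually easier to read and debug than this.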