I have a few scripts:
functions.sh - defines many functions, including work() and abort(). It looks like this:
#!/bin/bash
abort()
{
    message=$1
    echo "Error: $message ..Aborting" >log
    exit 7
}
work()
{
    cp ./testfile ./test1   # ./testfile doesn't exist, so this returns a non-zero status
    if [ $? -eq 0 ]; then
        echo "variable_value"
    else
        abort "Can not copy"
    fi
}
parent.sh - the parent script is the main script. It looks like this:
#!/bin/sh
. ./functions.sh
value=$(work)
echo "why is this still getting printed"
Basically, I have many functions in functions.sh, and I source that file in parent.sh to make all of them available. parent.sh can call any function, and any function in functions.sh can call abort, at which point execution of parent.sh should stop. But that's not happening: parent.sh carries on to the next step. Is there a way to get around this problem?
I realise that it's happening because of the assignment step value=$(work). But is there a way to abort execution right at the abort function call in this case?
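For reference, here is a minimal reproduction of the behaviour I'm seeing (function names are illustrative, not from my real scripts): exit inside a command substitution only terminates the subshell that runs the substituted command, not the calling script:

```shell
#!/bin/bash
# Illustrative sketch: $() runs its command in a subshell,
# so exit there ends only the subshell, not this script.
f() { exit 7; }

v=$(f)    # f's exit 7 terminates the subshell only
echo "still running; subshell exit status was $?"
```

The parent keeps going after the assignment, which matches what parent.sh is doing; the subshell's exit code is only visible via $? immediately after the substitution.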