I have a bunch of jobs to submit, and I want each job's output written to a file instead of printed on the terminal. To this end, I run "nohup ./do.sh &" inside a shell script, "1.sh", which I launch from the terminal with "./1.sh &". However, the jobs run on a computational cluster and the internet connection here is patchy. Is there a way to run the whole job-submission process in the background, so that an internet disruption does not kill it?
The content of 1.sh:
while [ some condition ]
do
    ifort -O3 iDFT_grafted.f90
    nohup ./do.sh &
    cd ../
done
cd ../
The content of do.sh:
./a.out
Additional information: I tried "nohup ./1.sh &", but that command writes the output of all the individual jobs, which live in separate folders, into a single nohup.out file. I want each job's output in a nohup.out file in its own folder.
Update: The problem has been sorted out. The trick is to use "./do.sh > output.txt &" inside the job-submission script "1.sh" and to submit that script with "nohup ./1.sh &". This lets the job submission continue even after I exit the terminal.
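For reference, the fix above can be sketched as a minimal 1.sh. This is an illustrative sketch, not the original script: the folder names job_a and job_b are hypothetical, and a simple echo stands in for the real compile-and-run step (ifort -O3 iDFT_grafted.f90 followed by ./a.out via do.sh).

```shell
#!/bin/sh
# Sketch of a fixed 1.sh. Assumptions (not from the original post):
# job folders are named job_a and job_b, and echo stands in for the job.
for d in job_a job_b; do        # the real script loops with: while [ some condition ]
    mkdir -p "$d"
    # Run each job inside its own folder, redirecting its output to a file
    # in that folder, and background the whole subshell so the loop moves on.
    ( cd "$d" && echo "result for $d" > output.txt ) &
done
wait    # keep 1.sh alive until every backgrounded job has finished
```

The script itself is then submitted with "nohup ./1.sh &", so the SIGHUP sent when the SSH session drops does not kill it; each folder ends up with its own output.txt instead of everything landing in one shared nohup.out.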