I am using aria2 to download some data with the option --on-download-complete to run a bash script automatically to process the data.
aria2c --http-user='***' --http-passwd='***' --check-certificate=false --max-concurrent-downloads=2 -M products.meta4 --on-download-complete=/my/path/script_gpt.sh
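(For context: if I understand the aria2 docs correctly, the hook command is called with three arguments, the GID, the number of files and the file path. My script below does not use them and simply globs the download directory; a hypothetical hook that did use them would look roughly like this.)

#!/bin/bash
# Sketch only: how the hook receives its arguments from aria2
gid="$1"        # download GID
num_files="$2"  # number of files in the download
path="$3"       # path of the (first) downloaded file
echo "Completed $path (gid $gid, $num_files file(s))"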
Focusing on my bash script:
#!/bin/bash
oldEnd=.zip
newEnd=_processed.dim
# Iterate over the downloaded products (glob instead of parsing ls)
for i in /my/path/S1*.zip
do
    # Skip products that already have a processed output
    if [ -f "${i%$oldEnd}$newEnd" ]; then
        echo "Already processed"
    else
        gpt /my/path/graph.xml -Pinput1="$i" -Poutput1="${i%$oldEnd}$newEnd"
    fi
done
Basically, every time a download is finished, a for loop starts. First it checks whether the downloaded product has already been processed and, if not, it runs a specific task.
My issue is that every time a download is completed, the bash script is run. This means that if the analysis from the previous run has not finished yet, both tasks overlap and eat all my memory resources.
Ideally, I would like the following: each time the bash script is run, check if there is still an ongoing process. If so, wait until it is finished and only then run.
It is like creating a queue of tasks (like a for loop where each iteration waits until the previous one is finished).
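To illustrate the behaviour I am after, something along these lines is what I mean (a rough, untested sketch using flock; the lock-file path is only a placeholder):

#!/bin/bash
# Untested sketch: serialize invocations with an exclusive lock so a new
# run blocks until the previous one has released the lock.
(
    flock -x 200   # block here until the lock on fd 200 is free

    for i in /my/path/S1*.zip
    do
        if [ -f "${i%.zip}_processed.dim" ]; then
            echo "Already processed"
        else
            gpt /my/path/graph.xml -Pinput1="$i" -Poutput1="${i%.zip}_processed.dim"
        fi
    done
) 200>/tmp/gpt_processing.lock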
I have tried to implement a solution with wait or by identifying the PID, but nothing successful.
Maybe I should change the approach: instead of using aria2 to trigger processing of the data that has just been downloaded, implement another solution?