
I have multiple (over 100) wget commands similar to these:

wget -q -nH --cut-dirs=5 -r -l0 -c -N -np -R 'index*' -erobots=off --retr-symlinks http://heasarc.gsfc.nasa.gov/FTP/swift/data/obs/2009_02//00031327004/auxil/
wget -q -nH --cut-dirs=5 -r -l0 -c -N -np -R 'index*' -erobots=off --retr-symlinks http://heasarc.gsfc.nasa.gov/FTP/swift/data/obs/2009_01//00031327001/uvot/
wget -q -nH --cut-dirs=5 -r -l0 -c -N -np -R 'index*' -erobots=off --retr-symlinks http://heasarc.gsfc.nasa.gov/FTP/swift/data/obs/2010_12//00031856009/uvot/
wget -q -nH --cut-dirs=5 -r -l0 -c -N -np -R 'index*' -erobots=off --retr-symlinks http://heasarc.gsfc.nasa.gov/FTP/swift/data/obs/2008_01//00031043003/uvot/
wget -q -nH --cut-dirs=5 -r -l0 -c -N -np -R 'index*' -erobots=off --retr-symlinks http://heasarc.gsfc.nasa.gov/FTP/swift/data/obs/2012_01//00032237004/uvot/

I have been told that this can be done quickly and easily with a bash script. Could someone give me an example of how to run multiple wget commands like these from a script? What do I need to include in it? Apologies for the n00b question, but... I am one!


2 Answers


Assuming your URL list looks like this:

http://heasarc.gsfc.nasa.gov/FTP/swift/data/obs/2009_02//00031327004/auxil/
http://heasarc.gsfc.nasa.gov/FTP/swift/data/obs/2009_01//00031327001/uvot/
http://heasarc.gsfc.nasa.gov/FTP/swift/data/obs/2010_12//00031856009/uvot/
http://heasarc.gsfc.nasa.gov/FTP/swift/data/obs/2008_01//00031043003/uvot/
http://heasarc.gsfc.nasa.gov/FTP/swift/data/obs/2012_01//00032237004/uvot/

you can just read the URLs from that text file in a while loop:

#!/bin/bash
# Read one URL per line from urls.txt and fetch it with wget
while IFS= read -r url; do
    wget -q -nH --cut-dirs=5 -r -l0 -c -N -np -R 'index*' -erobots=off --retr-symlinks "$url"
done < urls.txt
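
If the URLs are not yet in a separate file but only embedded in your existing commands, one possible way to build urls.txt (this assumes the commands are saved in a file named commands.txt, a name used here purely for illustration) is to take the last field of each wget line:

# Grab the URL (the last whitespace-separated field) from each wget line
awk '/^wget/ {print $NF}' commands.txt > urls.txt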

Assuming your noob question is about speeding up the overall process: you can spawn multiple processes in the background and then wait for them all to finish, for example:

wget firsturl &
wget secondurl &
...
wait
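
A minimal sketch of that idea, reusing the urls.txt file from the other answer (the file name and the limit of 4 parallel jobs are assumptions on my part, and -P requires an xargs that supports it, e.g. GNU or BSD xargs), is to let xargs manage the parallel processes for you:

#!/bin/bash
# Run at most 4 wget processes at a time, one per URL read from urls.txt
xargs -n 1 -P 4 wget -q -nH --cut-dirs=5 -r -l0 -c -N -np -R 'index*' -erobots=off --retr-symlinks < urls.txt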