8

I've been trying to write a simple script that reads a list of files to download from one .txt file, reads the download address from a second, separate .txt file, and then downloads each file in a loop. My problem is that I don't know how to do this; I've tried many times but always failed.

file.txt
1.jpg
2.jpg
3.jpg
4.mp3
5.mp4

=====================================

url.txt
url = https://google.com.ph/

=====================================

download.sh
#!/bin/sh
url=$(awk -F = '{print $2}' url.txt)
for i in $(cat file.txt);
do 
wget $url
done

Your help is greatly appreciated.

user3534255
  • Never use `for var in $(command)`. See this answer: http://stackoverflow.com/questions/19606864/ffmpeg-in-a-bash-pipe/19607361?stw=2#19607361. Otherwise, you could use `cut` instead of `awk` in this case. – Idriss Neumann May 05 '14 at 04:49
  • I have a question: what should I do if the address in the URL file is in the second column and the second row? I mean, what would the awk command be for that? – user3534255 May 05 '14 at 06:10
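
For the follow-up question in the comment above: assuming a file where the address sits in the second column of the second row, `awk` can select it by record and field number. (The file contents here are made up for illustration; `example.com` is a placeholder.)

```shell
# NR==2 restricts processing to the second line of the file;
# $2 then prints that line's second whitespace-separated column.
printf 'name address\nsite https://example.com/\n' > urls.txt
awk 'NR==2 {print $2}' urls.txt
# → https://example.com/
```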

2 Answers

10

Other than the obvious issue that R Sahu pointed out in his answer, you can avoid:

  • Using awk to parse your url.txt file.
  • Using `for i in $(cat file.txt)` to iterate through the file.txt file.

Here is what you can do:

#!/bin/bash

# Create an array files that contains list of filenames
files=($(< file.txt))

# Read through the url.txt file and execute wget command for every filename
while IFS='=| ' read -r param uri; do 
    for file in "${files[@]}"; do 
        wget "${uri}${file}"
    done
done < url.txt
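
One caveat with `files=($(< file.txt))`: the unquoted expansion is subject to word splitting and glob expansion, so a filename containing spaces would be broken into several array entries. A safer sketch, assuming bash 4+ is available, uses `mapfile` (the sample filenames are invented for demonstration):

```shell
#!/bin/bash
# mapfile reads one line per array element, so a filename with
# spaces survives intact -- word splitting never applies.
printf '%s\n' '1.jpg' 'my song.mp3' > file.txt   # sample data

mapfile -t files < file.txt
printf '%s\n' "${files[@]}"
# → 1.jpg
# → my song.mp3
```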
jaypal singh
9

Instead of

wget $url

Try

wget "${url}${i}"
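
Putting this fix together with the files from the question, a complete dry-run sketch might look like the following. The leading `echo` only prints the command wget would run (remove it to actually download), and the `awk` separator is widened to ` *= *` so the spaces around `=` in url.txt don't leak into the URL:

```shell
#!/bin/sh
# Dry run: prints the wget command for each filename in file.txt.
printf '%s\n' 1.jpg 2.jpg 3.jpg > file.txt        # sample data from the question
printf 'url = https://google.com.ph/\n' > url.txt

url=$(awk -F ' *= *' '{print $2}' url.txt)
while IFS= read -r i; do
    echo wget "${url}${i}"    # drop the echo to download for real
done < file.txt
```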
R Sahu