
I am trying to write a shell script that reads a file line by line and executes a command with its arguments taken from the space-delimited fields of each line.

To be more precise, I need to use wget to download a file from the URL given in the second column to the path given in the first column. But I don't know how to read this file and get the values into the script.

File.txt

file-18.log https://example.com/temp/file-1.log
file-19.log https://example.com/temp/file-2.log
file-20.log https://example.com/temp/file-3.log
file-21.log https://example.com/temp/file-4.log
file-22.log https://example.com/temp/file-5.log
file-23.pdf https://example.com/temp/file-6.pdf

Desired output is

wget url[1] -o url[0]

wget https://example.com/temp/file-1.log -o file-18.log
wget https://example.com/temp/file-2.log -o file-19.log
...
...
wget https://example.com/temp/file-6.pdf -o file-23.pdf
shilch
mebb
  • Does this answer your question? [Bash: read a file line-by-line and process each segment as parameters to other prog](https://stackoverflow.com/questions/7619438/bash-read-a-file-line-by-line-and-process-each-segment-as-parameters-to-other-p) – Léa Gris Dec 08 '20 at 20:28

3 Answers

5

Use read and a while loop in bash to iterate over the file line-by-line and call wget on each iteration:

while read -r NAME URL; do wget "$URL" -o "$NAME"; done < File.txt
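
For use in a script file, the same loop can be written over multiple lines. A minimal sketch, assuming the goal is to save each download under the name from the first column (if so, note that wget's uppercase -O names the downloaded file, while lowercase -o as above only names the log file):

#!/usr/bin/env bash
# Sketch: multi-line form of the one-liner above.
# Each line of File.txt is split into NAME (first field) and URL (second field).
# -O (uppercase) writes the downloaded document to NAME; -o would only set wget's log file.
while read -r NAME URL; do
  wget -O "$NAME" "$URL"
done < File.txt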
shilch
1

Turning a file into arguments to a command is a job for xargs:

xargs -a File.txt -L1 wget -o
  • xargs -a File.txt: Read the arguments from the File.txt file.
  • -L1: Pass the arguments from one input line per command invocation.
  • wget -o: The command to run; each line's fields are appended to it, so the file name lands after -o and the URL follows it (see the dry-run sketch below).
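
To preview the exact commands this expands to before downloading anything, a dry run can be made by putting echo in front of wget (a quick sketch with the same File.txt):

# Dry run: print each constructed command instead of executing it
xargs -a File.txt -L1 echo wget -o
# prints one command per input line:
# wget -o file-18.log https://example.com/temp/file-1.log
# wget -o file-19.log https://example.com/temp/file-2.log
# ...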
Léa Gris
  • There may be issues with this approach if the address contains "&" and it needs quoting. – Raman Sailopal Dec 08 '20 at 20:41
  • @RamanSailopal `xargs` has no issue passing URL argument with an `&` ampersand sign, tested and verified with this URL : `https://example.com/temp/file-1.log?foo=bar&baz=2+qux=corge` – Léa Gris Dec 08 '20 at 21:14
  • It _does_, however, mangle literal backslashes and quotes and split on non-newline whitespace unless used with extensions such as `-d` or `-0`. Without `-d $'\n'` or `-0`, I'm very hesitant to recommend xargs. – Charles Duffy Dec 08 '20 at 22:52
  • @CharlesDuffy If you have literal white-spaces, unencoded backslashes, or quotes in your URL, then you have encoding issues; see https://tools.ietf.org/html/rfc3986 and also https://stackoverflow.com/questions/10438008/different-behaviours-of-treating-backslash-in-the-url-by-firefox-and-chrome – Léa Gris Dec 09 '20 at 07:34
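
Following the concern raised in the comments: with GNU xargs, -d '\n' keeps each whole line as a single argument (no quote or backslash processing), but then the two fields have to be split again before calling wget. A rough sketch of one way to do that, assuming GNU xargs and glob-free file names:

# -d '\n' (GNU extension): one argument per line, no quote/backslash mangling.
# A small shell re-splits the line into its two fields before calling wget.
xargs -a File.txt -d '\n' -n 1 sh -c 'set -f; set -- $1; wget "$2" -o "$1"' sh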
0

You can count using a for loop over the output of seq.

In bash, you can add numbers with arithmetic expansion, e.g. $((C+3)).

This will get you:

COUNT=6
OFFSET=18

for C in $(seq "$((COUNT-1))"); do
  wget https://example.com/temp/file-${C}.log -o file-$((C+OFFSET-1)).log
done

wget https://example.com/temp/file-${COUNT}.pdf -o file-$((COUNT+OFFSET-1)).pdf

Edit: Sorry, I misread your question. So if you have a file with the file mappings, you can use awk to get the FILE and the URL from each line and then do the download:

cat File.txt | while read -r L; do
  FILE="$(echo "${L}" | awk '{print $1}')"
  URL="$(echo "${L}" | awk '{print $2}')"
  wget "${URL}" -o "${FILE}"
done
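
awk can also pull both fields in one pass and build the wget commands itself; a minimal sketch, assuming the same two-column File.txt (drop the "| sh" to only print the commands and check them first):

# $1 is the local file name, $2 is the URL; emit one wget command per line and run it
awk '{print "wget " $2 " -o " $1}' File.txt | sh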
Xoozee