
What I'm basically trying to do, within a script, is automatically detect whether a line contains text and, if so, create a new variable containing that line's text. If a line has no text, the variable doesn't get created. I can do this manually by opening the file -

$ cat file.txt
  sometxt
  somemoretext
  evenmoretext
  ...

then adding to my script the appropriate lines -

TXT=file.txt
VAR1=$(sed -n 1p $TXT)
VAR2=$(sed -n 2p $TXT)
...

but this is a pain, since I have to count how many lines there are in total, then copy and paste each assignment, changing 'VAR1' to 'VAR2' and '1p' to '2p'. There has to be an easier way. Thanks

  • Like [this](http://stackoverflow.com/questions/16414410/delete-empty-lines-using-sed)? I don't think you are going to need variables with a proper `sed` or `awk` or `grep` script. – JNevill Feb 19 '16 at 13:50
  • Well, no, because I need each of the variables to be used with another command... – Nathaniel Davidson Feb 19 '16 at 14:03
  • In that case you'll want to [loop through the file](http://stackoverflow.com/questions/1521462/looping-through-the-content-of-a-file-in-bash). Although you may still be able to do everything you want using `awk`, it's a very powerful program. – JNevill Feb 19 '16 at 14:05
  • Thanks for the help. OK, here's what I'm trying to do: I'm writing a script that accepts user input, searches a server for that term, then saves the results (HTTP links) to a text file. The only thing is that the results contain only the path to the file, not the entire URL - like '/path/to/file' instead of www.server.com/path/to/file. So I've made a variable with the server URL called VAR, and I want each variable after that (VAR1, VAR2) to contain the path to the file that's printed on each line of this text file. I then grab the page with curl $VAR$VAR1 etc. – Nathaniel Davidson Feb 19 '16 at 14:22
  • Gotcha. Then the link where you loop through the file with `while read ; do` is the right way to go. Inside the loop you'll have something like `curl ${url}$ ... ` – JNevill Feb 19 '16 at 14:32
  • What shell are you working in? Have you considered using arrays? – Kusalananda Feb 19 '16 at 15:37
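
A minimal sketch of the loop-and-array idea from the comments above, assuming bash and the same file.txt; the echo lines only stand in for whatever command each value is passed to -

# Read every non-empty line of file.txt into a bash array (sketch only).
lines=()
while IFS= read -r line; do
  [ -n "$line" ] && lines+=("$line")   # only keep lines that actually contain text
done < file.txt

echo "${lines[0]}"   # what VAR1 held
echo "${lines[1]}"   # what VAR2 held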

1 Answer


@JNevill thanks for pointing me in the right direction.

Here's what ended up working for me -

for var_name in $(cat links.txt); do   # loop over each whitespace-separated path in links.txt
  wget "<servername.com>$var_name"     # <servername.com> is a placeholder for the real server URL
done

Still don't know how to use curl, but this worked fine!
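
For what it's worth, a curl version of the same loop could look like this - a sketch only, with servername.com still a placeholder, and while read used so the paths aren't word-split -

while IFS= read -r path; do
  curl -O "http://servername.com${path}"   # -O saves the file under its remote name
done < links.txt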

  • [How can I read a file (data stream, variable) line-by-line (and/or field-by-field)?](http://mywiki.wooledge.org/BashFAQ/001) – chepner Feb 19 '16 at 16:50