
Is there any way to get the download links from a website and put them into a text file, so those files can be downloaded later with wget?

2 Answers


You need to download the source of the website. You can use wget link-of-the-website-you-want-to-grab-links-from for that. Then you can extract the links with sed like this: sed -n 's/.*href="\([^"]*\).*/\1/p' file

See this question for details.
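Taken together, that is a two-step pipeline. A minimal sketch, run against a small saved page instead of a live site (sample.html and the example.com URLs are placeholders; for a real site, sample.html would be the file wget saves):

```shell
# Stand-in for the page source wget would fetch (placeholder URLs):
cat > sample.html <<'EOF'
<a href="http://example.com/a.pdf">first file</a>
<a href="http://example.com/b.jpg">second file</a>
EOF

# Pull out every href value, one per line, into a text file:
sed -n 's/.*href="\([^"]*\).*/\1/p' sample.html > links.txt

cat links.txt        # the list wget will read
# wget -i links.txt  # downloads each URL listed in links.txt
```

Note the sed pattern is greedy, so it keeps only the last href on a line; that is fine when the page puts one link per line.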

Henrik Pingel

With this you can download the jpg files; instead of jpg you can give any file extension that appears in source_file. Your list of download links will be in link.txt

grep -Po 'href=\"\/.+\.jpg' source_file | sed -n 's/href="\([^"]*\)/\1/p' >link.txt; wget -i link.txt  
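One caveat: the grep pattern matches only root-relative links (href="/...), so link.txt ends up holding paths without a host, which wget cannot fetch directly. A sketch of the same extraction on a sample file, with the site prefix prepended before wget runs — source_file's contents and http://example.com are placeholders, and the -P flag needs GNU grep:

```shell
# Stand-in for source_file (placeholder markup and paths):
cat > source_file <<'EOF'
<a href="/img/cat.jpg">cat</a>
<a href="/doc/readme.txt">doc</a>
EOF

# Same extraction as in the answer: keep root-relative .jpg links only
grep -Po 'href="\/.+\.jpg' source_file | sed -n 's/href="\([^"]*\)/\1/p' > link.txt

# The paths lack a host, so prepend the site before feeding wget
# (http://example.com is a placeholder for the real site):
sed 's|^|http://example.com|' link.txt > absolute.txt
cat absolute.txt
# wget -i absolute.txt
```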
Sant