
I really want to download images from a website, but I don't know enough about wget to do so. The images are hosted on a separate website, so how do I pull the image links from the page (using cat or something similar) so that I can feed them to wget and download them all? All I know is the wget part. An example would be Reddit.com.

  wget -i download-file-list.txt
c0rruptbytes
  • You can't use only wget. You're going to have to write a shell script that uses pipes and regex (see the sketch after these comments). – Alex W Jul 29 '12 at 04:16
  • This explains how to download a whole website with wget, which will include the images: http://www.thegeekstuff.com/2009/09/the-ultimate-wget-download-guide-with-15-awesome-examples/ – Alex W Jul 29 '12 at 04:18
  • Possible duplicate of http://stackoverflow.com/questions/4602153/how-do-i-use-wget-to-download-all-images-into-a-single-folder – mr rogers Feb 21 '14 at 16:38
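
A minimal sketch of the pipeline that comment describes, as a bridge to the wget -i call in the question. It is only an illustration, not robust HTML parsing: it assumes the page embeds images with plain <img src="..."> tags, that the URLs are absolute (they should be if the images sit on a separate host), and that curl and a grep supporting -o are available. Note that cat only reads local files, so the page itself has to be fetched with curl or wget first.

# Fetch the page, pull out the <img ... src="..."> attributes, keep the
# absolute URLs, and write them to the list file that wget -i expects.
curl -s 'http://reddit.com/some/path' \
  | grep -oE '<img[^>]+src="[^"]+"' \
  | grep -oE 'https?://[^"]+' \
  > download-file-list.txt

wget -i download-file-list.txt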

2 Answers


Try this:

wget -r -l 1 -A jpg,jpeg,png,gif,bmp -nd -H http://reddit.com/some/path

It will recurse 1 level deep starting from the page http://reddit.com/some/path, it will not create a directory structure (if you want directories, remove the -nd), it will only download files ending in "jpg", "jpeg", "png", "gif", or "bmp", and it will span hosts (-H), which matters here because the images are hosted on a different site than the page.
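
Because -H lets the recursion span any host, it can also wander off to hosts you don't care about. If you know which domains actually serve the images, you can restrict the spanning with wget's -D/--domains option. The domain names below are only placeholders for whatever hosts the images really live on:

wget -r -l 1 -H -D reddit.com,i.imgur.com -A jpg,jpeg,png,gif,bmp -nd http://reddit.com/some/path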

Jon Lin

I would use the Perl module WWW::Mechanize. The following dumps all links on the page to stdout:

use strict;
use warnings;
use WWW::Mechanize;

# Fetch the page and print every link it contains, one absolute URL per line
my $mech = WWW::Mechanize->new();
$mech->get("URL");
$mech->dump_links(undef, 1);    # undef = print to STDOUT, 1 = absolute URLs

Replace URL with the actual URL you want.
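
To tie this back to wget, you can filter the dumped links for image extensions and hand them to wget -i. A rough example, assuming the snippet above is saved as dump_links.pl (the file name and the extension filter are just illustrative assumptions):

perl dump_links.pl \
  | grep -Ei '\.(jpe?g|png|gif|bmp)$' \
  > download-file-list.txt

wget -i download-file-list.txt

(WWW::Mechanize also has a dump_images method if you want the src attributes of <img> tags rather than the page's links.)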

Thor
  • Nice method. However, can you change this so that it will take one link at a time from a text file? – Cajuu' May 26 '15 at 14:28