
Is there a way to get wget to download all of the image files directly linked from a given web page?

In this particular case, the web page contains several "img" tags, and I just want to download those images. I can't seem to get this to work with any combination of -r, -l, -p, -A, etc. No matter what I do, wget completely ignores the images, even when I try specifying -r -p -A jpg.

When I use -r, it also seems to generate a lot of requests for the same page with a variety of different query strings attached. It would be nice if I could avoid this somehow too.

Thanks for any pointers.

Pryo
  • Can you please mention the URL so we can help you? –  Dec 01 '12 at 20:57
  • possible duplicate of [How to download all files from a website using wget](http://stackoverflow.com/questions/8755229/how-to-download-all-files-from-a-website-using-wget) – Kijewski Nov 09 '13 at 15:00

1 Answer


wget -nd -p --accept=.jpg <url>

-nd will not create directories, and -p downloads the requisites for the page (the images, stylesheets, and other files the page references).

This is not recursive, so it will only download the jpg images embedded in the given page; you can add -r if you want to do all the pages.
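If the duplicate requests with different query strings are still a problem when recursing, a variant like the following may help. This is only a sketch: it assumes a wget recent enough to have --reject-regex (added in wget 1.14), and <url> stands in for the actual page address.

wget -r -l 1 -nd -p -A jpg,jpeg --reject-regex '\?' <url>

Here -l 1 limits the recursion to links one level from the starting page, -A jpg,jpeg keeps only the JPEG files (wget still fetches the HTML pages to find the links, then deletes them because they don't match the accept list), and --reject-regex '\?' skips any URL containing a query string.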

Yisrael Dov