
Here is an example of my command:

wget -r -l 0 -np -t 1 -A jpg,jpeg,gif,png -nd --connect-timeout=10 -P ~/support --load-cookies cookies.txt "http://support.proboards.com/" -e robots=off

Based on the input here

But nothing actually gets downloaded and no recursive crawling happens; the command finishes in just a few seconds. I am trying to back up all the images from a forum. Is the forum structure causing issues?

    Possible duplicate of http://stackoverflow.com/questions/4602153/how-do-i-use-wget-to-download-all-images-into-a-single-folder/21089847#comment33143483_21089847 – mr rogers Feb 21 '14 at 16:35

2 Answers

wget -r -P /download/location -A jpg,jpeg,gif,png http://www.site.here

works like a charm
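As a sketch (the URL and download path below are placeholders, as in the answer), it can help to build the command as a shell array first, so each flag is easy to review before anything is fetched:

```shell
# Build the wget invocation as an array so each flag is easy to audit.
# http://www.site.here/ and /download/location are placeholders.
cmd=(wget
     -r                      # recurse into linked pages
     -P /download/location   # directory to save files into
     -A jpg,jpeg,gif,png     # accept only these file suffixes
     "http://www.site.here/")
printf '%s\n' "${cmd[*]}"    # print the full command for inspection
# To actually run it: "${cmd[@]}"
```

Printing the assembled command before running it makes it obvious when a flag like `-A` or `-P` is missing or misspelled.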

  • in my case this downloads the `robots.txt` file only – vladkras Nov 02 '16 at 11:59
  • in case you only get robots.txt, append `-e robots=off --wait 1` to your wget command. This makes wget ignore robots.txt and fetch the content you are looking for. E.g.: wget -r -P /download/location -A jpg,jpeg,gif,png -e robots=off --wait 1 http://www.site.here – Ishan Sharma May 17 '17 at 10:07
  • Watch out using the recursive `-r` flag. This can cause all sorts of external files to be downloaded too. – clayRay Apr 28 '22 at 03:00

To download a file under a different name, use `-O`. Here I supply `wget.zip` as the output file name, as shown below.

# wget -O wget.zip http://ftp.gnu.org/gnu/wget/wget-1.5.3.tar.gz
--2012-10-02 11:55:54--  http://ftp.gnu.org/gnu/wget/wget-1.5.3.tar.gz
Resolving ftp.gnu.org... 208.118.235.20, 2001:4830:134:3::b
Connecting to ftp.gnu.org|208.118.235.20|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 446966 (436K) [application/x-gzip]
Saving to: wget.zip
100%[===================================================================================>] 446,966     60.0K/s   in 7.5s
2012-10-02 11:56:02 (58.5 KB/s) - `wget.zip' saved [446966/446966]
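Note that `-O` changes only the local file name, not the contents: the file above is still a gzipped tarball despite the `.zip` name. Without `-O`, wget names the saved file after the last path component of the URL, which can be illustrated in plain shell:

```shell
# Without -O, wget saves under the URL's last path component.
url='http://ftp.gnu.org/gnu/wget/wget-1.5.3.tar.gz'
default_name=${url##*/}   # strip everything up to the final '/'
echo "$default_name"      # wget-1.5.3.tar.gz
```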