
I'm trying to download all the PDF files from a directory with wget, using this command:

  wget -r -A pdf http://website/~subfolder/

With this command I can only download some of the files. I know there are more PDF files in the directory, because I can reach them through Google. I took a look here:

Download all pdf files using wget

but the solution doesn't work for me. Any suggestions? Thank you!
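One thing worth checking: `wget -r` only downloads files that are *linked* from the pages it crawls, so a PDF that Google has indexed but that no page on the site links to will never be found by a recursive crawl. Under that assumption, a few documented wget options may help (the URL below is the placeholder from the question, and `urls.txt` is a hypothetical file name):

```shell
# robots.txt can silently stop wget's recursion; -e robots=off ignores it
wget -r -A pdf -e robots=off http://website/~subfolder/

# crawl deeper than the default 5 levels, without ascending to the parent directory
wget -r -l 10 -np -A pdf http://website/~subfolder/

# if you can collect the URLs yourself (e.g. from the Google results),
# skip crawling entirely and fetch them directly from a list
wget -i urls.txt
```

If none of these find the extra files, the PDFs are most likely unlinked on the site, and building a URL list by hand is the only option.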

  • Are the files located in subfolders? – danglingpointer Nov 13 '17 at 08:51
  • try this `wget -r -A "*.pdf" http://website/~subfolder/` – danglingpointer Nov 13 '17 at 08:53
  • With your command I obtain the same output. It's very strange, because if I search for the docs on Google I can find them, and they're located in the same subfolder. – Zero G Nov 14 '17 at 11:28
  • UPDATE: I can see all the files thanks to Google. The query to enter in the search bar is: `filetype:pdf site:http://website/~subfolder/` – Zero G Nov 14 '17 at 12:07
  • Does this answer your question? [How to download HTTP directory with all files and sub-directories as they appear on the online files/folders list?](https://stackoverflow.com/questions/23446635/how-to-download-http-directory-with-all-files-and-sub-directories-as-they-appear) – Software Engineer Nov 22 '20 at 08:54

0 Answers