
I have about 300 small files that I need to download from a website. They are all located in one directory, but they have different sizes and different extensions. I don't want to type each URL into my web browser and then click 'save as' for every file. I want to give my list to Python and have it download and save each file in a directory. If Python can simply download the whole directory, that would be even better.

Cody Brown
  • If this is all you need, use wget or curl. If you really want a Python solution (e.g. you're not on Unix), you can use a package like this one: https://pypi.python.org/pypi/wget (see the sketch after these comments). – michaelmeyer Jun 20 '13 at 14:52
  • Probably easier to just use [`wget`](http://en.wikipedia.org/wiki/Wget) with the `--recursive` option. – Aya Jun 20 '13 at 14:52
  • Go to the website with Chrome and do `right-click + save as`. You will get a folder with all the files on the site along with its HTML. – rassa45 Jun 28 '15 at 16:30
  • @ytpillai lol such a bad solution. That would never work with a GDB. – Cody Brown Jun 28 '15 at 16:42
  • And he said that he specifically did not want to do that......wow I'm an idiot – rassa45 Jun 28 '15 at 16:44
  • I found an answer on the internet but I can't write it in the comments, please ask another question – rassa45 Jun 28 '15 at 16:49
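
For reference, the wget package linked in the first comment exposes a single download() helper. A minimal sketch, assuming that helper's optional output-directory argument and a made-up list of direct file URLs (both the URLs and the "downloads" folder are hypothetical, just for illustration):

import os
import wget

# Hypothetical list of direct file URLs to fetch
urls = [
    "http://example.com/files/a.pdf",
    "http://example.com/files/b.csv",
]
os.makedirs("downloads", exist_ok=True)
for url in urls:
    wget.download(url, out="downloads")  # save each file under ./downloads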

2 Answers


This is all detailed here. I would favor using Requests, as it's generally great, but urllib2 is in the standard library, so it doesn't require installing a new package.
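
With Requests, downloading a list of files is only a few lines. A minimal sketch, assuming a hypothetical base URL, a made-up list of filenames, and a "downloads" output directory (none of these come from the question):

import os
import requests

base_url = "http://example.com/files/"               # hypothetical directory URL
filenames = ["report.pdf", "data.csv", "logo.png"]   # your ~300 names go here
os.makedirs("downloads", exist_ok=True)

for name in filenames:
    response = requests.get(base_url + name)
    response.raise_for_status()                      # fail loudly on missing files
    with open(os.path.join("downloads", name), "wb") as f:
        f.write(response.content)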

richsilv

If you're on Python 3.3, you're looking for urllib.request:

import urllib.request

url = "https://www.google.com/images/srpr/logo4w.png"

# Fetch the URL and write the response body to a local file
with urllib.request.urlopen(url) as response:
    with open("my_image.png", "wb") as file_out:
        file_out.write(response.read())

You should now have a file in your working directory called "my_image.png".
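
Since the question asks about many files rather than one, the same approach extends to a loop over a list of names. A minimal sketch using urllib.request.urlretrieve, with a hypothetical base URL, filename list, and output directory (all made up for illustration):

import os
import urllib.request

base_url = "http://example.com/files/"               # hypothetical directory URL
filenames = ["report.pdf", "data.csv", "logo.png"]   # your ~300 names go here
os.makedirs("downloads", exist_ok=True)

for name in filenames:
    # Download each URL straight to a local path under ./downloads
    urllib.request.urlretrieve(base_url + name, os.path.join("downloads", name))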

ccray