
I need to download several thousand individual text files from a climate data repository. Each weather station has its own directory, where simple ASCII text files are stored, one for each day since the station began recording.

Is there a way to write a script to download all of the files from the directory?

http://www.sidpabb.org/data/clima/raw/BeiraInam/
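
For what it is worth, a minimal sketch of what such a script could look like, assuming the directory URL above serves a plain HTML index page and that the third-party requests and beautifulsoup4 libraries (both suggested in the comments below) are installed; the local output directory name is invented for illustration:

    import os

    import requests
    from bs4 import BeautifulSoup

    BASE_URL = "http://www.sidpabb.org/data/clima/raw/BeiraInam/"
    OUT_DIR = "BeiraInam"  # hypothetical local directory for the downloads

    os.makedirs(OUT_DIR, exist_ok=True)

    # Fetch the directory index page and collect every link to a .dat file.
    index = requests.get(BASE_URL)
    index.raise_for_status()
    soup = BeautifulSoup(index.text, "html.parser")

    for link in soup.find_all("a"):
        name = link.get("href", "")
        if not name.endswith(".dat"):
            continue  # skip parent-directory, sorting, and other non-data links
        resp = requests.get(BASE_URL + name)
        resp.raise_for_status()
        with open(os.path.join(OUT_DIR, name), "wb") as f:
            f.write(resp.content)
        print("saved", name)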

Regards

Rudy Van Drie
  • Yes, there is a way. One way is using urllib2 and beautifulsoup. What did you have in mind and what did you try? – Benjamin Gruenbaum Mar 17 '13 at 21:52
  • This may be helpful for you: http://stackoverflow.com/questions/9221022/equivalent-of-wget-in-python-to-download-website-and-resources – Ellochka Cannibal Mar 17 '13 at 21:56
  • I am not at all sure where to begin with coding this (I'm a novice); I will look at the link, thanks – Rudy Van Drie Mar 18 '13 at 06:06
  • The other complication is that I need a username and password to get access (which I have), but how is that used in a script? (A sketch addressing this follows below these comments.) – Rudy Van Drie Mar 18 '13 at 06:44
  • Here is a typical file name: BeiraInam_BeiraInam_2012.4.21.dat; there is one file for each day. – Rudy Van Drie Mar 18 '13 at 14:35
  • bit late to the party here, but use the awesome [requests](http://docs.python-requests.org/en/latest/) library for this. You should be able to specify username/password parameters pretty easily with this as well. – bananafish Jun 25 '13 at 06:41
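
Following up on the credentials question above: a minimal sketch assuming the server uses HTTP Basic authentication (an assumption; if the site uses a login form instead, the script would first need to POST the credentials to that form). The file name is the example from the comments and the credentials are placeholders:

    import requests

    # Placeholder credentials: substitute the real ones issued for the repository.
    session = requests.Session()
    session.auth = ("my_username", "my_password")  # sent as HTTP Basic Auth on each request

    url = ("http://www.sidpabb.org/data/clima/raw/BeiraInam/"
           "BeiraInam_BeiraInam_2012.4.21.dat")
    resp = session.get(url)
    resp.raise_for_status()
    with open("BeiraInam_BeiraInam_2012.4.21.dat", "wb") as f:
        f.write(resp.content)

Using a Session means the credentials (and the underlying connection) are reused automatically, so the same session.get call can replace the plain requests.get calls in the loop sketched above.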

0 Answers