
I'm trying to download a tarball and save it locally with Python. With urllib it's pretty simple:

import urllib    

urllib2.urlopen(url, 'compressed_file.tar.gz')
tar = tarfile.open('compressed_file.tar.gz')
print tar.getmembers()

So my question is really simple: What's the way to achieve this using the urllib2 library?

Dimitris Poulopoulos
  • I'm a little confused. Is the example you posted supposed to be working or broken code? You've imported `urllib` but are using `urllib2`. Also if you're open to it, `requests` is a really nice library for dealing with these sorts of things. – Suever Jan 16 '16 at 20:27

1 Answer


Quoting the docs:

> `urllib2.urlopen(url[, data[, timeout[, cafile[, capath[, cadefault[, context]]]]]])`
>
> Open the URL url, which can be either a string or a Request object.
>
> data may be a string specifying additional data to send to the server, or None if no such data is needed.

Nothing in the `urlopen` documentation says that the second argument is the name of a file where the response should be written.

You need to explicitly write the data read from the response to a file:

import urllib2

r = urllib2.urlopen(url)
CHUNK_SIZE = 1 << 20  # 1 MiB
with open('compressed_file.tar.gz', 'wb') as f:
    # the line below downloads the whole file into memory at once,
    # then dumps it to the file afterwards:
    # f.write(r.read())
    # the preferable lazy solution: download and write data in chunks
    while True:
        chunk = r.read(CHUNK_SIZE)
        if not chunk:
            break
        f.write(chunk)
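Equivalently, the chunked read/write loop can be delegated to the standard library's `shutil.copyfileobj`, which streams between any two file-like objects in chunks. A minimal sketch, with an in-memory `BytesIO` standing in for the `urlopen` response so it runs without a network connection:

```python
import io
import shutil

# BytesIO stands in for the urlopen() response object here;
# any file-like object with a .read() method works the same way
response = io.BytesIO(b"fake tarball bytes")

with open('compressed_file.tar.gz', 'wb') as f:
    # copyfileobj reads and writes in fixed-size chunks, so the
    # whole payload never has to fit in memory at once
    shutil.copyfileobj(response, f)
```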
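Once the archive is saved, it can be inspected with `tarfile` as in the question. A self-contained sketch (it builds a tiny in-memory archive first so there is something to list):

```python
import io
import tarfile

# build a tiny .tar.gz in memory so the example is self-contained
buf = io.BytesIO()
with tarfile.open(fileobj=buf, mode='w:gz') as tar:
    info = tarfile.TarInfo(name='hello.txt')
    payload = b'hello'
    info.size = len(payload)
    tar.addfile(info, io.BytesIO(payload))

# reopen it for reading and list the members, as in the question
buf.seek(0)
with tarfile.open(fileobj=buf, mode='r:gz') as tar:
    names = tar.getnames()

print(names)  # ['hello.txt']
```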
Łukasz Rogalski