To expand further on my comment to @giant_teapot, the code I used to benchmark was:
#!/usr/bin/env python
import time
import os
import urllib2

# 5 MB mp3 file
testdl = "http://traffic.libsyn.com/timferriss/Arnold_5_min_-_final.mp3"

chunkmulti = 1
numpass = 5
while chunkmulti < 207:
    passtime = 0
    passattempt = 1
    while passattempt <= numpass:
        start = time.time()
        req = urllib2.urlopen(testdl)
        CHUNK = chunkmulti * 1024
        with open("test.mp3", 'wb') as fp:
            while True:
                chunk = req.read(CHUNK)
                if not chunk:
                    break
                fp.write(chunk)
        end = time.time()
        passtime += end - start
        os.remove("test.mp3")
        passattempt += 1
    # average over numpass passes (passattempt is numpass + 1 once the loop exits)
    print "Chunk size multiplier", chunkmulti, "took", passtime / numpass, "seconds"
    chunkmulti += 1
The results weren't conclusive. Here's the first batch of results:
Chunk size multiplier 1 took 13.9629709721 seconds
Chunk size multiplier 2 took 8.01173728704 seconds
Chunk size multiplier 3 took 10.3750542402 seconds
Chunk size multiplier 4 took 7.11076325178 seconds
Chunk size multiplier 5 took 11.3685477376 seconds
Chunk size multiplier 6 took 6.86864703894 seconds
Chunk size multiplier 7 took 14.2680369616 seconds
Chunk size multiplier 8 took 7.93746650219 seconds
Chunk size multiplier 9 took 6.81188523769 seconds
Chunk size multiplier 10 took 7.54047352076 seconds
Chunk size multiplier 11 took 6.84347498417 seconds
Chunk size multiplier 12 took 7.88792568445 seconds
Chunk size multiplier 13 took 7.37244099379 seconds
Chunk size multiplier 14 took 8.15134423971 seconds
Chunk size multiplier 15 took 7.1664044857 seconds
Chunk size multiplier 16 took 10.9474172592 seconds
Chunk size multiplier 17 took 7.23868894577 seconds
Chunk size multiplier 18 took 7.66610199213 seconds
Results continued in the same noisy pattern all the way up to the loop's upper bound of a 206 KB chunk size.
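To quantify how noisy those numbers are, you can drop the 18 timings above into Python 3's statistics module; the spread is large relative to the mean, which is why no clear winner emerges:

```python
import statistics

# per-multiplier average times (seconds) from the first 18 results above
times = [13.9629709721, 8.01173728704, 10.3750542402, 7.11076325178,
         11.3685477376, 6.86864703894, 14.2680369616, 7.93746650219,
         6.81188523769, 7.54047352076, 6.84347498417, 7.88792568445,
         7.37244099379, 8.15134423971, 7.1664044857, 10.9474172592,
         7.23868894577, 7.66610199213]

mean = statistics.mean(times)     # roughly 8.75 s
stdev = statistics.stdev(times)   # sample standard deviation, roughly 2.4 s
print("mean %.2f s, stdev %.2f s" % (mean, stdev))
```

With a standard deviation of over a quarter of the mean, the run-to-run network variance swamps any effect of the chunk size.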
So I set the chunk size to 6 KB. Might have a go at benchmarking this against wget next...