I have a lot of files to download from a server, over 1000, so I thought I'd write a multithreaded script to do it for me so I don't have to wait ages for it to finish. The problem is that it spits out a bunch of errors. I've searched for this, but couldn't find anything that seemed related to the error I'm getting, since I don't print any output from my other threads.
My plan was to have the threads chain-start each other, so that no file gets downloaded twice and none gets skipped.
Thanks for any help!
import thread
import urllib2

mylist = [list of filenames in here]
mycount = 0

def download():
    # fetch the file at the current index of mylist and save it under the same name
    global mycount
    url = "http://myserver.com/content/files/" + mylist[mycount]
    myfile = urllib2.urlopen(url)
    with open(mylist[mycount], 'wb') as output:
        output.write(myfile.read())

def chain():
    # start a download thread for the current file, bump the counter, then recurse
    global mycount
    if mycount <= len(mylist) - 1:
        thread.start_new_thread(download, ())
        mycount = mycount + 1
        chain()

chain()
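
One alternative I was considering (just a sketch; the filenames, base URL and worker count below are made-up placeholders) is to hand each filename out through a Queue to a fixed pool of worker threads, so that every file is taken exactly once and nothing gets skipped:

import threading
import urllib2
from Queue import Queue, Empty

# placeholders -- replace with the real filenames and server path
mylist = ["file1.bin", "file2.bin"]
base_url = "http://myserver.com/content/files/"
num_workers = 8

q = Queue()
for name in mylist:
    q.put(name)

def worker():
    # each worker pulls filenames off the shared queue until it is empty,
    # so every file is handled by exactly one thread
    while True:
        try:
            name = q.get_nowait()
        except Empty:
            return
        data = urllib2.urlopen(base_url + name).read()
        with open(name, 'wb') as output:
            output.write(data)

threads = [threading.Thread(target=worker) for _ in range(num_workers)]
for t in threads:
    t.start()
for t in threads:
    t.join()

Would something like that be a better way to go, or can the chaining version above be made to work?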