I have a big list of remote file locations and local paths where I would like them to end up. Each file is small, but there are very many of them. I am generating this list within Python.
I would like to download all of these files as quickly as possible (in parallel) before unpacking and processing them. What is the best library or Linux command-line utility for this? I attempted to implement it with multiprocessing.Pool, but that did not work with the FTP library.
I looked into pycurl, which seemed to be what I wanted, but I could not get it to run on Windows 7 x64.
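To illustrate the kind of setup I'm after, here is a minimal sketch using only the standard library: a thread pool instead of a process pool (threads sidestep the pickling problems that multiprocessing hits with open connection objects, and downloading many small files is I/O-bound anyway), with `urllib.request.urlretrieve` fetching each URL. The function names and the `(url, local_path)` pair format are my own assumptions, not from any particular library.

```python
import concurrent.futures
import urllib.request

def download_one(task):
    """Fetch one remote file to its local path; return (url, error_or_None)."""
    url, dest = task
    try:
        # urlretrieve understands ftp://, http://, and file:// URLs
        urllib.request.urlretrieve(url, dest)
        return url, None
    except OSError as exc:
        return url, exc

def download_all(tasks, max_workers=16):
    """Download a list of (url, local_path) pairs concurrently.

    Uses threads rather than processes: worker threads share the
    interpreter, so nothing needs to be pickled, which is the usual
    stumbling block when combining multiprocessing with ftplib.
    Returns a list of (url, exception) pairs for any failures.
    """
    failures = []
    with concurrent.futures.ThreadPoolExecutor(max_workers=max_workers) as pool:
        for url, err in pool.map(download_one, tasks):
            if err is not None:
                failures.append((url, err))
    return failures
```

This obviously lacks retries and per-host connection limits, but it shows the shape of the solution I was hoping for.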