I am transferring a 150-200 MB file to many locations (shared drives located across the world) daily. The issue is that each transfer (using shutil) takes roughly 100-700 seconds, and each one has to complete before the next can begin. It now takes about a full hour to transfer some files that way. My temporary workaround was to create a separate .py file for each location so the copies run simultaneously, but that is not ideal.
How can I get started with multi-threaded programming? I'd like to run all of the transfers at once, but I have zero experience with this.
A simple Google search landed me on the concurrent.futures docs:
https://docs.python.org/3/library/concurrent.futures.html.
import shutil
from concurrent.futures import ThreadPoolExecutor

with ThreadPoolExecutor(max_workers=4) as e:
    e.submit(shutil.copy, 'src1.txt', 'dest1.txt')
    e.submit(shutil.copy, 'src2.txt', 'dest2.txt')
    e.submit(shutil.copy, 'src3.txt', 'dest3.txt')
    e.submit(shutil.copy, 'src4.txt', 'dest4.txt')
Can someone point me in the right direction? I have been meaning to learn how to do things in parallel for a while now but never got around to it.
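For my actual use case (one source file copied to many destinations at once), I imagine adapting the docs example would look roughly like this. The function name, the destination list, and the worker count are just placeholders I made up, so I'm not sure this is the right pattern:

```python
import shutil
from concurrent.futures import ThreadPoolExecutor, as_completed

def copy_to_all(src, destinations, max_workers=8):
    """Copy one file to many destinations concurrently.

    Returns a dict mapping destination -> exception for any copy
    that failed, so an empty dict means everything succeeded.
    """
    failures = {}
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        # Submit one copy task per destination; all run concurrently.
        futures = {pool.submit(shutil.copy, src, dest): dest
                   for dest in destinations}
        # as_completed yields each future as soon as its copy finishes.
        for fut in as_completed(futures):
            dest = futures[fut]
            try:
                fut.result()  # re-raises any exception from the copy
            except OSError as exc:
                failures[dest] = exc
    return failures
```

Is something like this reasonable, or is there a better-suited tool for network copies?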