You can use starmap()
from multiprocessing.pool.Pool to solve this problem with a process pool.
Given a list of files, say in your working directory, and a location you would like to copy those files to, you can import os
and use os.system()
to run shell commands from Python. This lets you copy the files over with ease.
However, before you start you will need to build a list of argument tuples, res = [(file, target_dir) for file in file_list],
pairing each file with the target directory.
It will look like...
[('test1.pdf', '/home/mcurie/files/pdfs/'), ('test2.pdf', '/home/mcurie/files/pdfs/'), ('test3.pdf', '/home/mcurie/files/pdfs/'), ('test4.pdf', '/home/mcurie/files/pdfs/')]
Obviously, for this use case you could simplify things by storing each file and target directory in a single string to begin with, but that would obscure how this method works.
The idea is that starmap()
will take each tuple in res,
unpack it into the arguments of the function copy_file(source_file, target_dir),
and run the calls in parallel across worker processes (bounded by the number of CPU cores).
Therefore, the first worker process will effectively call
copy_file('test1.pdf', '/home/mcurie/files/pdfs/')
I hope this helps. The full code is below.
from multiprocessing.pool import Pool
import os

file_list = ["test1.pdf", "test2.pdf", "test3.pdf", "test4.pdf"]
target_dir = "/home/mcurie/files/pdfs/"

def copy_file(source_file, target_dir):
    # Shell out to cp; assumes file names contain no spaces or shell metacharacters
    os.system(f"cp {source_file} {target_dir + source_file}")

if __name__ == '__main__':
    res = [(file, target_dir) for file in file_list]
    with Pool() as p:
        p.starmap(copy_file, res)