I have multiple .gz files that add up to 1 TB in total. How can I use Python 2.7 to decompress these files in parallel? Looping over the files one at a time takes too much time.
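For reference, the plain sequential loop I started from looks roughly like this (filesFolder is the directory that holds the archives):

import glob
import gzip
import shutil

# sequential baseline: one archive at a time, far too slow for ~1 TB of input
for path in glob.glob(filesFolder + '*.gz'):
    with gzip.open(path, 'rb') as src, open(path[:-3], 'wb') as dest:
        shutil.copyfileobj(src, dest)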
I tried this code as well:
import glob
import gzip
import multiprocessing
import shutil

filenames = glob.glob(filesFolder + '*.gz')
def uncompress(path):
    # path[:-3] drops the '.gz' suffix (rstrip('.gz') strips characters, not a suffix)
    with gzip.open(path, 'rb') as src, open(path[:-3], 'wb') as dest:
        shutil.copyfileobj(src, dest)

with multiprocessing.Pool() as pool:
    for _ in pool.imap_unordered(uncompress, filenames, chunksize=1):
        pass
However, I get the following error:
with multiprocessing.Pool() as pool:
AttributeError: __exit__
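I suspect the cause is that multiprocessing.Pool only became usable as a context manager in Python 3.3, so the with statement has no __exit__ to call on 2.7. Would a plain try/finally be the right way around it? A minimal sketch, reusing uncompress and filenames from the snippet above:

import multiprocessing

pool = multiprocessing.Pool()
try:
    # consume the iterator so every file actually gets decompressed
    for _ in pool.imap_unordered(uncompress, filenames, chunksize=1):
        pass
finally:
    pool.close()   # stop accepting new tasks
    pool.join()    # wait for the worker processes to exit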
Thanks!