
I am trying to write to a text file in append mode. The write() method is called in a function that is run through multiprocessing. I am closing the file at the very end of the code, thinking that everything will have been written before it is closed. But the opposite is happening: the file gets closed in each process. I want it closed only once all the processes have ended.

This is how I am doing it.

import concurrent.futures

f = open('input.txt', 'a')
pairs = ['pair1', 'pair2', 'pair3', 'pair4', 'pair5']

def validity_check(pair):
    f.write(f'{pair}\n')

if __name__ == '__main__':
    with concurrent.futures.ProcessPoolExecutor() as executor:
        idx = 0
        while True:
            executor.map(validity_check, pairs[idx:idx + 5])
            idx = idx + 5
            if idx >= len(pairs):
                break

f.close()

I want all the pairs written to the file before it closes. Thanks!

Umar Aftab

1 Answer


The answer to this question may provide a solution to your problem: Python multiprocessing safely writing to a file

That way you would have a handler/manager process that writes to the file, and all the worker processes hand their results to that handler. The handler manages all file I/O, so the file is opened and closed exactly once. A sketch of the pattern is below.
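
Here is a minimal sketch of that pattern, reusing the validity_check and pairs names from your code. The listener function, the None sentinel, and the use of a Manager queue are illustrative choices, not the only way to do it; a Manager queue is used because its proxy can be pickled and passed to ProcessPoolExecutor workers, unlike a plain multiprocessing.Queue.

import concurrent.futures
import multiprocessing

def listener(queue):
    # Sole owner of the file: keeps writing until it sees the sentinel.
    with open('input.txt', 'a') as f:
        while True:
            msg = queue.get()
            if msg is None:  # sentinel: all workers are done
                break
            f.write(f'{msg}\n')
            f.flush()

def validity_check(pair, queue):
    # Workers never touch the file; they hand results to the listener.
    queue.put(pair)

if __name__ == '__main__':
    pairs = ['pair1', 'pair2', 'pair3', 'pair4', 'pair5']

    manager = multiprocessing.Manager()
    queue = manager.Queue()

    # Start the single writer process before submitting any work.
    writer = multiprocessing.Process(target=listener, args=(queue,))
    writer.start()

    with concurrent.futures.ProcessPoolExecutor() as executor:
        # The with-block waits for every worker to finish before exiting.
        for pair in pairs:
            executor.submit(validity_check, pair, queue)

    queue.put(None)  # tell the listener to stop and close the file
    writer.join()

Because only the listener process ever holds the file handle, there is no interleaving or premature closing, and the file is closed once, after writer.join() returns.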