Python's multiprocessing.Pool.imap is very convenient for processing large files line by line:
    import multiprocessing

    def process(line):
        processor = Processor('some-big.model')  # this takes time to load...
        return processor.process(line)

    if __name__ == '__main__':
        pool = multiprocessing.Pool(4)
        with open('lines.txt') as infile, open('processed-lines.txt', 'w') as outfile:
            for processed_line in pool.imap(process, infile):
                outfile.write(processed_line)
As written, each call to process reloads the model. How can I make sure that helpers such as Processor in the example above are loaded only once per worker process? Is this possible at all without resorting to a more complicated/verbose structure involving queues?