
I have three distinct lists, each containing a lot of information. I would like to write each of them to its own file, but at the same time.

I came up with this code, but how can I adapt it to run for the other files & lists (1 & 2) at the same time?

import csv
import os

print(len(list_0))
print(len(list_1))
print(len(list_2))

outfile0 = 'corpus_phrases_mais.tsv'
outfile1 = 'corpus_phrases_lexique.tsv'
outfile2 = 'corpus_phrases_exp.tsv'

sous_dir = 'corpus_extract'

out_path = os.path.join(outdir, sous_dir)  # outdir is defined earlier in my script
if not os.path.exists(out_path):
    os.makedirs(out_path)


with open(os.path.join(out_path, outfile0), 'w', newline='', encoding='utf-8') as f_out:
    tsv_output = csv.writer(f_out, delimiter='\t')  # \t => separator
    tsv_output.writerow(['Verbatim'])  # write the header line

    for idx, line in enumerate(list_1):
        tsv_output.writerow([line])

    print('Finished writing sentences to {}.'.format(out_path))
Does this answer your question? [How can I open multiple files using "with open" in Python?](https://stackoverflow.com/questions/4617034/how-can-i-open-multiple-files-using-with-open-in-python) – Tomerikoo May 12 '20 at 14:30
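The approach from the linked question, opening all three files in a single `with` statement and writing the lists in one pass, would look roughly like this (the file names and sample lists here are shortened stand-ins for the question's variables):

```python
import csv

list_0, list_1, list_2 = ['a'], ['b'], ['c']  # stand-ins for the real lists

# One `with` statement managing all three output files at once
with open('out_mais.tsv', 'w', newline='', encoding='utf-8') as f0, \
     open('out_lexique.tsv', 'w', newline='', encoding='utf-8') as f1, \
     open('out_exp.tsv', 'w', newline='', encoding='utf-8') as f2:
    for data, f in zip([list_0, list_1, list_2], [f0, f1, f2]):
        writer = csv.writer(f, delimiter='\t')
        writer.writerow(['Verbatim'])  # header line
        for line in data:
            writer.writerow([line])
```

This writes the files one after another rather than truly in parallel, which for plain file output is usually just as fast.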

1 Answer


You can create a function `write_in_file(data, filepath)` and run it in separate processes using the `multiprocessing` package (https://docs.python.org/fr/3/library/multiprocessing.html).

from multiprocessing import Process
...
p1 = Process(target=write_in_file, args=(output0, filepath0))
p2 = Process(target=write_in_file, args=(output1, filepath1))
p3 = Process(target=write_in_file, args=(output2, filepath2))
p1.start()
p2.start()
p3.start()

# Wait for the processes to finish
p1.join()
p2.join()
p3.join()
GabrielC