I have a simple script that reads values from one CSV, runs an internal function on each of them (which takes 2-3 seconds per row), and then writes the results into another CSV file.
Here is what it looks like, minus the internal function I referenced.
import csv
import time

pause = 3

with open('input.csv', mode='r') as input_file, \
        open('output.csv', mode='w') as output_file:
    input_reader = csv.DictReader(input_file)
    output_writer = csv.writer(output_file, delimiter=',', quotechar='"',
                               quoting=csv.QUOTE_MINIMAL)
    count = 1
    for row in input_reader:
        # placeholder for the internal function that takes 2-3 seconds per row
        row['new_value'] = "result from function that takes time"
        output_writer.writerow(row.values())
        print('Processed row: ' + str(count))
        count += 1
        time.sleep(pause)
The problem I face is that the output.csv file remains blank until the whole script has finished executing. I'd like to access and make use of the file elsewhere while this long script runs. Is there a way to stop the values from being written to output.csv only at the very end?
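One idea I have been wondering about (but am not sure is the right approach) is explicitly flushing the output file after each row so the buffered data actually lands in the file as the script goes. A minimal sketch of that, assuming a flush() call after every writerow is enough to make the partial output visible to other programs:

import csv
import time

pause = 3

with open('input.csv', mode='r') as input_file, \
        open('output.csv', mode='w') as output_file:
    input_reader = csv.DictReader(input_file)
    output_writer = csv.writer(output_file, delimiter=',', quotechar='"',
                               quoting=csv.QUOTE_MINIMAL)
    count = 1
    for row in input_reader:
        row['new_value'] = "result from function that takes time"
        output_writer.writerow(row.values())
        # Assumption: flushing here pushes the buffered row out to disk,
        # so output.csv can be read while the script is still running.
        output_file.flush()
        print('Processed row: ' + str(count))
        count += 1
        time.sleep(pause)

Is this the sensible way to do it, or is there a better option (line buffering, reopening the file per row, etc.)?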
Edit: here is a dummy input CSV file for the script above:
value
43t34t34t
4r245r243
2q352q352
gergmergre
435q345q35