Right now I am combining many JSON files, transforming the data, and then writing the results out as rows to a CSV file. The problem is that this data will grow exponentially, and I may run into file size or memory problems in the future.
How would I go about writing to a CSV file, but starting a new file once the current one is greater than 1GB?
This is my code for writing to one file:
import csv
import logging

with open('foo.csv', 'w', encoding='utf-8', newline='') as f:
    writer = csv.writer(f, delimiter="|")  # create the writer once, not per row
    for response_row in load_json():
        try:
            writer.writerow(response_row)
        except Exception as e:
            logging.critical(str(e))
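One possible approach is a minimal sketch along these lines (assuming the same hypothetical load_json() generator as above, and a made-up foo_1.csv, foo_2.csv, ... naming scheme): after each row, check the current write position with f.tell() and roll over to a new file once it passes roughly 1GB.

import csv
import logging

MAX_BYTES = 1 * 1024 ** 3  # roll over once the current file passes ~1 GB


def open_part(part):
    # hypothetical naming scheme: foo_1.csv, foo_2.csv, ...
    return open(f'foo_{part}.csv', 'w', encoding='utf-8', newline='')


part = 1
f = open_part(part)
writer = csv.writer(f, delimiter="|")

try:
    for response_row in load_json():
        try:
            writer.writerow(response_row)
        except Exception as e:
            logging.critical(str(e))

        # f.tell() gives the current write position, which for a file written
        # sequentially in text mode tracks the bytes written so far
        if f.tell() > MAX_BYTES:
            f.close()
            part += 1
            f = open_part(part)
            writer = csv.writer(f, delimiter="|")
finally:
    f.close()

Because load_json() yields one row at a time and each row is written immediately, memory use stays flat regardless of how many part files end up being produced; only the size check and the occasional reopen are added per row.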