I get an error when importing a .csv file with the csv module: once a field exceeds 131,072 characters, csv.reader refuses to parse it, even though csv.writer happily exported the file with those oversized fields. The huge fields are my dictionary values; the keys are small. Do I need a different file format to store dictionaries with huge values?
I use csv throughout my program, and using it consistently is convenient. If a second format is unavoidable, what is a good alternative? I'd like to store values that could be thousands to millions of characters long.
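For comparison, here's a json version of my helpers; as far as I can tell the json module imposes no per-field size limit, so huge string values round-trip fine (json_export/json_import are just placeholder names mirroring my csv functions):

```python
import json

def json_export(dictionary, filename):
    # json.dump serializes the whole dict in one call; no field size limit applies
    with open(filename, "w") as f:
        json.dump(dictionary, f)

def json_import(filename):
    # json.load rebuilds the dict, however long the string values are
    with open(filename) as f:
        return json.load(f)
```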
Here's the error message:

        dictionary = e.csv_import(filename)
      File "D:\Matt\Documents\Projects\Python\Project 17\e.py", line 8, in csv_import
        for key, value in csv.reader(open(filename)):
    _csv.Error: field larger than field limit (131072)
Here's my code:

    import csv

    def csv_import(filename):
        dictionary = {}
        # newline="" is what the csv docs recommend when opening csv files
        with open(filename, newline="") as f:
            for key, value in csv.reader(f):
                dictionary[key] = value
        return dictionary

    def csv_export(dictionary, filename):
        with open(filename, "w", newline="") as f:
            csv_file = csv.writer(f)
            for key, value in dictionary.items():
                csv_file.writerow([key, value])
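For what it's worth, the docs mention csv.field_size_limit(), which raises the reader's per-field limit. If bumping the limit is an acceptable fix, my importer could presumably become something like this (the 10,000,000-character cap is my own guess at a safe upper bound, not anything from the docs):

```python
import csv

def csv_import(filename):
    # Raise the per-field limit before parsing; the default is 131072.
    # Passing sys.maxsize can overflow on some platforms, so a large
    # fixed value seems safer here.
    csv.field_size_limit(10_000_000)  # assumption: 10 MB per field is enough
    dictionary = {}
    with open(filename, newline="") as f:
        for key, value in csv.reader(f):
            dictionary[key] = value
    return dictionary
```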