I keep getting a 'Too many open files' error when doing something like this:
import os
import pandas

# read file names
file_names = []
for file_name in os.listdir(path):
    if '.json' not in file_name:
        continue
    file_names.append(file_name)

# process file names...

# iterate over the files
for file_name in file_names:
    # load file into a DataFrame
    file_path = path + '/' + file_name
    df = pandas.read_json(file_path)

    # process the data, etc...
    # not real var names, just for illustration purposes...
    json_arr_1 = ...
    json_arr_2 = ...

    # save DF1 to a new file
    df_1 = pandas.DataFrame(data=json_arr_1)
    file_name2 = os.getcwd() + '/db/' + folder_name + '/' + file_name
    df_1.to_json(file_name2, orient='records')

    # save DF2 to a new file
    df_2 = pandas.DataFrame(data=json_arr_2)
    file_name3 = os.getcwd() + '/db/other/' + folder_name + '/' + file_name
    df_2.to_json(file_name3, orient='records')
The DataFrame documentation doesn't mention having to open or close files myself, and I don't think os.listdir keeps any file handles open (it should just return a list of strings).
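In case it helps, here is a small Linux-only sketch of how the open-descriptor count could be watched from inside the loop (it counts the entries in /proc/self/fd; the helper name open_fd_count is just for illustration):

import os

def open_fd_count():
    # Linux-only: each entry in /proc/self/fd corresponds to one file
    # descriptor currently open in this process.
    return len(os.listdir('/proc/self/fd'))

# Called once per iteration of the processing loop, e.g.:
#     print(file_name, 'open fds:', open_fd_count())
# A steadily growing number would confirm descriptors are being leaked somewhere.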
Where am I going wrong?