I have 500 folders, each containing a varying number (between 20 and 40) of JSON files. I can extract the contents of each folder manually and individually in Python by following the answers given here and here:
import os, json

path_to_json = 'C:/Users/SomeFolder'
# List all the JSON files in the folder
json_files = [pos_json for pos_json in os.listdir(path_to_json) if pos_json.endswith('.json')]
print(json_files)

for js in json_files:
    with open(os.path.join(path_to_json, js)) as json_file:
        # Can only print the JSON contents to screen; I need help
        # saving them to a tab-delimited file instead.
        print(json.load(json_file))
However, repeating this by hand would be quite laborious and obviously very tiresome, considering it must be done 500 times. A faster, automated approach for extracting the contents of each JSON folder into a tab-delimited file would be most welcome. Thanks.
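One possible sketch of the automated approach: walk the top-level directory, and for each subfolder merge its JSON files into one tab-delimited file via `csv.DictWriter`. This assumes each JSON file holds a single flat object (no nesting); the root path `C:/Users/SomeFolder` and the output naming scheme are placeholders to adjust for your layout.

```python
import csv
import json
from pathlib import Path

def folder_to_tsv(folder, out_path):
    """Merge every *.json file in `folder` into one tab-delimited file."""
    records = []
    for jf in sorted(Path(folder).glob('*.json')):
        with open(jf, encoding='utf-8') as fh:
            records.append(json.load(fh))  # assumes one flat JSON object per file

    # Take the union of keys across all files so every row shares one header;
    # keys missing from a given file are written as empty cells.
    fieldnames = sorted({key for rec in records for key in rec})

    with open(out_path, 'w', newline='', encoding='utf-8') as out:
        writer = csv.DictWriter(out, fieldnames=fieldnames, delimiter='\t')
        writer.writeheader()
        writer.writerows(records)

# Placeholder root directory holding the 500 folders -- point this at yours.
root = Path('C:/Users/SomeFolder')
if root.is_dir():
    for sub in (p for p in root.iterdir() if p.is_dir()):
        # One .tsv per folder, named after the folder itself.
        folder_to_tsv(sub, root / f'{sub.name}.tsv')
```

If the JSON objects are nested rather than flat, you would need to flatten them (e.g. with `pandas.json_normalize`) before writing rows.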