
I have multiple JSON files saved in a folder. I would like to parse each JSON file, flatten it using the flatten library, and save it as a separate JSON file.

I have managed to do this with one JSON file, but I am struggling to parse several JSON files at once and save them without merging the data.

I think I need to create a loop that loads a JSON file, flattens it, and saves it until there are no more JSON files in the folder. Is this possible?

This still seems to parse only one JSON file.

import os
import json

path_to_json = 'json_test/'
for file in [file for file in os.listdir(path_to_json) if file.endswith('.json')]:
    with open(path_to_json + file) as json_file:
        data1 = json.load(json_file)  # data1 only ever holds the most recently loaded file

Any help would be much appreciated thanks!


2 Answers

# Can you try this out?
# https://stackoverflow.com/questions/23520542/issue-with-merging-multiple-json-files-in-python

import glob

read_files = glob.glob("*.json")
with open("merged_file.json", "w") as outfile:
    # join the raw contents of every file into one JSON array
    outfile.write('[{}]'.format(
        ','.join(open(f).read() for f in read_files)))
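
Note that this merges everything into a single output file, while the question asks for a separate flattened file per input. A per-file variant of the same glob approach might look like the sketch below, assuming the "flatten" library in question is the flatten_json package and that each file holds a single JSON object:

import glob
import json

from flatten_json import flatten  # assumption: pip install flatten_json

for path in glob.glob("*.json"):
    with open(path) as infile:
        data = json.load(infile)  # parse one input file
    with open(path + "_flattened.json", "w") as outfile:
        json.dump(flatten(data), outfile, indent=2)  # save its flattened copy as its own file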

On every loop, 'data1' is reassigned to the newly loaded JSON file, so only the last result is kept. Instead, append each result to a list.

import os
import json
# flatten isn't installable here (Python 3.8.3), so the flattening step is left as a comment

path = 'X:test folder/'

file_list = [p for p in os.listdir(path) if p.endswith('.json')]
flattened = []

for file in file_list:
    with open(path + file) as json_file:
        # flatten the json here; can't install flatten from pip on this setup
        flattened.append(json.load(json_file))

# write each result to its own output file, paired with its source filename
for file, flat_json in zip(file_list, flattened):
    with open(path + file + '_flattened.json', 'w') as out_file:
        json.dump(flat_json, out_file, indent=2)
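
If flatten can be installed after all, the step the comment points to could be slotted into the loop roughly like this (a sketch, assuming the flatten_json package and that each file contains a JSON object):

from flatten_json import flatten  # assumption: pip install flatten_json

for file in file_list:
    with open(path + file) as json_file:
        # flatten each file's contents before collecting it
        flattened.append(flatten(json.load(json_file)))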