
This is one of the methods I tried. None of them, including this one, works; they all end up freezing my laptop because the large JSON file can't be read into memory.

import csv
import json

with open("myfile.json", "r") as infile, open("myfile.csv", "w", newline="") as outfile:
    writer = csv.writer(outfile)
    # json.load still reads the whole file at once, which is where it freezes
    for row in json.load(infile):
        writer.writerow(row)
  • Try breaking the file into small chunks. For example, print `Done!` after every 20 rows are written to the CSV. It will take time, but I think it won't freeze your PC. – Dr. Strange Codes Jun 05 '21 at 03:50
  • If your JSON is well structured, you will need to read a record at a time by looking for the split between records, e.g. a closing brace at column 4. If it is not well structured, you would need to parse it a character at a time and count braces. – Martin Evans Jun 06 '21 at 17:20
  • If `jq` is an acceptable alternative, see https://stackoverflow.com/a/49809249/1164295 -- you could split the JSON into chunks, then convert to CSV – Ben Jun 07 '21 at 19:41
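Along the lines of the comments above, here is a pure-stdlib sketch of the record-at-a-time idea, assuming the file holds one top-level JSON array of flat objects (the field names `a` and `b` below are made up for the demo). Instead of `json.loads(infile.read())`, it uses `json.JSONDecoder.raw_decode` to pull one element at a time from a small buffer, so memory use stays bounded by the largest single record rather than the whole file:

```python
import csv
import io
import json

def stream_json_array(fp, chunk_size=65536):
    """Yield items from a top-level JSON array without loading the whole file.

    Reads the file in chunks and uses json.JSONDecoder.raw_decode to peel
    one complete element off the front of the buffer at a time.
    """
    decoder = json.JSONDecoder()
    buf = fp.read(chunk_size).lstrip()
    if not buf.startswith("["):
        raise ValueError("expected a top-level JSON array")
    buf = buf[1:]  # drop the opening bracket
    while True:
        buf = buf.lstrip().lstrip(",").lstrip()
        if buf.startswith("]"):
            return  # reached the closing bracket: done
        try:
            obj, end = decoder.raw_decode(buf)
        except json.JSONDecodeError:
            # Element is split across the chunk boundary: read more and retry
            more = fp.read(chunk_size)
            if not more:
                raise
            buf += more
            continue
        yield obj
        buf = buf[end:]  # discard the consumed text

# Demo on a small in-memory file (stand-in for myfile.json)
sample = '[{"a": 1, "b": 2}, {"a": 3, "b": 4}]'
rows = list(stream_json_array(io.StringIO(sample)))

out = io.StringIO()
writer = csv.DictWriter(out, fieldnames=["a", "b"])
writer.writeheader()
writer.writerows(rows)
```

To use it on the real file, replace the `io.StringIO` demo with `open("myfile.json", "r")` and write to `open("myfile.csv", "w", newline="")`; you could also print a progress marker every N rows, as the first comment suggests.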
