I have a large JSON file which I'm struggling to read and work with in Python. I can, for instance, run json.loads() on the contents, but the process crashes after a while.
There are two existing questions asking basically the same thing:
Reading rather large JSON files
Is there a memory efficient and fast way to load big JSON files?
But these questions are from 2010 and 2012, so I was wondering if there's a newer/better/faster way to do things?
My file has the following format:
import json

with open('../Data/response.json') as f:
    data = json.load(f)

print(data.keys())
# dict_keys(['item', 'version'])
# the actual data lives under data['item']
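
From what I can tell, the older questions point towards incremental parsers like ijson. Here's a minimal sketch of what I think that would look like for my file, assuming data['item'] is a list of records (process() is just a placeholder for whatever I'd do with each record):

import ijson  # third-party package: pip install ijson

with open('../Data/response.json', 'rb') as f:
    # 'item.item' = each element of the array under the top-level 'item' key;
    # records are yielded one at a time instead of loading the whole file into memory
    for record in ijson.items(f, 'item.item'):
        process(record)  # hypothetical per-record handler

Is this still the recommended approach, or is there something better these days?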
Thanks.