I need to loop through a 30 MB JSON file, which is a large file for plain text. The data is a feed from Walmart: https://developer.walmartlabs.com/docs/read/Special_Feeds. I am not sure if anyone is familiar with it. When I simply use
```
data = json.loads(open('file.json').read())
print data
```
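As a side note, `json.loads(open(...).read())` first builds the entire file contents as one string and then builds the parsed objects on top of it; `json.load` parses straight from the file object and skips the intermediate string. A minimal sketch (it writes a tiny stand-in file so it runs on its own; with the real feed you would just open your `file.json`):

```python
import json

# Write a small stand-in file so the snippet is self-contained;
# with the real feed, skip this and open your existing 'file.json'.
with open('file.json', 'w') as f:
    f.write('{"items": [{"name": "a"}, {"name": "b"}]}')

# json.load parses directly from the file object, avoiding an
# extra in-memory copy of the raw text.
with open('file.json') as f:
    data = json.load(f)

print(len(data['items']))
```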
I get this error:

```
Traceback (most recent call last):
  File "/home/python/Desktop/read.py", line 21, in <module>
    data = json.loads(open('rolback.json').read())
  File "/usr/lib/python2.7/json/__init__.py", line 338, in loads
    return _default_decoder.decode(s)
  File "/usr/lib/python2.7/json/decoder.py", line 366, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib/python2.7/json/decoder.py", line 384, in raw_decode
    raise ValueError("No JSON object could be decoded")
ValueError: No JSON object could be decoded
```
I think it is because the file is so large. I have also tried to stream the file, but then I get a MemoryError.
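One thing that may be worth checking before blaming the size: 30 MB is well within what `json` can normally parse, and "No JSON object could be decoded" is also exactly what you get when the bytes are not valid JSON at all, e.g. an empty file, or a feed response that was saved to disk while still gzip-compressed. A hypothetical sketch of that last case (the file name and the gzip assumption are mine, not from the feed):

```python
import gzip
import json

# Simulate a feed download that was saved without being decompressed,
# a common cause of "No JSON object could be decoded".
with gzip.open('sample.json', 'wb') as f:
    f.write(b'{"items": []}')

# Gzip streams always start with the magic bytes 1f 8b.
with open('sample.json', 'rb') as f:
    head = f.read(2)
is_gzip = head == b'\x1f\x8b'
print(is_gzip)

# Decompressing first lets json parse the payload normally.
with gzip.open('sample.json', 'rb') as f:
    data = json.loads(f.read().decode('utf-8'))
print(data)
```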
What options are there, and what is recommended, for dealing with these very large files? Here is a link to a previous question that contains some of the output of the file: python ijson large file loop to get names
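For context, ijson (from the linked question) is the usual recommendation for iterating over a huge file without materialising the whole object graph. As a stdlib-only sketch of the same idea, `json.JSONDecoder.raw_decode` can pull array elements out one at a time; this assumes, hypothetically, a feed whose top level is a JSON array, and uses a made-up `feed.json`:

```python
import json

# Made-up stand-in for the feed; assume a top-level JSON array.
with open('feed.json', 'w') as f:
    f.write('[{"name": "a"}, {"name": "b"}, {"name": "c"}]')

decoder = json.JSONDecoder()
names = []

with open('feed.json') as f:
    buf = f.read()  # for truly huge inputs you would read in chunks

pos = buf.index('[') + 1
while True:
    # Skip whitespace and the commas between array elements.
    while pos < len(buf) and buf[pos] in ' \t\r\n,':
        pos += 1
    if pos >= len(buf) or buf[pos] == ']':
        break
    # raw_decode parses exactly one JSON value starting at pos and
    # returns it together with the index just past that value.
    obj, pos = decoder.raw_decode(buf, pos)
    names.append(obj['name'])

print(names)
```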