I have a large JSON file (2.4 GB) that I want to parse in Python. The data looks like this:
[
  {
    "host": "a.com",
    "ip": "1.2.2.3",
    "port": 8
  },
  {
    "host": "b.com",
    "ip": "2.5.0.4",
    "port": 3
  },
  {
    "host": "c.com",
    "ip": "9.17.6.7",
    "port": 4
  }
]
I run this Python script, parser.py, to load the data for parsing:
import json
from pprint import pprint

with open('mydata.json') as f:
    data = json.load(f)
It fails with this traceback:

Traceback (most recent call last):
  File "parser.py", line xx, in <module>
    data = json.load(f)
  File "/usr/lib/python3.6/json/__init__.py", line 296, in load
    return loads(fp.read(),
MemoryError
1) How can I load and parse a file this large without running into a MemoryError?
2) Are there alternative methods?
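
For context, the direction I was considering is an incremental (streaming) parser such as the third-party ijson package, which yields the elements of the top-level array one at a time instead of reading the whole file into memory. This is only a rough sketch of what I had in mind (it assumes ijson is installed and that the file really is a single top-level JSON array), so please correct me if it is the wrong approach:

import ijson

with open('mydata.json', 'rb') as f:
    # 'item' is ijson's prefix for each element of the top-level array;
    # records are produced one at a time, so memory use stays small
    for record in ijson.items(f, 'item'):
        print(record['host'], record['ip'], record['port'])

If that is not a good idea, I am also open to preprocessing the file first (for example converting it to one JSON object per line) if that is the more standard way to handle data of this size.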