I have a JSON file where the data is stored as a top-level array of records:
[{"count": 3, "articleTag": "muslims,myanmar,viral sach", "articleId": "587044", "deviceId": "0652c917-6241-4f1d-bad2-520ce8df0db5", "articleCategory": "VIDEO"}, {"count": 2, "articleTag": "vyakti vishesh,story,kim jong", "articleId": "587057", "deviceId": "0652c917-6241-4f1d-bad2-520ce8df0db5", "articleCategory": "VIDEO"}]
If I read this file with pd.read_json(filepath, orient='records'), my system hangs. The file size is in the GBs, so I need to read it in chunks. What's the best way to read a file like this in chunks?
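For context, this is roughly the shape of solution I'm after: something that streams records out of the top-level array one at a time instead of loading the whole file. Below is a minimal stdlib-only sketch of that idea using json.JSONDecoder.raw_decode on a growing buffer (iter_json_records, buf_size, and the path are placeholder names of my own, not pandas APIs); I'm asking whether pandas or another library has a better built-in way to do this.

```python
import json
from typing import Iterator

def iter_json_records(path: str, buf_size: int = 1 << 20) -> Iterator[dict]:
    """Yield records from a top-level JSON array one at a time,
    reading the file in buf_size chunks instead of all at once."""
    decoder = json.JSONDecoder()
    buf = ""
    with open(path, "r", encoding="utf-8") as f:
        # Read until we find the opening '[' of the array.
        while True:
            chunk = f.read(buf_size)
            if not chunk:
                return
            buf += chunk
            start = buf.find("[")
            if start != -1:
                buf = buf[start + 1:]
                break
        while True:
            # Skip whitespace and the comma between records.
            buf = buf.lstrip().lstrip(",").lstrip()
            if buf.startswith("]"):
                return  # end of the array
            try:
                # Parse one complete record from the front of the buffer.
                obj, end = decoder.raw_decode(buf)
            except json.JSONDecodeError:
                # Record is incomplete; pull in more of the file.
                chunk = f.read(buf_size)
                if not chunk:
                    return
                buf += chunk
                continue
            yield obj
            buf = buf[end:]
```

With this I can batch records into a DataFrame a few thousand at a time, but it feels like something pandas should handle for me.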