
I have a JSON file where the data is stored as an array of records:

```json
[{"count": 3, "articleTag": "muslims,myanmar,viral sach", "articleId": "587044", "deviceId": "0652c917-6241-4f1d-bad2-520ce8df0db5", "articleCategory": "VIDEO"}, {"count": 2, "articleTag": "vyakti vishesh,story,kim jong", "articleId": "587057", "deviceId": "0652c917-6241-4f1d-bad2-520ce8df0db5", "articleCategory": "VIDEO"}]
```

If I read this file with `pd.read_json(filepath, orient='records')`, my system hangs. The file size will be in the gigabytes, so what's the best way to read the file in chunks?
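Something like the following is the kind of chunked reader I'm after (a rough sketch only, not code I've verified; it assumes the third-party `ijson` package, and `read_json_in_chunks` / `chunk_size` are just illustrative names):

```python
# Rough sketch. Assumes the third-party `ijson` package is installed
# (pip install ijson); `read_json_in_chunks` is an illustrative name,
# not an existing API.
import ijson
import pandas as pd

def read_json_in_chunks(filepath, chunk_size=10000):
    """Yield DataFrames of at most `chunk_size` records each."""
    with open(filepath, 'rb') as f:
        batch = []
        # The 'item' prefix addresses each element of the top-level
        # JSON array, so the file is parsed as a stream instead of
        # being loaded into memory all at once.
        for record in ijson.items(f, 'item'):
            batch.append(record)
            if len(batch) >= chunk_size:
                yield pd.DataFrame(batch)
                batch = []
        if batch:  # flush the final partial batch
            yield pd.DataFrame(batch)

for chunk in read_json_in_chunks('data.json'):
    print(chunk.shape)  # placeholder for whatever per-chunk work is needed
```

As far as I can tell, the `chunksize` argument of `pd.read_json` only works with `lines=True` (newline-delimited JSON), and my file is one big array, which is why plain `pd.read_json` tries to load everything at once.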

Seema Mudgil
  • How much memory does your system have? How large is the largest file? – John Zwinck Oct 14 '17 at 05:56
  • 2
    Possible duplicate of [Reading rather large json files in Python](https://stackoverflow.com/questions/10382253/reading-rather-large-json-files-in-python) – Taku Oct 14 '17 at 05:58

0 Answers