
I tried to use the code below to import JSON data into Python. The file is 355 MB.

import json
CATALOG = json.loads(open('Myfile.json').read())

I got the error message below:

File "D:.....py", line 8, in <module>
    CATALOG = json.loads(open('Myfile.json').read())
File "C:\Python27\lib\json\__init__.py", line 339, in loads
    return _default_decoder.decode(s)
File "C:\Python27\lib\json\decoder.py", line 364, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "C:\Python27\lib\json\decoder.py", line 380, in raw_decode
    obj, end = self.scan_once(s, idx)
MemoryError

I downsized the file to 100 MB, and it imported successfully. Is there a size limit for Python's json module? Or can I raise the limit?

Thanks,

Jen
  • It's how much memory is available on your machine. Python has a significant overhead for small objects like those typically found in json files. `pandas` has less overhead. I don't know whether you can fiddle with the json, but its `read_json` method may work. – tdelaney Apr 05 '18 at 20:02
  • The way you do it, `read` and then `loads`, means that the full 355 MB file contents reside in memory. You'd be better off with `json.load(open('myfile.json'))` – tdelaney Apr 05 '18 at 20:04
  • Possible duplicate, including suggestions such as `ijson`: https://stackoverflow.com/questions/10382253/reading-rather-large-json-files-in-python – tdelaney Apr 05 '18 at 20:06
  • @tdelaney `json.load(fp, ...)` is just a wrapper around `json.loads(fp.read(), ...)` anyway. – chepner Apr 05 '18 at 20:09
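To expand on the comments: since `json.load` still reads the whole file under the hood, a file that doesn't fit in memory needs incremental parsing — either a streaming parser such as `ijson` (see the linked question) or, if the data can be exported as JSON Lines (one object per line), plain `json.loads` applied per line. A minimal sketch of the latter approach, using a small throwaway sample file in place of `Myfile.json` (the file name and record layout here are illustrative, not from the question):

```python
import json

# Write a small JSON Lines sample (one JSON object per line) to stand in
# for a large export; the real Myfile.json may need converting first.
with open('sample.jsonl', 'w') as f:
    for i in range(3):
        f.write(json.dumps({'id': i}) + '\n')

# Parse one record at a time: only the current line is held in memory,
# unlike json.loads(open(...).read()), which holds the entire file as one
# string alongside the parsed result.
ids = []
with open('sample.jsonl') as f:
    for line in f:
        record = json.loads(line)
        ids.append(record['id'])

print(ids)  # [0, 1, 2]
```

The same loop shape works with `ijson.items(f, 'item')` over a normal JSON array, if converting the file isn't an option.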

0 Answers