
I have a JSON file saved on disk that is 6.1 GB. My requirement is to keep that dictionary in memory so that I can replace keys with values in another file.

But I end up getting a MemoryError every time I try this step. Can anybody help me load the dictionary without getting a memory error?

Code is as follows:

import json

with open('file_name.json') as f:
    dictionary = json.load(f)
rk_acumen
  • 6Gb of JSON data on disk doesn't mean 6Gb of Python objects in memory. – jonrsharpe Sep 21 '20 at 10:22
  • [this](https://stackoverflow.com/questions/10382253/reading-rather-large-json-files-in-python) should help you – Yash Sep 21 '20 at 10:23
  • You have multiple overheads applied in this case: 1/ you have 6Gb of JSON in an ASCII text file, which, depending on the internal structure of Python dictionaries, can take more than 6Gb of RAM (as @jonrsharpe said). 2/ the OS and other software also consume RAM, so you can't count on having 16Gb of usable RAM. 3/ if Python requires the memory for your dictionary to be contiguous, it might be difficult for your system to provide such a big contiguous chunk of RAM. – A. Gille Sep 21 '20 at 12:14
  • If the amount of input data is high and random access to it is required, I would suggest other approaches: SQLite-based processing, loading only partial data from disk, caching a subset of your data instead of loading it all, etc. (see the sketches below) – A. Gille Sep 21 '20 at 12:18
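
Following the streaming suggestion in the comments, here is a minimal sketch using the third-party ijson library (pip install ijson). It assumes the top level of the file is a single JSON object mapping keys to replacement values, and that only a subset of keys is actually needed; the key names below are hypothetical:

import ijson

# Hypothetical: the keys actually needed for the replacement step.
needed = {'key1', 'key2'}

subset = {}
with open('file_name.json', 'rb') as f:
    # ijson.kvitems(f, '') streams the (key, value) pairs of the
    # top-level JSON object one at a time instead of materialising
    # the whole 6.1 GB dictionary in memory.
    for key, value in ijson.kvitems(f, ''):
        if key in needed:
            subset[key] = value

This keeps only the pairs you need in memory; the rest of the file is parsed and discarded as it streams past.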
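
If you need random access to all keys rather than a known subset, the SQLite suggestion could look like the sketch below: convert the JSON once into an on-disk database, then look keys up as needed with almost no RAM. The database file name and the looked-up key are illustrative, and ijson is again assumed for the one-off conversion:

import json
import sqlite3

import ijson

con = sqlite3.connect('dictionary.db')
con.execute('CREATE TABLE IF NOT EXISTS kv (key TEXT PRIMARY KEY, value TEXT)')

# One-off conversion: stream pairs from the JSON file into SQLite.
with open('file_name.json', 'rb') as f:
    con.executemany(
        'INSERT OR REPLACE INTO kv VALUES (?, ?)',
        ((key, json.dumps(value)) for key, value in ijson.kvitems(f, '')),
    )
con.commit()

# Later: fetch single values without holding the dictionary in memory.
# 'some_key' is a placeholder for a key from your other file.
row = con.execute('SELECT value FROM kv WHERE key = ?', ('some_key',)).fetchone()
value = json.loads(row[0]) if row is not None else None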

1 Answer


Your computer won't give a process access to an unlimited amount of RAM. You should check on your system what the memory limit is for your Python process.
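
As a minimal sketch of how to check that limit, assuming a Unix-like system where the standard-library resource module is available:

import resource

# RLIMIT_AS caps the total address space the process may use;
# resource.RLIM_INFINITY means no limit is set.
soft, hard = resource.getrlimit(resource.RLIMIT_AS)
print('soft limit:', soft)
print('hard limit:', hard)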

Jao