
I have a large JSON file. I need to estimate how much memory a machine needs in order to open the file. Here is the operation that I need to do:

import json

f = open('mybigfile.txt')
data = json.loads(f.read())

I could call `sys.getsizeof()` once I have the object. However, since I don't (yet) have enough memory to load `data`, how would I estimate this?
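One way to get a ballpark figure (not from the original post, and CPython-specific): parse a small representative sample of the same kind of JSON, measure its in-memory footprint with a recursive `sys.getsizeof()` helper, and extrapolate by the ratio of memory bytes to on-disk bytes. The sample data and the 10 GB figure below are hypothetical.

```python
import json
import sys

def deep_getsizeof(obj, seen=None):
    """Recursively sum sys.getsizeof over containers (CPython-specific;
    sys.getsizeof alone does not follow references)."""
    if seen is None:
        seen = set()
    if id(obj) in seen:          # avoid double-counting shared objects
        return 0
    seen.add(id(obj))
    size = sys.getsizeof(obj)
    if isinstance(obj, dict):
        size += sum(deep_getsizeof(k, seen) + deep_getsizeof(v, seen)
                    for k, v in obj.items())
    elif isinstance(obj, (list, tuple, set, frozenset)):
        size += sum(deep_getsizeof(item, seen) for item in obj)
    return size

# Hypothetical sample shaped like the real data.
sample_text = json.dumps([{"id": i, "name": "user%d" % i} for i in range(1000)])
sample_obj = json.loads(sample_text)

# In-memory bytes per on-disk byte for this kind of JSON.
ratio = deep_getsizeof(sample_obj) / len(sample_text)

full_file_bytes = 10 * 1024**3          # hypothetical 10 GB file on disk
estimated_ram = full_file_bytes * ratio
```

The ratio depends heavily on the JSON's shape (many small dicts inflate far more than a few long strings), so the sample should mirror the real file's structure.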

David542
  • 3
    It may technically be possible to estimate this, but doing so will require parsing the JSON (probably can't use `json`'s parser for this) and making numerous assumptions about CPython implementation details. This seems to be an [XY problem](http://meta.stackexchange.com/q/66377). What's are you actually trying to achieve? –  Jan 30 '15 at 22:23
  • @David542 You could use the size of the file as an estimate: http://stackoverflow.com/questions/2104080/how-to-check-file-size-in-python – quamrana Jan 30 '15 at 22:24
  • 2
    _"I need to estimate the size of the machine I need to open the file."_ is your JSON file really on the several GB range ? Or are you working on very constrained embedded system ? Can't you `mmap()` the file instead of doing a plain `read()` -- broadly speaking that would decrease the required RAM by (more than) 50%... – Sylvain Leroux Jan 30 '15 at 22:32
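A complementary way to measure the ratio the comments are circling around (my own sketch, not from the thread): use the standard-library `tracemalloc` module to record how much memory `json.loads()` actually allocates for a small representative sample, then scale up to the real file's on-disk size. The sample data below is hypothetical.

```python
import json
import tracemalloc

# Hypothetical sample shaped like the real file's contents.
sample_text = json.dumps([{"id": i, "value": i * 0.5} for i in range(10_000)])

tracemalloc.start()
data = json.loads(sample_text)
current, peak = tracemalloc.get_traced_memory()
tracemalloc.stop()

# In-memory bytes still held per on-disk byte of JSON text.
ratio = current / len(sample_text)
```

Multiplying `ratio` by the big file's size (`os.path.getsize('mybigfile.txt')`) gives a memory estimate; `peak` additionally captures the parser's transient overhead, which matters because `json.loads(f.read())` holds the full document text and the parsed objects in memory at the same time.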

0 Answers