
So clearly there cannot be unlimited memory in Python. I am writing a script that creates lists of dictionaries, where each list holds between 250K and 6M objects (there are 6 lists).

Is there a way to actually calculate (possibly based on the RAM of the machine) the maximum memory and the memory required to run the script?

The actual issue I came across:

While running one of the scripts that populates a list with 1.25-1.5 million dictionaries, when it hits 1.227... it simply stops, but returns no error, not even a MemoryError. So I am not even sure this is a memory limit. I have print statements so I can watch what is going on, and it seems to buffer forever: nothing prints, even though up until that specific section the code was running a couple thousand lines per second. Any ideas as to what is making it stop? Is this memory or something else?

Ryan Saxe
    You'd have to look at what objects you store in each list, and how much space it takes. `sys.getsizeof` should get you started – inspectorG4dget Jan 13 '14 at 22:15
  • each object in this specific list returns `72` from `sys.getsizeof(object)` – Ryan Saxe Jan 13 '14 at 22:18
  • @inspectorG4dget how can I use this information to determine why this is stopping. I have 4GB of RAM if that makes any difference. – Ryan Saxe Jan 13 '14 at 22:19
  • 1
    @RyanSaxe: What OS are you using? How much memory is still available to processes? – Martijn Pieters Jan 13 '14 at 22:23
  • If the dictionaries all have the same keys, you can save a lot of memory by using namedtuples – John La Rooy Jan 13 '14 at 22:25
  • 2
    @RyanSaxe: the 72 bytes is **just** the dict object, not it's contents. You'll have to get the size of the keys and the values too. – Martijn Pieters Jan 13 '14 at 22:26
  • If it runs using a 64-bit version of Python, then yeah, it's a memory problem. – kindall Jan 13 '14 at 23:19
  • What exactly does "it simply stops" mean? That the process terminates? Or merely that it stops making visible *progress*? The latter is very common when you're pushing your RAM limit, as the OS spends huge gobs of time swapping out other programs to disk (to free up a little more RAM). This *can* go on for - literally - hours before finally getting a `MemoryError` (although that's unusual, it's certainly possible). – Tim Peters Jan 14 '14 at 02:50
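John La Rooy's `namedtuple` suggestion in the comments above can be illustrated with a quick comparison (a rough sketch; the field names and values are made up, and exact byte counts vary by Python version):

```python
import sys
from collections import namedtuple

# A dict with three keys vs an equivalent namedtuple.
record_dict = {"id": 1, "name": "a", "score": 2.5}

Record = namedtuple("Record", ["id", "name", "score"])
record_nt = Record(id=1, name="a", score=2.5)

# The namedtuple instance is a plain tuple under the hood, so it avoids
# the per-instance hash table a dict carries. Note that, as Martijn
# points out, neither number includes the referenced keys/values.
print(sys.getsizeof(record_dict))
print(sys.getsizeof(record_nt))
```

Multiplied by millions of objects, that per-instance saving is substantial.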

1 Answer


If you have that many objects to store, you need to store them on disk, not in memory. Consider using a database.
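For example, the standard library's `sqlite3` module lets you stream the dictionaries into an on-disk table instead of holding millions of them in RAM (a minimal sketch; the file name, table, and columns are illustrative):

```python
import sqlite3

# Open (or create) an on-disk database file.
conn = sqlite3.connect("records.db")
conn.execute("CREATE TABLE IF NOT EXISTS records (id INTEGER, name TEXT)")

# A generator yields one dict at a time, so the full data set
# never sits in memory; executemany consumes it row by row.
rows = ({"id": i, "name": "item%d" % i} for i in range(100_000))
conn.executemany(
    "INSERT INTO records (id, name) VALUES (:id, :name)",
    rows,
)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM records").fetchone()[0]
print(count)
conn.close()
```

You can then query slices of the data as needed rather than iterating over giant in-memory lists.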

If you import the `sys` module you will have access to `sys.getsizeof()`. You will have to look at each object in the list and, for each dictionary, add up the size of every key and value, since `getsizeof` only reports the container itself. For more on this see this previous question - In-memory size of a Python structure.
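A rough sketch of that bookkeeping (a simplified recursive walk; it only descends into the common container types and counts shared objects once):

```python
import sys

def total_size(obj, seen=None):
    """Recursively sum sys.getsizeof over an object and its contents.

    `seen` tracks object ids so shared/aliased objects are counted once.
    Only dicts, lists, tuples, and sets are descended into here.
    """
    if seen is None:
        seen = set()
    obj_id = id(obj)
    if obj_id in seen:
        return 0
    seen.add(obj_id)

    size = sys.getsizeof(obj)
    if isinstance(obj, dict):
        size += sum(total_size(k, seen) + total_size(v, seen)
                    for k, v in obj.items())
    elif isinstance(obj, (list, tuple, set, frozenset)):
        size += sum(total_size(item, seen) for item in obj)
    return size

# The real footprint of a list of dicts is far more than
# len(list) * 72 bytes, because keys and values count too.
records = [{"id": i, "name": "x" * 5} for i in range(1000)]
print(total_size(records))
```

Running this over one of the six lists (or a representative slice, then scaling) gives a much more realistic estimate of the script's memory requirement than the flat 72 bytes per dict.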

RyPeck