
I have a general question about how Python works. I run Python 2.7 from IPython (inside the Spyder IDE) on Ubuntu (and with Anaconda on Windows, where I see the same behavior).
I have made a tool which runs over thousands of files.
Even when deleting all variables at the end of my for loops (I've double-checked; there are no huge lists or anything like that), and even after clearing all variables at the end of my computations, the memory stays fully used by Python. I must restart the kernel to free the memory.

At worst, with lots of files to process, once the RAM is full it spills into swap, and once swap is full the system freezes completely.

Is there any way to see how much memory each variable takes?
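
For illustration, a minimal sketch of one way this could be inspected, assuming the standard-library `sys.getsizeof` (which reports only an object's shallow size, not the memory held by the objects it references; the third-party Pympler package provides `asizeof` for a deep estimate):

```python
import sys

data = [0] * 1000000
nested = [data, data]

print(sys.getsizeof(data))    # size of the list object and its pointer array
print(sys.getsizeof(nested))  # tiny: just two pointers, not the million elements
```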

I wonder how Python manages memory "in the background".
I'm clearly not used to this kind of thing. Any clues are appreciated.

swiss_knight
  • [Force garbage collection in Python to free memory](https://stackoverflow.com/questions/32167386/force-garbage-collection-in-python-to-free-memory) – Jay Jun 18 '17 at 10:07
  • Depending on your problem, you can iterate over your files lazily (i.e. only keeping a minimal amount of data in memory – e.g. [this answer](https://stackoverflow.com/a/44607850/4954037); see the generator sketch after these comments). Why not show what your problem really is? – hiro protagonist Jun 18 '17 at 10:07
  • Are you very sure you are closing those files properly? – BoarGules Jun 18 '17 at 10:46
  • Yes. What I really don't understand is that even after running `reset` in the Python environment within Spyder – it asks whether I'm sure I want to delete all variables in scope, and I answer 'Yes' – the memory is still fully used. I must restart the kernel within Spyder (i.e. the Python kernel) to free all the memory used by Python. – swiss_knight Jun 18 '17 at 11:01
  • `import gc` and adding `gc.collect()` after deleting all my variables at the end of my for loop seems to lighten the problem (sketched below)! – swiss_knight Jun 18 '17 at 12:31
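
For context on the lazy-iteration suggestion above, a minimal sketch of processing files one at a time through a generator, so only one file's data is ever held in memory (the file names and the parsing step here are hypothetical placeholders):

```python
def parsed_files(paths):
    """Yield the contents of each file, one at a time."""
    for path in paths:
        with open(path) as f:       # closes the file even on error
            yield path, f.read()    # placeholder for the real parsing

# Only one file's contents is alive per iteration; the previous one
# becomes garbage as soon as the loop variables are rebound.
for path, contents in parsed_files(["a.txt", "b.txt"]):  # stand-in list
    pass  # do the real per-file work here
```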
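And a sketch of the `gc.collect()` pattern from the last comment, under the assumption that the per-file objects participate in reference cycles (otherwise CPython frees them as soon as their reference count drops to zero; `gc.collect()` only hastens the collection of cyclic garbage):

```python
import gc

for path in ["a.txt", "b.txt"]:   # stand-in for the real file list
    with open(path) as f:
        data = f.readlines()      # placeholder for the real per-file work
    # ... process data ...
    del data        # drop the last reference to the per-file objects
    gc.collect()    # reclaim any cyclic garbage before the next file
```

Note too that even after objects are freed, CPython may not return that memory to the operating system, which is one reason the process can still appear to hold all the RAM.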

0 Answers