I have a general question about how Python works.
I run Python 2.7 from IPython (inside the Spyder IDE) on Ubuntu (and with Anaconda on Windows; I see the same behavior on both).
I have built a tool that processes thousands of files.
Even though I delete all variables at the end of my for loops (I've double-checked; there are no huge lists or anything like that), and even after clearing all variables at the end of my computations, the memory stays fully used by Python. I have to restart the kernel to free it.
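Here is a simplified sketch of my loop (`load_file`, `process`, and `save_result` are placeholders for my real functions):

```python
import gc

for path in file_list:
    data = load_file(path)    # placeholder: reads one file into memory
    result = process(data)    # placeholder: the actual computation
    save_result(result)       # placeholder: writes the output to disk
    del data, result          # drop the references at the end of each iteration
    gc.collect()              # force a garbage-collection pass, just in case
```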
In the worst case, with lots of files to process, once the RAM is full it spills into swap, and once swap is full the system freezes completely.
Is there any way to see how much memory each variable takes?
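For instance, `sys.getsizeof` only reports the shallow size of an object, not everything it references, so it does not seem to be enough on its own:

```python
import sys

big_list = list(range(1000000))
print(sys.getsizeof(big_list))     # size of the list object itself, in bytes
print(sys.getsizeof(big_list[0]))  # each element is a separate object with its own size
```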
I would also like to understand how Python manages memory "in the background".
I'm clearly not used to this kind of thing. Any clues are appreciated.