I have a large body of code that uses some custom classes and a lot of dictionaries. Many of the classes get dictionaries added as attributes. I am finding that it uses too much memory, especially when I am looping, even when I manually delete some of the classes and dicts.
What I fear is happening is that the dictionaries are getting deleted, but the objects they contain persist. I need to refactor the code for better memory management, but as a quick fix I was hoping I could recursively and aggressively delete the dictionaries. How would this be achieved?
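Something like this is roughly what I have in mind, as an untested sketch (deep_clear is just a name I made up, and it doesn't guard against cyclic references):

def deep_clear(d):
    # clear any nested dicts first, then drop all entries of this one
    for value in d.values():
        if isinstance(value, dict):
            deep_clear(value)
    d.clear()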
Here is an example...
import numpy as np

def get_lrg():
    # one entry holding a ~800 MB array of zeros
    return {1: np.zeros((1000, 1000, 100))}

class H:
    def add_lrg(self):
        fd = get_lrg()
        self.lrg = fd
# create four instances at module scope, each holding a large dict
for name in ['a', 'b', 'c', 'd']:
    exec('{0} = H()'.format(name))
    exec('{0}.add_lrg()'.format(name))

del a
del b
del c
del d
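After the del statements, I suppose I could check whether the big arrays were actually deallocated with something like this (an untested weakref sketch):

import weakref

x = H()
x.add_lrg()
ref = weakref.ref(x.lrg[1])  # weak reference to the big array inside the dict
del x
print(ref())  # prints None if the array was really deallocated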
Also, try playing around in IPython with this:
fd = get_lrg()
fd2 = get_lrg()
F = {1: fd, 2: fd2}
F = {}               # rebind F, dropping the first dict
F = {1: fd, 2: fd2}
del F[1]             # remove one entry
del F                # delete the dict itself
and watch the memory usage of the Python process. It doesn't appear to release the memory even after the dictionary F has been deleted (that is, when, as far as I can tell, no references to the objects remain). What I find on my machine is that the results are unpredictable: sometimes the memory does seem to get flushed, other times it appears to stay in use.
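For anyone reproducing this, something like the following should show the effect from inside the process (a sketch assuming the third-party psutil package is installed; rss_mb is just a helper name I made up):

import gc
import os

import psutil  # third-party, assuming it is available

_proc = psutil.Process(os.getpid())

def rss_mb():
    # resident set size of this process, in MB
    return _proc.memory_info().rss / 1e6

print(rss_mb())
fd = get_lrg()
print(rss_mb())  # jumps by roughly the size of the array
del fd
gc.collect()     # force a full collection pass, just in case
print(rss_mb())  # does not reliably drop back down on my machine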