I am looking for Python memory debugging techniques. What tools are available for Python, and what data can I look at when a Python process is using a lot of memory? I am aiming to isolate such a memory-hungry process and dump statistics about its usage.
1 Answer
Take a look at guppy; its heapy component lets you inspect what is on the live Python heap:
from guppy import hpy

h = hpy()
L = [1, 2, 3]   # allocate something so there is data on the heap
h.heap()        # in an interactive session this prints the breakdown below
> Partition of a set of 89849 objects. Total size = 12530016 bytes.
>  Index  Count   %     Size   % Cumulative  % Kind (class / dict of class)
>      0  40337  45  3638400  29    3638400  29 str
>      1  21681  24  1874216  15    5512616  44 tuple
>      2   1435   2  1262344  10    6774960  54 dict (no owner)
This will go a long way towards telling you where lots of memory is being used.
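If one row of that table dominates, heapy can drill into it. Below is a minimal sketch (setrelheap and byrcs are part of heapy's API; the data list is just a made-up stand-in for your own allocations):

from guppy import hpy

h = hpy()
h.setrelheap()                    # only count objects allocated after this point

data = [str(i) * 100 for i in range(10000)]   # stand-in for the real workload

heap = h.heap()
print(heap)                       # overall breakdown of the newly allocated objects
biggest = heap[0]                 # row 0 = the kind using the most memory
print(biggest.byrcs)              # re-partition that row by what refers to it

byrcs groups the dominant objects by the kind of object referring to them, which is usually enough to point at the data structure that is holding everything alive.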
Python frees memory when an object is no longer reachable, i.e. nothing in the program points to it any more. Sometimes you might be wondering what on earth is still pointing to a particular object. The standard library's gc module can answer that: gc.get_referrers(obj) returns the objects that are still referring to obj.
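For example, to see everything that still refers to a list you expected to be gone (the variable names here are just illustrative):

import gc

leaked = [1, 2, 3]
holder = {'mylist': leaked}       # something that keeps the list alive

# gc.get_referrers returns every object that directly refers to `leaked`,
# including the dict of module globals, so expect some noise in the output.
for ref in gc.get_referrers(leaked):
    print(type(ref))

Here the holder dict (and the module's globals) will show up as referrers; in a real leak this points you at whichever cache or container forgot to drop the object.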