Using sys.getsizeof(), I compared the memory sizes of lists containing similar data in different data structures. As the a, b, c snippet further below shows, the three lists all report the same size even though their contents should use different amounts of memory, per this post.

I also noticed that even if I put an enormous object into a list, the result of sys.getsizeof() stays the same, smaller than the object the list contains (see the example right below). It looks as though sys.getsizeof() only counts the size of the list object itself, while its contents are stored elsewhere.
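For example, a list holding one megabyte-sized string reports essentially the same size as a list holding one small integer (the exact byte counts vary by CPython version and platform):

>>> from sys import getsizeof as g
>>> huge = 'x' * 10**6
>>> g(huge) > 10**6         # the string itself is over a megabyte
True
>>> g([huge]) == g([1])     # yet both lists just hold one 8-byte pointer
True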

Is there a way to recursively walk an object and get its true total size? Something like the sketch after the snippet below is what I have in mind.

>>> from sys import getsizeof as g
>>> a, b, c, d
([1, 2, 3], ['A', 'C', 'CA'], [(1, 'a'), (2, 'c'), (3, 'ca')], {1: 'a', 2: 'c', 3: 'ca'})
>>> g(a), g(b), g(c), g(d)
(80, 80, 80, 232)
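The kind of recursion I mean is sketched below; it only handles the built-in containers and counts each shared object once, but I don't know whether there is a more robust, ready-made approach:

from sys import getsizeof

def total_size(obj, seen=None):
    # Rough recursive estimate (a sketch): the shallow size of obj plus
    # the sizes of everything it references, counting shared objects once.
    if seen is None:
        seen = set()
    if id(obj) in seen:
        return 0
    seen.add(id(obj))
    size = getsizeof(obj)
    if isinstance(obj, dict):
        size += sum(total_size(k, seen) + total_size(v, seen)
                    for k, v in obj.items())
    elif isinstance(obj, (list, tuple, set, frozenset)):
        size += sum(total_size(item, seen) for item in obj)
    return size

With this, total_size(b) includes the three strings inside b on top of the 80-byte list header that getsizeof(b) reports.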
– mysl
Because `sys.getsizeof` gives you the size of *only the list object* and not the other objects it has references to. Note, the underlying buffer is an array of PyObject pointers, so 8 bytes per item, plus some padding for efficient appends, plus the standard python object header overhead. – juanpa.arrivillaga Jan 10 '20 at 19:35
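A quick way to see that over-allocation (a sketch; the exact sizes printed depend on the CPython version):

import sys

lst = []
print(len(lst), sys.getsizeof(lst))   # bare list object, empty pointer array
for i in range(6):
    lst.append(i)
    # the size jumps by several 8-byte slots at a time, not one per append,
    # because CPython over-allocates the buffer to keep appends O(1)
    print(len(lst), sys.getsizeof(lst))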
