
I know that `getsizeof()` is not a reliable indicator of an object's total memory footprint, because as the documentation says:

"Only the memory consumption directly attributed to the object is accounted for, not the memory consumption of objects it refers to."

So for a list, it only counts the list's internal array of pointers; it doesn't tell you how much RAM is consumed by the objects those pointers refer to. How can I see how much memory is used by those?

NB: Ultimately, I am trying to compare the memory used by NumPy arrays and lists.
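
For concreteness, here is a minimal sketch of the discrepancy (assuming CPython on a 64-bit build, where list pointers are 8 bytes and `np.arange` defaults to a 64-bit integer dtype):

```python
import sys
import numpy as np

lst = list(range(1_000_000))
arr = np.arange(1_000_000)

# The list's reported size covers only its internal array of pointers
# (8 bytes each on 64-bit CPython), not the int objects they point to.
print(sys.getsizeof(lst))   # ~8,000,056 bytes: header + pointers only

# Recent NumPy versions implement __sizeof__ to include the data buffer
# when the array owns it; .nbytes reports the raw buffer alone.
print(sys.getsizeof(arr))   # header + ~8,000,000-byte buffer
print(arr.nbytes)           # 8,000,000
```

The list figure is missing the memory of the million `int` objects themselves, which is exactly the part `getsizeof()` refuses to count.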

  • Also, a `numpy.ndarray` will almost always take *significantly* less memory than a list, especially on a 64-bit architecture. – juanpa.arrivillaga Dec 06 '17 at 09:31
  • If you want to do a complete comparison, I would suggest using an external tool to profile the execution of two pieces of code, one with lists, the other with NumPy arrays. Example: pmap (https://stackoverflow.com/a/2816070/1860929). – Anshul Goyal Dec 06 '17 at 09:31
  • @mu無 well, you can guess-timate fairly accurately if you are working with `numpy.ndarray` objects with your usual numeric types (or even struct types), same with `list` objects if they hold `float`/`int` objects. If you want to estimate the size of a very irregular, heterogeneous list (e.g. something deserialized from a JSON response), then it becomes much harder, and profiling is probably the way to go. – juanpa.arrivillaga Dec 06 '17 at 09:44
  • @juanpa.arrivillaga I know that well and this is precisely what I want to prove. And it seems I cannot prove it using `getsizeof()`. From the question you marked mine as a duplicate of, they indeed say the same thing as I said, i.e. that `getsizeof()` is not sufficient ("Only the memory consumption directly attributed to the object is accounted for, not the memory consumption of objects it refers to."). But the accepted answer actually uses `getsizeof()`... – astudentofmaths Dec 06 '17 at 09:50
  • 1
    @Arthurim it uses the [recursive sizeof recipe](https://code.activestate.com/recipes/577504/) to find out exactly, but also shows how you can estimate it without it when working with simple types, like `int`, and `str` and `tuple`s using `sys.getsizeof`. – juanpa.arrivillaga Dec 06 '17 at 09:53
  • You don't actually do it for lists. You wrote *"So, just for the container a list of size 1,000,000 will take up roughly 8 million bytes, or 8 megabytes. Building a list with 1000000 entries bears that out: [...]. The extra memory is accounted for by the overhead of a python object, and the extra space that the underlying array leaves at the end to allow for efficient .append operations."* But your computations for the list yield the 8 MB for the container only, not the memory used in total... – astudentofmaths Dec 06 '17 at 10:27
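
For reference, here is a condensed sketch of the recursive-sizeof recipe linked in the comments above; the `total_size` name and the particular set of container handlers are illustrative choices, not the recipe verbatim:

```python
import sys
from itertools import chain
from collections import deque

def total_size(obj, extra_handlers=None):
    """Approximate the deep memory footprint of obj by recursing into
    common container types, counting each object at most once."""
    handlers = {
        tuple: iter,
        list: iter,
        deque: iter,
        set: iter,
        frozenset: iter,
        dict: lambda d: chain.from_iterable(d.items()),  # keys and values
    }
    if extra_handlers:
        handlers.update(extra_handlers)
    seen = set()  # ids of objects already counted (handles shared refs and cycles)

    def sizeof(o):
        if id(o) in seen:
            return 0
        seen.add(id(o))
        size = sys.getsizeof(o)
        for container_type, get_children in handlers.items():
            if isinstance(o, container_type):
                size += sum(map(sizeof, get_children(o)))
                break
        return size

    return sizeof(obj)

# Comparison: a list holds a pointer per entry plus one int object per
# distinct value, while a NumPy array packs the values into one buffer.
lst = list(range(257, 1_000_000))  # start above CPython's small-int cache
print(total_size(lst))             # ~36 MB: pointers + ~28 bytes per int object
print(sys.getsizeof(lst))          # ~8 MB: pointers only
```

On this example the deep size is roughly four to five times what `getsizeof()` alone reports, which is the gap the comments above are arguing about.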

0 Answers