
I can't figure out why function f1 uses the same amount of memory as function f2:

from memory_profiler import memory_usage

class BigObject(object):
    def __init__(self):
        self.value = "a"*1000000

a = []
def p1(n):
    # Reuse one shared list: push a BigObject per call, pop on unwind.
    if n == 0:
        return
    a.append(BigObject())
    p1(n-1)
    a.pop()

def p2(n, p):
    # Build a new list on every call via concatenation.
    if n == 0:
        return
    p2(n-1, p + [BigObject()])

def f1():
    p1(200)
def f2():
    p2(200, [])


mem_usage = memory_usage(f1)
print('Memory usage (in chunks of .1 seconds): %s' % mem_usage)
print('Maximum memory usage of f1: %s' % max(mem_usage))
mem_usage = memory_usage(f2)
print('Memory usage (in chunks of .1 seconds): %s' % mem_usage)
print('Maximum memory usage of f2: %s' % max(mem_usage))

Output:

Memory usage (in chunks of .1 seconds): [28.2421875, 28.55078125, 57.859375, 88.734375, 119.375, 150.7890625, 182.2109375, 213.62890625, 64.45703125, 28.2421875]
Maximum memory usage of f1: 213.62890625
Memory usage (in chunks of .1 seconds): [152.25390625, 152.25390625, 152.25390625, 152.25390625, 152.25390625, 152.25390625, 152.25390625, 177.328125, 209.73046875, 151.296875]
Maximum memory usage of f2: 209.73046875

My thought was that since p2 continually builds temporary lists (the "+" operator creates a new list on every call), surely it should use more memory than p1, which only modifies a single list. But that was not borne out by the observations.
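As a side check on that intuition, list concatenation with `+` does create a new list, but it only copies *references* to the existing objects, never the objects themselves. A small sketch (using a shared string to stand in for `BigObject`) of what that implies:

```python
import sys

big = "a" * 1000000          # ~1 MB string payload
p = [big] * 200              # 200 references to the same object

q = p + [big]                # "+" builds a brand-new list...
# ...but it only copies references (one machine word each on CPython),
# not the referenced objects:
print(q[-1] is big)                 # the new element is shared, not copied
print(sys.getsizeof(q))             # a few KB for the list structure itself
print(sys.getsizeof(q) < len(big))  # far smaller than one ~1 MB payload
```

So each temporary list in p2 costs only a pointer per element; the BigObject instances themselves are what dominate the process footprint in both functions.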

What's going on?

Martijn Pieters
johnny
  • `memory_profiler` is useless when measuring Python memory details. It measures overall process memory allocations, which says nothing about how Python uses the memory heap it has control over. – Martijn Pieters Nov 29 '18 at 16:19
  • See the duplicate, where I cover a different memory test and show how `tracemalloc` can actually tell us what is happening. – Martijn Pieters Nov 29 '18 at 16:21
  • I think the memory profiler works; the reason the memory usage is the same is that the new lists mostly consist of references to objects in the old lists. – johnny Nov 29 '18 at 16:30
  • @roganjosh: the default backend is useless for this kind of detailed analysis, yes. – Martijn Pieters Nov 29 '18 at 16:30
  • @roganjosh: it states in the documentation, under FAQ entry *How accurate are the results*: *This module gets the memory consumption by querying the operating system kernel about the amount of memory the current process has allocated, which might be slightly different from the amount of memory that is actually used by the Python interpreter*. The only word wrong there is 'slightly'. – Martijn Pieters Nov 29 '18 at 16:32
  • The default backend is useful to profile overall memory consumption of the whole process, **not** to analyse the memory use of individual objects within the interpreter. – Martijn Pieters Nov 29 '18 at 16:32
  • @MartijnPieters see, they've clearly gone along the lines of `line_profiler` and that is a fantastic module. I'm glad I came across your answer over in the dupe because sticking that in an FAQ seems a bit... underhanded? – roganjosh Nov 29 '18 at 16:33
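For context, the `tracemalloc` approach mentioned in the comments might look roughly like this (a sketch, not the linked answer's actual code). Unlike `memory_profiler`'s default backend, `tracemalloc` tracks allocations made by the interpreter itself, so it can report how much Python-level memory is live at the peak of the recursion:

```python
import tracemalloc

class BigObject:
    def __init__(self):
        self.value = "a" * 1000000

def p2(n, p):
    if n == 0:
        return
    p2(n - 1, p + [BigObject()])

tracemalloc.start()
p2(200, [])
current, peak = tracemalloc.get_traced_memory()
tracemalloc.stop()

# At the deepest call, 200 BigObjects are alive at once, so the traced
# peak is on the order of 200 MB; the temporary lists add almost nothing.
print(f"peak traced allocation: {peak / 1e6:.1f} MB")
```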
