
If I run a function in Python 3 (func()), is it possible that objects created inside func(), which cannot be accessed after it has finished, would cause the program's memory usage to increase?

For instance, will running

def func():
    # Objects created here cannot be accessed after the call ends.
    pass

while True:
    func()

ever cause the program to run out of memory, no matter what is in func()?

If the program's memory usage keeps growing, what could be going on inside func() that causes it to keep holding memory after it has returned?

Edit: I'm only asking about objects that can no longer be accessed after the function has ended, so they should be eligible for deletion.

Tom Burrows
    Short answer is yes - `non_local_list.append(1)` would be a trivial example. – jonrsharpe Jun 06 '17 at 21:07
    Can you explain why you want such behavior? Maybe this site is out of scope... – tupui Jun 06 '17 at 21:08
    Another trivial example: `def f():''; f.__doc__ += ' '` – PM 2Ring Jun 06 '17 at 21:14
  • @Y0da, I don't want this behaviour, but I have a function that seems to act like this, and I thought all objects in a function are deleted upon exit of the function. – Tom Burrows Jun 07 '17 at 08:27
    In that case, you need to show us a [mcve] that exhibits this behaviour. We can't debug code that we can't run or even see. – PM 2Ring Jun 07 '17 at 08:40
    Yes I appreciate that, and I am creating one, but I just wanted to check that it was possible there was a problem first (and I wasn't just going crazy!) – Tom Burrows Jun 07 '17 at 08:41
    Python functions are [*not pure*](https://en.wikipedia.org/wiki/Pure_function), they are allowed to touch anything in the Python runtime they can lay their hands on. The Python runtime is inherently introspectable and mutable, so *in general* you can't trust functions to not eat memory. There are numerous ways of making objects grow from a function, too many to recount. Perhaps you want to focus on a specific case instead? – Martijn Pieters Jun 08 '17 at 16:44
    @Shiva I don't think this question is too broad because it is straight forward and can be sufficiently answered in a couple paragraphs. I think this question should be reopened. – Uyghur Lives Matter Jun 08 '17 at 17:03
  • @PM2Ring There may be a debugging issue that prompted this question but that's beside the point. I don't think a M.C.V.E. should be required because this question isn't about debugging. – Uyghur Lives Matter Jun 08 '17 at 17:03
    @cpburnz As Martijn says, there are numerous ways of making objects grow from a function, so an answer consisting of a couple of paragraphs will be rather general. IMHO. But if the OP can narrow down the scope of the question with a MCVE containing specific code that may cause unexpected memory gobbling, then I'm happy to cast a re-open vote. – PM 2Ring Jun 08 '17 at 17:09
  • @PM2Ring I think you're focusing too much on the potential problem rather than the actual question itself. If the OP wants to know why a specific function doesn't seem to release memory, then he by all means should provide an M.C.V.E. But he's asking more of a general question to which a general answer similar to Martjin's comment would suffice. – Uyghur Lives Matter Jun 08 '17 at 17:19
  • @cpburnz the question that you're suggesting be answered is not the question that is asked here. That question would be: _"Can Python functions access variables that exist outside of the function definition?"_ –  Jun 08 '17 at 18:16
  • I've added an addendum to the bottom that should help to make it less broad. – Tom Burrows Jun 13 '17 at 14:01

1 Answer


Yes, it is possible for a Python function to still use memory after being called.

  • Python uses garbage collection (GC) for memory management. Most GCs (I suppose there could be some exceptions) make no guarantee about whether or when they will free the memory of unreferenced objects. Say you have a function consume_lots_of_memory() and call it as:

    while True:
        consume_lots_of_memory()
    

    There is no guarantee that all of the memory allocated in the first call to consume_lots_of_memory() will be released before it is called a second time. Ideally the GC would run right after each call finishes, but it might just as well run halfway through the fifth call. So depending on when the GC runs, you could end up consuming more memory than you expect, and possibly even run out of memory.
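    A minimal sketch of this delayed-collection behavior (consume_lots_of_memory() here is a hypothetical stand-in that creates reference cycles, which plain reference counting cannot free; only the cyclic collector can):

    ```python
    import gc

    def consume_lots_of_memory():
        # Two lists referencing each other form a reference cycle.
        # Refcounting alone never frees them; they linger until the
        # cyclic garbage collector runs.
        a, b = [], []
        a.append(b)
        b.append(a)

    gc.disable()  # simulate the collector simply not having run yet
    for _ in range(1000):
        consume_lots_of_memory()

    # Manually trigger a collection; it returns the number of
    # unreachable objects it found — all the cycles left behind.
    unreachable = gc.collect()
    gc.enable()

    print(unreachable > 0)  # True: memory lingered after the calls ended
    ```

    Until gc.collect() runs, all those unreachable lists are still occupying memory, even though no code can reach them anymore.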

  • Your function could be modifying global state, using large amounts of memory that never get released. Say you have a module-level cache and a function cache_lots_of_objects() called as:

    module_cache = {}
    while True:
        cache_lots_of_objects()
    

    Every call to cache_lots_of_objects() only ever adds to the cache, and the cache just keeps consuming more memory. Even if the GC promptly releases the non-cached objects created in cache_lots_of_objects(), your cache could eventually consume all of your memory.
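    A hypothetical sketch of such an unbounded cache (the key and the stored list are placeholders for whatever large objects your real function produces):

    ```python
    # Module-level cache: it outlives every function call, so anything
    # stored in it stays referenced and can never be garbage collected.
    module_cache = {}

    def cache_lots_of_objects(key):
        # Each call adds a new entry and nothing is ever evicted,
        # so the process's memory footprint grows monotonically.
        module_cache[key] = [0] * 1000  # stand-in for a large object

    for i in range(100):
        cache_lots_of_objects(i)

    print(len(module_cache))  # 100 entries still referenced after the loop
    ```

    Bounding the cache (e.g. with functools.lru_cache(maxsize=...) or by evicting old entries) is the usual fix for this pattern.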

  • You could be encountering an actual memory leak from Python itself (unlikely but possible), or from a third-party library improperly using the C API, using a leaky C library, or incorrectly interfacing with a C library.

  • One final note about memory usage. Just because Python has freed allocated objects, it does not necessarily mean that the memory will be released from the process and returned to the operating system. The reason has to do with how memory is allocated to a process in chunks (pages). See abarnert's answer to Releasing memory in Python for a better explanation than I can offer.

Uyghur Lives Matter