
This is a follow-up to a Stack Overflow answer from 2009:

How can I explicitly free memory in Python?

> Unfortunately (depending on your version and release of Python) some types of objects use "free lists" which are a neat local optimization but may cause memory fragmentation, specifically by making more and more memory "earmarked" for only objects of a certain type and thereby unavailable to the "general fund".
>
> The only really reliable way to ensure that a large but temporary use of memory DOES return all resources to the system when it's done, is to have that use happen in a subprocess, which does the memory-hungry work then terminates. Under such conditions, the operating system WILL do its job, and gladly recycle all the resources the subprocess may have gobbled up. Fortunately, the multiprocessing module makes this kind of operation (which used to be rather a pain) not too bad in modern versions of Python.
>
> In your use case, it seems that the best way for the subprocesses to accumulate some results and yet ensure those results are available to the main process is to use semi-temporary files (by semi-temporary I mean, NOT the kind of files that automatically go away when closed, just ordinary files that you explicitly delete when you're all done with them).

It's been 10 years since that answer, and I am wondering if there is a better way to create some sort of process/subprocess/function/method that releases all of its memory when completed.

The motivation for this is an issue I am having, where a for loop raises a memory error despite creating no new variables:

Repeated insertions into sqlite database via sqlalchemy causing memory leak?

The loop inserts into a database. I know it's not the database itself that is causing the memory error, because when I restart my runtime the database is still preserved, yet the crash doesn't happen until another several hundred iterations of the for loop.

SantoshGupta7
  • If it's python that's giving the memory error, maybe see if invoking the garbage collector manually (say, every 50 or so iterations) fixes your problem (see [the `gc` package](https://docs.python.org/3/library/gc.html) for info on how to do that). I think it's likely that, while *your* code isn't creating any new variables, the code it's *calling* is - and this might clean that up. If the problem only happens after a few hundred iterations, it could be that the garbage collector simply isn't fast enough. Just a possibility, though – Green Cloak Guy Jun 07 '19 at 16:59
  • I have tried invoking the garbage collector every iteration. I haven't considered every say 50 iterations. Is there a reason why the latter may work better? Or was every 50 to save on speed? – SantoshGupta7 Jun 07 '19 at 17:01
  • Every 50 was to save on speed, yeah. Well, if that didn't help, maybe you can still use the other functions of the `gc` class and a debugger to see if you can isolate *what* is causing the problem, but otherwise I'm out of ideas – Green Cloak Guy Jun 07 '19 at 17:02
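The suggestion from the comments, invoking the collector manually every N iterations, is a short sketch (the loop body here is a stand-in for the real per-iteration work):

```python
import gc

N = 50  # collect every 50 iterations, as suggested, to limit the speed cost

for i in range(500):
    chunk = [0] * 10_000  # placeholder for the real work done each iteration
    if (i + 1) % N == 0:
        # Force a full collection, which also reclaims cyclic garbage
        # that reference counting alone cannot free.
        unreachable = gc.collect()
```

`gc.collect()` returns the number of unreachable objects it found, which can itself be a useful diagnostic: a steadily growing return value suggests code somewhere is creating reference cycles each iteration.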

0 Answers