
I'm trying to run a loop that populates a large list until my memory usage reaches 60%, use that list, delete it, then run it all again. Something like this:

import psutil

while True:
    ls = []
    usage = psutil.virtual_memory().percent
    print(usage)
    if usage < 60:
        ls.append(bigThing)
    else:
        break
    use_list(ls)

    # seems redundant, but did this just in case
    del ls

Regardless of what I did, it seemed like the memory usage converged to around 60. The first time I ran the cell it might look like this:

30
35
40
50
62

The next few times I ran the cell it would look like this, where there seems to be a slight reduction that stays static until a threshold is reached:

58.7
58.7
58.7
58.7
58.7
60

Then it would just go straight to 60.1, for instance, and never iterate.

If I ran any code with an error (for instance, deleting `ls` twice in a row), the memory usage seemed to reset: it would drop back down to 30 and I could iterate again. However, if I triggered the same error inside a `try` block, usage stayed the same. This makes me think there's some moving cap that resets when there's an error. That's why I'm here: how the heck do I keep it from "capping out" like this?

Warlax56
  • (the question was why `del` was where it was) The point of this is to try to delete ls before another iteration, so that the value of `usage` will be lower on the next iteration, but as mentioned in the question, it stays (I think superficially) high – Warlax56 Nov 04 '20 at 06:16
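For context on why usage can stay "superficially" high after `del`: `del ls` removes only the name binding, and CPython frees the object via reference counting only once nothing else refers to it. In a notebook, hidden references (e.g. IPython's output cache) can keep an object alive after the name is gone. A minimal sketch of the binding semantics:

```python
ls = list(range(1000))
alias = ls      # a second reference to the same list object

del ls          # unbinds the name `ls`; the list itself survives
                # because `alias` still refers to it
try:
    ls
except NameError:
    print("ls is unbound")      # the name no longer exists

print(len(alias))               # the data was never freed
```

So a high reading after `del` doesn't necessarily mean `del` failed; it can mean something else still holds a reference.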

1 Answer


Based on this question, the Python garbage collector won't immediately free unreferenced memory on its own. It might depend on your use case, but for me, importing gc and calling gc.collect() at certain points, to manually tell the garbage collector to clean up, worked well. So my current program looks kind of like this:

import gc

while True:
    ls = make_list_to_mem_thresh()
    use_list(ls)
    del ls
    gc.collect()
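To make this runnable end to end, here's a sketch with stand-in versions of `make_list_to_mem_thresh` and `use_list` (both hypothetical: the stand-in grows the list to a fixed size instead of polling `psutil` for a 60% threshold, and the loop runs three rounds instead of forever):

```python
import gc

def make_list_to_mem_thresh(limit=10_000):
    # stand-in: grow the list to a size cap instead of a memory threshold
    ls = []
    while len(ls) < limit:
        ls.append(bytearray(1024))
    return ls

def use_list(ls):
    # stand-in workload: total up the bytes held by the list
    return sum(len(chunk) for chunk in ls)

for _ in range(3):           # three rounds instead of `while True`
    ls = make_list_to_mem_thresh()
    total = use_list(ls)
    del ls                   # drop our reference to the big list...
    gc.collect()             # ...and sweep any cyclic leftovers before the next round
    print(total)
```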

It's likely that when a cell fails, it triggers the garbage collector. If that's the case, it would explain the "capping" phenomenon described in the question.
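One note on why an explicit `gc.collect()` can matter even after `del`: CPython's reference counting frees most objects immediately, but it cannot reclaim reference cycles; those wait for the cyclic garbage collector. A minimal demonstration:

```python
import gc

# build a reference cycle that reference counting alone cannot reclaim
a, b = [], []
a.append(b)
b.append(a)
del a, b                # both names are gone, but the lists point at each other

found = gc.collect()    # full collection; returns the number of unreachable objects found
print(found >= 2)       # at least the two cyclic lists were found and freed
```

Calling `gc.collect()` right after `del` forces that sweep to happen now rather than whenever the collector's thresholds trip.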
