I'm trying to run a loop that populates a large list until memory usage reaches 60%, uses that list, deletes it, then runs it all again. Something like this:
```python
import psutil

ls = []
while True:
    usage = psutil.virtual_memory().percent
    print(usage)
    if usage < 60:
        ls.append(bigThing)   # keep growing the list while under 60%
    else:
        break

use_list(ls)
# seems redundant, but did this just in case
del ls
```
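For what it's worth, `psutil.virtual_memory().percent` is the system-wide figure; I could also print the notebook process's own share to compare against it. A sketch (`os.getpid()` just names the current kernel process):

```python
import os
import psutil

proc = psutil.Process(os.getpid())       # the current notebook kernel process
print(psutil.virtual_memory().percent)   # system-wide memory usage (%)
print(proc.memory_percent())             # this process's share of physical RAM (%)
```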
Regardless of what I did, the memory usage seemed to converge to around 60. The first time I run the cell, the output might look like this:
```
30
35
40
50
62
```
The next few times I run the cell, it would look like this, as though there's a slight reduction that stays static until a threshold is reached:
```
58.7
58.7
58.7
58.7
58.7
60
```
After that, it would just go straight to 60.1, for instance, and never iterate.
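One thing I'm unsure about is whether the interpreter is simply holding on to freed blocks. A variant I could try is forcing a garbage-collection pass after the delete, a sketch using the standard-library `gc` module:

```python
import gc

use_list(ls)
del ls
gc.collect()  # force a full collection pass in case anything is lingering in reference cycles
```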
If I ran any code that raised an error, for instance deleting `ls` twice in a row, it seemed to reset the memory usage: it would drop back down to 30, and I could iterate again. However, if I did the same thing inside a `try` block, the usage stayed the same.
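Concretely, the two cases I mean (a hypothetical sketch):

```python
# Case 1: the error escapes the cell -- afterwards the reported usage dropped back to ~30
del ls
del ls  # NameError: name 'ls' is not defined

# Case 2: the same error swallowed inside try/except -- usage stayed where it was
try:
    del ls
    del ls
except NameError:
    pass
```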
This makes me think there's some moving cap that resets when there's an error, but that's why I'm here: how the heck do I keep it from "capping out" like this?