I have a program that uses and creates very large numpy arrays. However, garbage collection doesn't seem to be releasing the memory of these arrays.
Take the following script as an example, where gc.is_tracked reports that big is not tracked (i.e. is_tracked is False):
import gc
import numpy as np

big = np.ones((100, 100, 100))
print(gc.is_tracked(big))  # False
big2 = big + 1
Calls like gc.get_referrers(big) are redundant here and just return empty lists, since the array isn't tracked in the first place.
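For context, my understanding of is_tracked (checked with plain Python objects, no NumPy involved) is that only objects which could participate in reference cycles get tracked by the cyclic collector:

```python
import gc

# Atomic objects can never form reference cycles, so the cyclic
# garbage collector never tracks them.
print(gc.is_tracked(42))          # False
print(gc.is_tracked("hello"))     # False

# Lists are always tracked.
print(gc.is_tracked([]))          # True

# CPython lazily untracks dicts: an empty dict (or one holding only
# atomic values) is untracked, but it becomes tracked once it can
# reach another container.
print(gc.is_tracked({}))          # False
print(gc.is_tracked({"a": []}))   # True
```

So if ndarray were treated like an atomic object, not being tracked would be expected, but the thread I link below suggests these arrays should be tracked.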
memory_profiler confirms this with the following output, showing that neither del (as expected) nor gc.collect() frees the memory:
Line # Mem usage Increment Occurrences Line Contents
=============================================================
5 212.3 MiB 212.3 MiB 1 @mprof
6 def main():
7 212.3 MiB 0.0 MiB 1 size = 100
8 242.8 MiB 30.5 MiB 1 big = np.ones((size, size, size))
9 242.8 MiB 0.0 MiB 1 print(gc.is_tracked(big))
10 273.3 MiB 30.5 MiB 1 big2=big+1
11 273.3 MiB 0.0 MiB 1 del big
12 273.3 MiB 0.0 MiB 1 gc.collect()
13 273.3 MiB 0.0 MiB 1 big2
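As a sanity check, I also verified that plain reference counting deallocates an object on del alone, with no gc.collect() involved. This is a pure-Python sketch: Blob is just a hypothetical stand-in for a large object, not ndarray, and weakref.finalize is used as a signal that deallocation actually happened:

```python
import weakref

class Blob:
    """Hypothetical stand-in for a large object (not ndarray)."""
    pass

b = Blob()
freed = []
# The finalizer runs when the object is actually deallocated.
weakref.finalize(b, freed.append, True)

del b          # drops the last reference
print(freed)   # [True] -- deallocated by refcounting, no gc.collect() needed
```

So at the Python level, del should already be enough to destroy the array; what I don't understand is why the process memory above never goes down.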
This thread implies that such arrays should be tracked and collected. Does anyone know why these arrays aren't being tracked in my configuration, and how I can make sure they are, or manually free up the memory?
I'm on macOS, using the latest versions of NumPy (1.22.1) and Python (3.9).
Many thanks in advance.