This post should give you a very good idea of what is going on. In short, CPython keeps freed objects on internal free lists and in its allocator's pools rather than returning the memory to the operating system right away, so releasing a small object often does not shrink the process footprint at all; that retained memory is the overhead you are seeing. For example:
import gc
import memory_profiler

print(memory_profiler.memory_usage()[0])  # baseline
x = [i for i in range(10000)]
print(memory_profiler.memory_usage()[0])  # after building a small list
x = None
print(memory_profiler.memory_usage()[0])  # after dropping the reference
gc.collect()
print(memory_profiler.memory_usage()[0])  # after a forced collection
Output:
7.28515625
7.796875
7.796875
7.796875
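As a cross-check without memory_profiler (Python 3 here, using the stdlib tracemalloc module, which reports Python-level allocations rather than the process footprint): the interpreter really does free the small list internally, even though the process-level numbers above stay flat — the memory goes back to the interpreter's free lists and pools, not to the OS.

```python
import tracemalloc

tracemalloc.start()
x = [i for i in range(10000)]
with_list, _ = tracemalloc.get_traced_memory()     # bytes currently allocated
x = None
without_list, _ = tracemalloc.get_traced_memory()  # after dropping the reference
tracemalloc.stop()

# the interpreter freed the list's memory internally,
# even though the process-level numbers did not move
print(with_list - without_list)
```

So "no change" in memory_profiler's output does not mean the list leaked; it means the freed memory was retained for reuse.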
But when I ran the same code with a really huge list, the results were different:
import gc
import memory_profiler

print(memory_profiler.memory_usage()[0])  # baseline
x = [i for i in range(10000000)]
print(memory_profiler.memory_usage()[0])  # after building a huge list
x = None
print(memory_profiler.memory_usage()[0])  # after dropping the reference
gc.collect()
print(memory_profiler.memory_usage()[0])  # after a forced collection
Output:
7.3515625
387.31640625
311.30859375
94.7890625
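One thing worth noting about the gc.collect() step: a plain list of integers contains no reference cycles, so reference counting frees it the moment the last reference goes away; the cycle collector exists for objects that keep each other alive. A minimal sketch of the case gc.collect() is actually for (the Node class is hypothetical, just to build a cycle):

```python
import gc

class Node:
    """Hypothetical object that can participate in a reference cycle."""
    def __init__(self):
        self.ref = None

a = Node()
b = Node()
a.ref = b
b.ref = a          # a <-> b: a reference cycle

del a, b           # refcounts never reach zero, so refcounting alone cannot free them
collected = gc.collect()
print(collected)   # the cycle collector finds and frees the unreachable objects
```

For the list-of-ints case above, the further drop after gc.collect() is therefore not cyclic garbage being found; it coincides with the allocator giving now-empty blocks back to the system.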
So, if everything above is true and the overhead truly comes from memory retained after the free lists are exhausted, let's try releasing the memory explicitly, similar to this post:
import gc
import memory_profiler

def release_list(a):
    del a[:]  # empty the list in place
    del a     # drop the function-local reference

print(memory_profiler.memory_usage()[0])  # baseline
x = [i for i in range(10000000)]
release_list(x)
print(memory_profiler.memory_usage()[0])  # after clearing the list in place
x = None
print(memory_profiler.memory_usage()[0])  # after dropping the (now empty) list
gc.collect()
print(memory_profiler.memory_usage()[0])  # after a forced collection
Output:
7.34765625
318.3359375
318.3359375
96.3359375
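A quick note on why release_list uses del a[:] rather than just rebinding the name: the slice deletion empties the list in place, which matters when other references to the same list object exist, whereas del a (or x = None) only removes one name.

```python
a = [1, 2, 3]
b = a        # b refers to the same list object as a
del a[:]     # empties the list in place...
print(b)     # ...so b sees the change: []
```

This is why, in the run above, the big allocation is already gone by the time x = None executes: the list was emptied inside release_list.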
Clearly, once the last reference to a really big list is dropped, whether via release_list or x = None, the bulk of that memory can be reclaimed, and gc.collect() hands a further chunk back to the operating system. In practice, the memory Python retains on its free lists is usually small enough that it makes no noticeable difference for typical workloads.
Additional resources:
http://deeplearning.net/software/theano/tutorial/python-memory-management.html
What is the value of None in memory?