I recently came across this great Stack Overflow post explaining the concept of yield: What does the "yield" keyword do in Python?
I did some exploring myself and found that although a generator is substantially smaller than the equivalent iterable (a list, in this case), it still holds a very small chunk of memory.
What's more interesting to me is that if I increase range(100) to range(10000) and add an intermediate variable, e.g. test = i * 100 + 2 ... yield test, the memory usage of b remains a constant 112 bytes, while that of a increases as I would expect. Is this memory independent of the size of the function the generator references for delayed execution?
from sys import getsizeof
a = [i for i in range(100)]
def b():
for i in range(100):
yield i
b = b()  # rebind the name b to the generator object
getsizeof(a) # 920
getsizeof(b) # 112
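
To make the scaled-up case concrete, here is the variant I tried (a quick sketch; the names a_big, gen_big and test are mine, and the exact list size will vary by Python version and platform):

from sys import getsizeof

a_big = [i for i in range(10000)]

def gen_big():
    for i in range(10000):
        test = i * 100 + 2  # intermediate variable inside the generator body
        yield test

g_big = gen_big()

getsizeof(a_big)  # tens of kilobytes -- grows with the range
getsizeof(g_big)  # 112 -- same as before, despite the larger body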
So the question is: what does b's memory actually hold?
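
In case it helps, here is how I poked at the generator's attributes while exploring (gi_code and gi_frame are CPython's standard generator attributes; getsizeof is shallow, so I assume it is not counting these referenced objects):

from sys import getsizeof

def b():
    for i in range(100):
        yield i

g = b()

# getsizeof reports only the generator object itself,
# not the objects it references.
getsizeof(g)           # 112
getsizeof(g.gi_code)   # the compiled code object (shared with the function)
getsizeof(g.gi_frame)  # the suspended frame holding locals and the paused position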