I was reading e-satis's answer to "What does the 'yield' keyword do in Python?", in which he said:
These iterables are handy because you can read them as much as you wish, but you store all the values in memory and this is not always what you want when you have a lot of values
which I don't really agree with. However, I can't comment there.
Then comes this question: Do Python's iterables really store all values in memory?
I used to think so, but I changed my view after reading Python's documentation yesterday.
>>> import sys
>>> def gen(m):
...     n = 0
...     while n < m:
...         yield n
...         n += 1
...
>>> a = [0,1,2,3,4,5,6,7,8,9]
>>> b = range(10)  # b is a range object, which is an iterable
>>> c = gen(10)    # c is a generator (an iterator), which is an iterable too
>>> sys.getsizeof(a)
144
>>> sys.getsizeof(b)
48
>>> sys.getsizeof(c)
72
>>> B = list(b)
>>> C = list(c)
>>> a
[0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
>>> B
[0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
>>> C
[0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
>>> sys.getsizeof(B)
200
>>> sys.getsizeof(C)
160
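One way to convince yourself that a range object does not store its values is that its reported size stays the same no matter how many values it represents (a small check, assuming CPython's `sys.getsizeof` behavior):

```python
import sys

# A range object only keeps start, stop, and step; values are
# computed on demand, so its size does not grow with its length.
small = range(10)
huge = range(10**12)

print(sys.getsizeof(small) == sys.getsizeof(huge))  # True

# Ranges even support O(1) indexing without materializing anything:
print(huge[10**6])  # 1000000
```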
We mistakenly think an iterable stores all its values in memory because we are used to getting all the values when we use one.
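A generator makes this point even more visible: it produces values one at a time and keeps nothing around once they have been consumed. A minimal sketch (reusing the `gen` generator from above, parameterized by `m`):

```python
def gen(m):
    # Nothing is stored up front; each value is produced on demand.
    n = 0
    while n < m:
        yield n
        n += 1

g = gen(3)
print(next(g))   # 0
print(next(g))   # 1
print(list(g))   # [2]  -- only the values not yet consumed
print(list(g))   # []   -- a generator is exhausted after one pass
```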
Am I right?