I ran an experiment on how much memory each of Python's container types uses: `list`, `tuple`, `set`, `dict`, and `np.array`. I got the following result.
(The x axis is the length of the container, the y axis is its measured memory size.)
I found that the amount of memory a Python `set` uses increases in steps (as does a `dict`'s), while the others increase linearly, as I expected. I wonder what makes them different.
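The step pattern can be seen with `sys.getsizeof` alone — a minimal sketch, assuming CPython, where `set` and `dict` keep a preallocated hash table that is resized in discrete jumps:

```python
import sys

# In CPython, a set's hash table grows in discrete resizes, so
# getsizeof stays flat for many lengths and then jumps. A list built
# from a sized iterable is allocated to fit, so its size changes with
# nearly every extra element.
set_sizes = [sys.getsizeof(set(range(n))) for n in range(200)]
list_sizes = [sys.getsizeof(list(range(n))) for n in range(200)]

# The set takes far fewer distinct size values over the same range.
print(len(set(set_sizes)), len(set(list_sizes)))
```

This only measures the container object itself (not its elements), which is enough to make the plateaus visible.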
I used the following `get_size()` function. (reference)
import sys

def get_size(obj, seen=None):
    """Recursively estimate the memory footprint of obj in bytes."""
    size = sys.getsizeof(obj)
    if seen is None:
        seen = set()
    obj_id = id(obj)
    if obj_id in seen:
        # Already counted; avoid double-counting shared objects.
        return 0
    seen.add(obj_id)
    if isinstance(obj, dict):
        size += sum(get_size(v, seen) for v in obj.values())
        size += sum(get_size(k, seen) for k in obj.keys())
    elif hasattr(obj, '__dict__'):
        size += get_size(obj.__dict__, seen)
    elif hasattr(obj, '__iter__') and not isinstance(obj, (str, bytes, bytearray)):
        size += sum(get_size(i, seen) for i in obj)
    return size
And I measured the memory for lengths from 0 to 10,000 at intervals of 100.
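The sweep can be sketched like this — a minimal version that uses the shallow `sys.getsizeof` for self-containment (the original used the recursive `get_size` above); the `measure` helper and the lambdas are illustrative, not the original code:

```python
import sys

def measure(make, lengths):
    """Return [(n, size_in_bytes)] for containers built by make(n)."""
    return [(n, sys.getsizeof(make(n))) for n in lengths]

lengths = range(0, 10001, 100)  # length 0 to 10,000 in steps of 100
results = {
    'list':  measure(lambda n: list(range(n)), lengths),
    'tuple': measure(lambda n: tuple(range(n)), lengths),
    'set':   measure(lambda n: set(range(n)), lengths),
    'dict':  measure(lambda n: {i: None for i in range(n)}, lengths),
}
```

Plotting size against `n` from `results` reproduces the linear growth for `list`/`tuple` and the staircase for `set`/`dict`.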
My code: https://repl.it/repls/WanEsteemedLines