I have now looked at numpy arrays in more detail. You often read that a numpy ndarray uses less memory, but if you look at the total memory consumption, the ndarray comes out larger than the list.
In the list we have int objects that are 28 bytes each, while in the numpy array we have numpy.int64 objects that are 32 bytes each.
So I don't understand why people say that numpy arrays use less memory, when each numpy.int64 object is four bytes larger than the corresponding int object.
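For reference, the per-element numbers above come from calling sys.getsizeof on a single element. A minimal check (just a sketch, assuming 64-bit CPython where NumPy's default integer dtype is int64) looks like this:

from sys import getsizeof
import numpy as np

print(getsizeof(1))            # 28 bytes for a small Python int on CPython 3.x
print(getsizeof(np.int64(1)))  # 32 bytes for a numpy.int64 scalar object

The full script I used for the comparison is below.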
import numpy as np
from sys import getsizeof


def is_iterable(p_object):
    # True if the object can be iterated over, False otherwise
    try:
        iter(p_object)
    except TypeError:
        return False
    return True


def get_total_size(element, size):
    # Recursively sum getsizeof() for the container and all of its elements
    if not is_iterable(element):
        return size + getsizeof(element)
    size = size + getsizeof(element)
    for new_element in element:
        size = get_total_size(new_element, size)
    return size


if __name__ == "__main__":
    x_list = list(range(100))
    x_array = np.array(x_list)

    print("x_list:")
    print("A list with object references consumes in memory " + str(getsizeof(x_list)) + " Byte(s)")
    print("A list of object references and all objects consumed in memory " + str(get_total_size(x_list, 0)) + " Byte(s)")
    print("")

    print("Numpy-Array:")
    print("A ndarray object references consumes in memory " + str(getsizeof(x_array)) + " Byte(s)")
    print("A ndarray of object references and all objects consumed in memory " + str(get_total_size(x_array, 0)) + " Byte(s)")
    print("")

    print("objecttype", type(x_array[1]), "size in bytes", getsizeof(x_array[1]))
    print("objecttype", type(x_list[1]), "size in bytes", getsizeof(x_list[1]))
Output:
x_list:
A list with object references consumes in memory 1016 Byte(s)
A list of object references and all objects consumed in memory 3812 Byte(s)
Numpy-Array:
A ndarray object references consumes in memory 896 Byte(s)
A ndarray of object references and all objects consumed in memory 4096 Byte(s)
objecttype <class 'numpy.int64'> size in bytes 32
objecttype <class 'int'> size in bytes 28