I have a large multidimensional array where one axis actually contains the interleaved real and imaginary parts of complex numbers.
Here is the code I would like to optimize:
import numpy as np

big_matrix = np.random.random((8, 160, 23, 3, 23, 80))  # ~1240 MB of float64
tmp1 = np.zeros((8, 80, 23, 3, 23, 80))  # ~620 MB
tmp2 = np.zeros((8, 80, 23, 3, 23, 80))  # ~620 MB
for ii in np.arange(80):
    tmp1[:, ii, :, :, :, :] = big_matrix[:, 2 * ii, :, :, :, :]      # real parts (even indices)
    tmp2[:, ii, :, :, :, :] = big_matrix[:, 2 * ii + 1, :, :, :, :]  # imaginary parts (odd indices)
final_matrix = np.vectorize(complex)(tmp1, tmp2)  # ~1240 MB of complex128
a = np.sum(final_matrix)
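To make the interleaving convention concrete, here is the same even/odd split on a tiny 1-D example (a minimal sketch with made-up values; the slicing is equivalent to what the loop above does along the second axis):

import numpy as np

interleaved = np.array([1.0, 2.0, 3.0, 4.0])  # [re0, im0, re1, im1]
re = interleaved[0::2]  # even indices -> real parts: [1., 3.]
im = interleaved[1::2]  # odd indices -> imaginary parts: [2., 4.]
z = np.vectorize(complex)(re, im)  # array([1.+2.j, 3.+4.j])
print(z)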
The theoretical memory size of big_matrix should be (8*160*23*3*23*80)*8/(1024**2) = 1240 MB, and since tmp1 and tmp2 are each half that size while final_matrix holds half as many elements at twice the bytes per element, I was expecting a total memory consumption of about 3.7 GB. Instead, my memory consumption went up to 11 GB. I do not understand why. How can I optimize my program so that it does the same thing at a lower memory cost?
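For completeness, here is how I arrive at the 3.7 GB figure, computed from element counts and dtype item sizes (nothing is allocated here; final_matrix is complex128, i.e. 16 bytes per element):

import numpy as np

n_big = 8 * 160 * 23 * 3 * 23 * 80  # elements in big_matrix
n_half = 8 * 80 * 23 * 3 * 23 * 80  # elements in tmp1, tmp2 and final_matrix

f8 = np.dtype(np.float64).itemsize      # 8 bytes
c16 = np.dtype(np.complex128).itemsize  # 16 bytes

print(n_big * f8 / 1024**2)    # big_matrix:   ~1240 MB
print(n_half * f8 / 1024**2)   # tmp1 / tmp2:  ~620 MB each
print(n_half * c16 / 1024**2)  # final_matrix: ~1240 MB
# Total: 1240 + 620 + 620 + 1240 = 3720 MB, i.e. about 3.7 GB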
Thank you,
Sam.