I have a problem with splitting a NumPy array and a list into two parts. Here is my code:
import numpy as np

X = []
y = []
for seq, target in ConvertedData:
    X.append(seq)
    y.append(target)
y = np.vstack(y)
train_x = np.array(X)[:int(len(X) * 0.9)]
train_y = y[:int(len(X) * 0.9)]
validation_x = np.array(X)[int(len(X) * 0.9):]
validation_y = y[int(len(X) * 0.9):]
This is a sample of the code that prepares data for a neural network. It works, but raises an out-of-memory error (I have 32 GB of RAM on board):
Traceback (most recent call last):
File "D:/Projects/....Here is a file location.../FileName.py", line 120, in <module>
validation_x = np.array(X)[int(len(X) * 0.9):]
MemoryError
It seems like it keeps the list X and the array y in memory and duplicates them as the separate variables train_x, train_y, validation_x and validation_y. Do you know how to deal with this?
Shape of X: (324000, 256, 24)
Shape of y: (324000, 10)
Shape of train_x: (291600, 256, 24)
Shape of train_y: (291600, 10)
Shape of validation_x: (32400, 256, 24)
Shape of validation_y: (32400, 10)
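If the sequences are float64, X alone is roughly 324000 * 256 * 24 * 8 bytes ≈ 16 GB, so calling np.array(X) twice (once for train_x and once for validation_x) on top of the original Python list would not fit in 32 GB. I was thinking of something like the sketch below, which builds the array only once and relies on basic slicing returning views instead of copies, but I'm not sure this is the right way to deal with it:

import numpy as np

# same data preparation as above; ConvertedData yields (seq, target) pairs
X, y = [], []
for seq, target in ConvertedData:
    X.append(seq)
    y.append(target)

X = np.asarray(X)          # build the big array exactly once instead of twice
y = np.vstack(y)
split = int(len(X) * 0.9)  # 291600 of the 324000 samples

# basic slicing returns views into the same buffers, so no extra copies here
train_x, train_y = X[:split], y[:split]
validation_x, validation_y = X[split:], y[split:]

# rebinding X above should let the original Python list be garbage-collected,
# as long as nothing else (e.g. ConvertedData) still references those objects

Would this avoid the duplicates, or do I also need to free ConvertedData explicitly?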