I am running a convolutional neural network on an input set that is 5 GB; the training output is the same size, so about 10 GB of data in total. I am reserving about 50 GB of memory but am still running into memory issues. I am using the Adam optimizer, and my model fitting looks like this:
```python
cnn_model.fit(x_train, y_train, validation_data=(x_test, y_test),
              callbacks=[earlystopper], epochs=25)
```
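One thing I was considering is streaming the data in batches instead of passing the full arrays to fit(), so only one batch is in memory at a time. Here is a rough sketch of what I had in mind, assuming the arrays are saved as .npy files (the file names are placeholders for my setup):

```python
import numpy as np
import tensorflow as tf

# Memory-map the arrays so nothing is loaded into RAM up front;
# the file names are placeholders for wherever the data actually lives.
x_train = np.load("x_train.npy", mmap_mode="r")
y_train = np.load("y_train.npy", mmap_mode="r")

def sample_generator():
    # Yield one (input, target) pair at a time; batching happens below.
    for i in range(len(x_train)):
        yield x_train[i], y_train[i]

train_ds = (
    tf.data.Dataset.from_generator(
        sample_generator,
        output_signature=(
            tf.TensorSpec(shape=x_train.shape[1:], dtype=tf.as_dtype(x_train.dtype)),
            tf.TensorSpec(shape=y_train.shape[1:], dtype=tf.as_dtype(y_train.dtype)),
        ),
    )
    .batch(32)
    .prefetch(tf.data.AUTOTUNE)
)

# validation_data could be a second Dataset built the same way
cnn_model.fit(train_ds, epochs=25, callbacks=[earlystopper])
```

Would that actually keep the footprint down to roughly one batch plus prefetch, or am I missing something?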
Any idea how I can improve the situation? Someone in this TensorFlow issue https://github.com/tensorflow/tensorflow/issues/18736 says "Adam and RMSProp are problematic because they memorize historical gradients"; any ideas how to address that?
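If that quote is accurate, my understanding is that Adam keeps two extra moment tensors per weight (and RMSProp one), so recompiling with plain SGD should drop that optimizer state entirely. Something like this, where the learning rate and loss are just placeholders for my actual settings:

```python
from tensorflow.keras.optimizers import SGD

# Vanilla SGD (momentum=0) keeps no per-weight optimizer state, versus
# Adam's two moment tensors per weight; with momentum > 0 it keeps one.
# The learning rate and loss below are placeholders for my real config.
cnn_model.compile(optimizer=SGD(learning_rate=0.01),
                  loss="mean_squared_error")
```

Is that the right way to read the comment in that issue, or is the optimizer state a red herring here, since it scales with the number of model parameters rather than the dataset size?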