
I saved an RNN (GRU) model using model.save, but when I call fit after loading the model, it messes up the weights and the predictions become incorrect. If I load the model and predict without calling fit, the predictions are correct.

from keras.optimizers import Adam

opt = Adam(lr=0.0001, beta_1=0.9, beta_2=0.999, decay=0.01)
rnn_model.compile(loss='binary_crossentropy', optimizer=opt, metrics=["accuracy"])
rnn_model.save('./models/my_model.h5')

from keras.models import load_model

# This predicts correctly
model = load_model('my_model.h5')
model.predict(x)

# This does NOT predict correctly
model = load_model('my_model.h5')
model.fit(X, Y, batch_size=5, epochs=1)
model.predict(x)

Update (workaround found): I haven't figured out the root cause. It seems the model I was loading had been saved with Keras 2.0.6 and I was loading it with Keras 2.1.5. Something in the save_weights and load_weights functions was not working across versions, so I had to copy the weights layer by layer onto an architecture I rebuilt manually (loading the architecture from the saved model via JSON worked as well):

for layer_loaded, layer_built in zip(loaded_model.layers, built_model.layers):
    layer_built.set_weights(layer_loaded.get_weights())
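A slightly fuller sketch of this workaround, rebuilding the architecture from JSON and then copying the weights across; the JSON file name is a placeholder and assumes the architecture was exported earlier with to_json():

from keras.models import load_model, model_from_json

# Model saved under the older Keras version (path from the question)
loaded_model = load_model('my_model.h5')

# Rebuild the architecture from a JSON export (hypothetical file name)
with open('my_model_architecture.json') as f:
    built_model = model_from_json(f.read())

# Transfer the weights layer by layer onto the rebuilt architecture
for layer_loaded, layer_built in zip(loaded_model.layers, built_model.layers):
    layer_built.set_weights(layer_loaded.get_weights())

# Compile built_model with the desired optimizer settings before calling fit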

JSRambal
  • Try to run model.compile again after loading the model – Mohamed Elzarei Mar 20 '18 at 01:04
    @MohamedElzarei I am trying to avoid that because I don't want to change the optimizer parameters such as the learning rate (which is different from the initial learning rate due to decay). – JSRambal Mar 20 '18 at 01:50
  • Possible duplicate of [Loading a trained Keras model and continue training](https://stackoverflow.com/questions/42666046/loading-a-trained-keras-model-and-continue-training) – Mohamed Elzarei Mar 20 '18 at 02:04
    @MohamedElzarei Sorry, I don't understand: which of the solutions applies to my case? – JSRambal Mar 20 '18 at 02:18
  • @MohamedElzarei I tried the solution using callbacks, but that did not work well. I'm so confused. – JSRambal Mar 20 '18 at 03:22

1 Answer


This looks very much like a problem with the optimizer. For example, if the learning rate is not saved at the value it had reached by the end of the last training run, it gets re-initialized to a larger value when the model is loaded. In that case, running fit can mess up the weights.
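A minimal sketch of how to check this, assuming Keras 2.1.x where the Adam optimizer keeps lr, decay and iterations as backend variables (the file name is the one from the question):

from keras import backend as K
from keras.models import load_model

model = load_model('my_model.h5')

# Inspect the optimizer state that came back from the HDF5 file
print(K.get_value(model.optimizer.lr))          # base learning rate
print(K.get_value(model.optimizer.decay))       # decay factor
print(K.get_value(model.optimizer.iterations))  # training steps taken so far

# If these look wrong, they can be overridden before calling fit again
K.set_value(model.optimizer.lr, 0.0001)

If the decay or iterations counter comes back different from what the original training run ended with, the effective learning rate on the next fit call will jump, which would match the symptom described in the question.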

Andrey Kite Gorin