
I want my LSTM to keep learning from newer data. It needs to update itself based on the trend in the new data, and I want to save this trained state, say, to a file. Then I want to load this pre-trained file into any other fresh files X, Y, Z where testing is done. In other words, I want to 're-fit' [update, NOT re-train] the model with new data, so that the model parameters are only updated and not re-initialized. I understand this is online learning, but how do I implement it with Keras? Can someone please advise how to implement it successfully?
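A minimal sketch of the workflow asked about above: in Keras, `model.save()` persists architecture, weights, and optimizer state to a file, and calling `fit()` on the reloaded model continues from the saved weights rather than re-initializing them. All names, shapes, and file paths below are illustrative assumptions, not from the original question:

```python
# Hypothetical sketch: update (not re-train) a saved Keras LSTM with new data.
import numpy as np
from tensorflow import keras

# Build and train on the initial data (shapes here are illustrative).
model = keras.Sequential([
    keras.Input(shape=(5, 1)),   # 5 timesteps, 1 feature
    keras.layers.LSTM(8),
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

X_old = np.random.rand(32, 5, 1)
y_old = np.random.rand(32, 1)
model.fit(X_old, y_old, epochs=2, verbose=0)

# Save the full model (architecture + weights + optimizer state) to a file.
model.save("lstm_model.h5")

# Later, e.g. in another script: load and continue fitting on newer data.
# fit() resumes from the saved weights; it does NOT re-initialize them.
model = keras.models.load_model("lstm_model.h5")
X_new = np.random.rand(16, 5, 1)
y_new = np.random.rand(16, 1)
model.fit(X_new, y_new, epochs=1, verbose=0)
```

Repeating the load-then-fit step each time a new batch of data arrives gives the incremental "re-fit" behavior described above.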

  • You should be looking at the idea called "transfer learning". It seems you want to take a model that has been trained on some data, meaning the optimization algorithm has found the best weights for that data. Then you want to "freeze" (look up that term) the weights of that model and use it to give you predictions on newer/different data without training on (altering the weights with) said data. – John Stud Jan 20 '21 at 03:34
  • @JohnStud Thank you. Per your suggestion, I checked this resource: https://machinelearningmastery.com/transfer-learning-for-deep-learning/. However, my data is random data, NOT image data, so which library do I have to import? I am addressing this with a plain LSTM and Keras. – Math Researcher Jan 20 '21 at 09:37
  • There is no library to import; Keras can do all this. Your steps are roughly: 1) train the LSTM on your data, 2) save the model "weights", 3) load new data, 4) freeze the weights of your model (probably done via the optimizer; I use torch), 5) generate predictions on the new test data. – John Stud Jan 20 '21 at 15:23
  • @JohnStud Many thanks. I will check the literature. Is there any GitHub link you can point me to, please? – Math Researcher Jan 20 '21 at 17:12
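The steps listed in the comments above can be sketched in Keras (rather than torch) as follows. In Keras, freezing is done by setting a layer's `trainable` attribute to `False` and recompiling, after which `fit()` leaves that layer's weights untouched. Layer names, shapes, and the weights file path are illustrative assumptions:

```python
# Hypothetical sketch of the commented steps: train, save weights,
# freeze the LSTM, then fit/predict on new data (transfer learning).
import numpy as np
from tensorflow import keras

# 1) Train the LSTM on the original data (shapes are illustrative).
model = keras.Sequential([
    keras.Input(shape=(5, 1)),
    keras.layers.LSTM(8, name="lstm"),
    keras.layers.Dense(1, name="head"),
])
model.compile(optimizer="adam", loss="mse")
model.fit(np.random.rand(32, 5, 1), np.random.rand(32, 1),
          epochs=1, verbose=0)

# 2) Save the model weights to a file.
model.save_weights("lstm.weights.h5")

# 3) Load the saved weights (e.g. in a fresh session) and new data.
model.load_weights("lstm.weights.h5")
X_new = np.random.rand(16, 5, 1)
y_new = np.random.rand(16, 1)

# 4) Freeze the recurrent layer so its weights are not altered;
#    only the Dense head will be updated. Recompile after changing
#    the `trainable` flag for it to take effect.
model.get_layer("lstm").trainable = False
model.compile(optimizer="adam", loss="mse")
model.fit(X_new, y_new, epochs=1, verbose=0)

# 5) Generate predictions on the new test data.
preds = model.predict(X_new, verbose=0)
```

If instead the goal is the question's "update, not re-train" behavior, simply skip step 4 and call `fit()` on the reloaded model so all weights continue updating from their saved values.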

0 Answers