
I'm trying to update an LSTM's weights in Keras based on a self-defined loss function. PyTorch has torch.optim.Adam(model.parameters()), which can do that. Is there an equivalent in Keras, or any good approach in Keras for updating the weights with batch gradient descent on a custom loss function?
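
For comparison, in Keras the optimizer is not attached to the parameters the way torch.optim.Adam(model.parameters()) is; it is attached to the model at compile time, and model.fit() then runs the mini-batch gradient descent. A minimal sketch, assuming a toy LSTM (the layer sizes, input shape, and mse loss here are only placeholders):

```python
from keras.models import Sequential
from keras.layers import LSTM, Dense
from keras.optimizers import Adam

# Toy LSTM; the shapes are illustrative only.
model = Sequential([
    LSTM(32, input_shape=(10, 8)),  # 10 timesteps, 8 features per step
    Dense(1)
])

# The optimizer is bound to the model when it is compiled;
# model.fit() then performs batch gradient descent with it.
model.compile(optimizer=Adam(lr=1e-3), loss='mse')
# model.fit(x_train, y_train, batch_size=32, epochs=5)
```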

Thanks!!!

beepretty
  • Is this https://stackoverflow.com/questions/46858016/keras-custom-loss-function-to-pass-arguments-other-than-y-true-and-y-pred what you're looking for? – Mark Loyman Aug 13 '18 at 19:13
  • Thank you, @MarkLoyman! I wasn't clear in my question above. The loss function I defined contains other reward values besides the predicted y and the actual y; eventually, I figured out that I have to use the Keras backend functions or TensorFlow directly. – beepretty Aug 14 '18 at 17:11
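
A minimal sketch of the approach discussed in the comments and the linked question: wrap the loss in a closure so it can see extra tensors (here a hypothetical per-sample reward, fed as an extra model input) alongside y_true and y_pred, using Keras backend ops. The names, shapes, and the weighting formula are illustrative, not from the question:

```python
from keras.models import Model
from keras.layers import Input, LSTM, Dense
from keras.optimizers import Adam
import keras.backend as K

def make_loss(reward):
    # Closure: the loss function captures the extra reward tensor
    # in addition to y_true and y_pred. The formula is only an example.
    def loss(y_true, y_pred):
        return K.mean(K.square(y_true - y_pred) * reward, axis=-1)
    return loss

seq_in = Input(shape=(10, 8))   # illustrative: 10 timesteps, 8 features
reward_in = Input(shape=(1,))   # extra value supplied per sample
out = Dense(1)(LSTM(32)(seq_in))

model = Model(inputs=[seq_in, reward_in], outputs=out)
model.compile(optimizer=Adam(lr=1e-3), loss=make_loss(reward_in))
# model.fit([x_train, rewards], y_train, batch_size=32)
```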

0 Answers