I'm trying to update an LSTM's weights in Keras based on a self-defined loss function. I found that PyTorch has torch.optim.Adam(model.parameters()) for this. Is there an equivalent in Keras, or any good approach in Keras to update the weights via batch gradient descent on a custom loss function?
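For context, here is what I've understood so far: in Keras you don't pass parameters to the optimizer yourself; instead you pass any callable `(y_true, y_pred) -> loss` to `model.compile()`, and `fit()` runs mini-batch gradient descent on it. A minimal sketch of what I mean (the loss, layer sizes, and data shapes below are just placeholders I made up):

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras

# Placeholder custom loss: MSE plus a small L1 penalty on the error.
def custom_loss(y_true, y_pred):
    err = y_true - y_pred
    return tf.reduce_mean(tf.square(err)) + 0.01 * tf.reduce_mean(tf.abs(err))

# Toy LSTM: 8 timesteps of 4 features -> 1 output (shapes are arbitrary).
model = keras.Sequential([
    keras.layers.LSTM(16, input_shape=(8, 4)),
    keras.layers.Dense(1),
])

# compile() wires the optimizer to the loss; fit() then performs
# mini-batch gradient descent (Adam here) on that loss.
model.compile(optimizer=keras.optimizers.Adam(learning_rate=1e-3),
              loss=custom_loss)

x = np.random.rand(32, 8, 4).astype("float32")
y = np.random.rand(32, 1).astype("float32")
history = model.fit(x, y, batch_size=8, epochs=1, verbose=0)
```

Is this the right approach, or is there a lower-level way (closer to PyTorch's explicit optimizer step) when the loss doesn't fit the `(y_true, y_pred)` signature?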
Thanks!!!