According to the documentation, the learningRate is optional in the adam optimizer.
If not set, what is the default learningRate?
Is it computed dynamically from the variables' values?
As you can see in the source code linked from the doc page, the default values are (currently):
learningRate = 0.001
beta1 = 0.9
beta2 = 0.999
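So no, the learning rate is not computed dynamically from the variable values; it is a fixed hyperparameter that defaults to 0.001. To make the role of these defaults concrete, here is a minimal sketch of a single Adam update step using the default values above. This is an illustration of the standard Adam formula, not the library's actual implementation; the `eps` stability constant is an assumption (a commonly used value), as it is not listed in the source above.

```python
import math

def adam_step(theta, grad, m, v, t,
              learning_rate=0.001,  # default from the docs
              beta1=0.9,            # default from the docs
              beta2=0.999,          # default from the docs
              eps=1e-8):            # assumed stability constant, not in the docs
    """One Adam update for a scalar parameter theta at timestep t (t >= 1)."""
    # Update biased first- and second-moment estimates of the gradient.
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    # Bias correction compensates for m and v being initialized at zero.
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    # The fixed learning rate scales the bias-corrected step.
    theta = theta - learning_rate * m_hat / (math.sqrt(v_hat) + eps)
    return theta, m, v
```

Note that on the very first step the bias-corrected step size is roughly `learning_rate * sign(grad)`, i.e. about 0.001 here, regardless of the gradient's magnitude.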