
According to the documentation, the learningRate is optional in the adam optimizer.

If it is not set, what is the default learningRate?

Is it computed dynamically according to the variables' values?

Julien TASSIN
  • Possible duplicate of [Tensorflow: Confusion regarding the adam optimizer](https://stackoverflow.com/questions/37842913/tensorflow-confusion-regarding-the-adam-optimizer) – bugs Apr 22 '18 at 09:50

1 Answer


As you can see in the source code linked from the doc page, the default values are (currently):

learningRate = 0.001
beta1 = 0.9
beta2 = 0.999
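
For illustration, here is a minimal sketch of how those defaults come into play, assuming the question is about TensorFlow.js's `tf.train.adam` (the camelCase `learningRate` and `beta1`/`beta2` names match its signature):

```typescript
import * as tf from '@tensorflow/tfjs';

// Omitting all arguments falls back to the defaults listed above:
// learningRate = 0.001, beta1 = 0.9, beta2 = 0.999.
const optimizer = tf.train.adam();

// Equivalent to passing the same values explicitly:
const explicit = tf.train.adam(0.001, 0.9, 0.999);
```

Either optimizer instance can then be passed to `model.compile({optimizer, loss: 'meanSquaredError'})` or used directly via `optimizer.minimize(...)`.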
Sebastian Speitel