I am using optimizer.get_config()
to get the final state of my adam optimizer (as in https://stackoverflow.com/a/60077159/607528) however .get_config()
is returning the initial state. I assume this means one of the following:

- .get_config() is supposed to return the initial state
- my optimizer is not updating because I've set something up wrong
- my optimizer is not updating because tf's adam is broken (highly unlikely)
- my optimizer is updating but is being reset somewhere before I call .get_config()
- something else?
I originally noticed the issue in a full project with training and validation sets etc., but here is a minimal snippet that seems to reproduce it:
    import tensorflow as tf
    import numpy as np

    x = np.random.rand(100, 1)  # 2-D so each sample has a feature dimension for Dense
    y = (x * 3).round().ravel()

    model = tf.keras.models.Sequential([
        tf.keras.layers.Dense(128, activation='relu'),
        tf.keras.layers.Dense(10, activation='softmax')
    ])
    model.compile(optimizer='adam',
                  loss='sparse_categorical_crossentropy',
                  metrics=['accuracy'])
    model.fit(x, y, epochs=500)
    model.evaluate(x, y)
    model.optimizer.get_config()
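For what it's worth, here is a sketch of how I checked whether the optimizer actually updates during training. My understanding (based on the tf.keras docs, so treat it as an assumption) is that get_config() reports only hyperparameters like learning_rate and beta_1, while the mutable state (the step counter and Adam's moment slots) lives in optimizer.iterations and optimizer.variables():

    import numpy as np
    import tensorflow as tf

    # Same toy setup as above, with fewer epochs for speed
    x = np.random.rand(100, 1)
    y = (x * 3).round().ravel()

    model = tf.keras.models.Sequential([
        tf.keras.layers.Dense(128, activation='relu'),
        tf.keras.layers.Dense(10, activation='softmax')
    ])
    model.compile(optimizer='adam',
                  loss='sparse_categorical_crossentropy',
                  metrics=['accuracy'])

    # Snapshot the config before and after training
    before = model.optimizer.get_config()
    model.fit(x, y, epochs=2, verbose=0)
    after = model.optimizer.get_config()

    print(before == after)  # the config dict itself does not change
    # The step counter does advance, so the optimizer is updating:
    print(int(model.optimizer.iterations.numpy()))

If that's right, the config being identical before and after fit() would be expected behavior rather than a reset, and the per-variable Adam state would be in model.optimizer.variables() instead.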