I have the following code:
from tensorflow import keras
from tensorflow.keras import layers
from tensorflow.keras.models import Model


def create_keras_model(num_classes):
    """
    This function compiles and returns a Keras model.
    Should be passed to KerasClassifier in the Keras scikit-learn API.
    """
    input_shape = (28, 28, 1)
    x_in = keras.Input(shape=input_shape)
    x = layers.Conv2D(32, kernel_size=(3, 3), activation="relu")(x_in)
    x = layers.Dropout(0.25)(x, training=True)  # dropout stays active at inference
    x = layers.MaxPool2D(pool_size=(2, 2))(x)
    x = layers.Conv2D(64, kernel_size=(3, 3), activation="relu")(x)
    x = layers.Dropout(0.25)(x, training=True)
    x = layers.MaxPool2D(pool_size=(2, 2))(x)
    x = layers.Flatten()(x)
    x = layers.Dropout(0.5)(x, training=True)
    x = layers.Dense(num_classes, activation="softmax")(x)
    model = Model(inputs=x_in, outputs=x)
    model.compile(loss='categorical_crossentropy',
                  optimizer='adam',
                  metrics=['accuracy'])
    return model
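For context, the build function is meant to be used roughly like this (a sketch only; I'm assuming the legacy tf.keras scikit-learn wrapper here, and num_classes=10 is just an example value, SciKeras would take model= instead of build_fn=):

from tensorflow.keras.wrappers.scikit_learn import KerasClassifier

# Wrap the build function so the model plugs into scikit-learn tooling
# (cross-validation, grid search, ...). The wrapper routes the extra keyword
# arguments to create_keras_model and to fit() as appropriate.
clf = KerasClassifier(build_fn=create_keras_model, num_classes=10,
                      epochs=5, batch_size=128, verbose=0)
# clf.fit(x_train, y_train)  # x_train shaped (n, 28, 28, 1)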
I need training=True for my purposes. However, once that is done I need the Dropout layers to behave as if training=False. Is there an easy way to achieve that?
One way would be to save the model weights and load them into a second model that does not have any Dropout layers in the first place, but this seems overcomplicated.
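Just to illustrate, that workaround would look something like this (create_keras_model_no_dropout is a hypothetical copy of the function above with the Dropout layers removed):

# Dropout layers carry no weights, so the flat weight lists of the two
# models line up exactly and can be copied over in one call.
mc_model = create_keras_model(num_classes=10)                # Dropout always on
# ... train / use mc_model with dropout active ...
det_model = create_keras_model_no_dropout(num_classes=10)    # same layers, no Dropout
det_model.set_weights(mc_model.get_weights())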
Setting "trainable=False" like:
model.layers[-2].training = False
model.layers[-5].training = False
model.layers[-8].training = False
does not work. Calling predict several times on the same input data still yields different results.
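For reference, this is roughly how I check it (dummy random input, num_classes=10 is just a placeholder):

import numpy as np

model = create_keras_model(num_classes=10)
x = np.random.rand(4, 28, 28, 1).astype("float32")  # dummy batch
p1 = model.predict(x)
p2 = model.predict(x)
print(np.allclose(p1, p2))  # still False: dropout is applied on every forward pass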