
I want to freeze training of the first two layers of the following model after the 3rd epoch. The total number of epochs is set to 10.

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv2D, MaxPooling2D, Dropout, Flatten, Dense

model = Sequential()
model.add(Conv2D(32, kernel_size=(3, 3),
                 activation='relu',
                 input_shape=input_shape))
model.add(Conv2D(64, (3, 3), activation='relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Dropout(0.25))
model.add(Flatten())
model.add(Dense(128, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(num_classes, activation='softmax'))

2 Answers


How can I "freeze" Keras layers?

To "freeze" a layer means to exclude it from training, i.e. its weights will never be updated. This is useful in the context of fine-tuning a model or using fixed embeddings for a text input.

You can change the trainable attribute of a layer.

for layer in model.layers[:2]:
    layer.trainable = False

For this to take effect, you will need to call compile() on your model after modifying the trainable property. If you don't, you will receive the warning "Discrepancy between trainable weights and collected trainable weights" and all your layers will still be trainable. So:

  • Build and compile the model
  • Train it for 3 epochs
  • Freeze the layers you want
  • Compile the model again
  • Train for the remaining epochs (see the sketch after this list)
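
Putting those steps together, here is a minimal end-to-end sketch. The optimizer, loss, and random placeholder data below are assumptions for illustration, not taken from the question:

import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv2D, MaxPooling2D, Dropout, Flatten, Dense

# Placeholder shapes and data, assumed for illustration only
input_shape = (28, 28, 1)
num_classes = 10
x_train = np.random.rand(64, 28, 28, 1)
y_train = np.random.randint(0, num_classes, 64)

model = Sequential()
model.add(Conv2D(32, kernel_size=(3, 3), activation='relu', input_shape=input_shape))
model.add(Conv2D(64, (3, 3), activation='relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Dropout(0.25))
model.add(Flatten())
model.add(Dense(128, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(num_classes, activation='softmax'))

# 1. Build and compile the model
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

# 2. Train it for 3 epochs with all layers trainable
model.fit(x_train, y_train, epochs=3)

# 3. Freeze the first two layers
for layer in model.layers[:2]:
    layer.trainable = False

# 4. Compile the model again so the change takes effect
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

# 5. Train for the remaining 7 epochs
model.fit(x_train, y_train, epochs=7)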

This should work:

# train the full model for the first 3 epochs
for epoch in range(3):
    model.fit(..., epochs=1)

# save the weights of this model
model.save_weights("weight_file.h5")

# freeze the layers you want
for layer in model.layers[:2]:
    layer.trainable = False

In order to train further with these weights but with the first two layers frozen, you need to compile the model again.

model.compile(...)

# train for the remaining epochs
for epoch in range(3, 10):
    model.fit(..., epochs=1)
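
To verify that the freeze actually took effect after recompiling, you can inspect the model's trainable weights (a quick sanity check; model refers to the model above):

# Each frozen Conv2D layer moves its kernel and bias out of
# trainable_weights and into non_trainable_weights
print(len(model.trainable_weights))
print(len(model.non_trainable_weights))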
It won't work without compiling the model again. You will get the warning after that: "Discrepancy between trainable weights and collected trainable weights". Please see my answer. Secondly, why not just model.fit(..., epochs=7) instead of that for loop? – Piotr Grzybowski Dec 09 '19 at 10:05