I'm currently building a CNN that uses transfer learning to classify images. In my model, there is a tensorflow-hub KerasLayer that uses EfficientNet to create a feature vector.

Here is my code:

import tensorflow as tf
import tensorflow_hub as hub
from tensorflow.keras import layers, models

model = models.Sequential([
    # pre-trained EfficientNet-B7 feature extractor from TF Hub (trainable)
    hub.KerasLayer("https://tfhub.dev/google/efficientnet/b7/feature-vector/1", trainable=True),
    layers.Dropout(DROPOUT),
    layers.Dense(NEURONS_PER_LAYER, kernel_regularizer=tf.keras.regularizers.l2(REG_LAMBDA), activation=ACTIVATION),
    layers.Dropout(DROPOUT),
    layers.Dense(NEURONS_PER_LAYER, kernel_regularizer=tf.keras.regularizers.l2(REG_LAMBDA), activation=ACTIVATION),
    layers.Dropout(DROPOUT),
    layers.Dense(NEURONS_PER_LAYER, kernel_regularizer=tf.keras.regularizers.l2(REG_LAMBDA), activation=ACTIVATION),
    layers.Dropout(DROPOUT),
    layers.Dense(NEURONS_PER_LAYER, kernel_regularizer=tf.keras.regularizers.l2(REG_LAMBDA), activation=ACTIVATION),
    layers.Dropout(DROPOUT),
    layers.Dense(1, activation="sigmoid")
])

I can freeze or unfreeze the entire KerasLayer, but I can't find a way to freeze only the earlier layers and fine-tune the higher-level ones. Can anyone help?

Evan Zheng

1 Answer

You can freeze an entire layer by setting layer.trainable = False. If you load a whole model or build one from scratch, you can use the loop below to find a specific layer to freeze.

# load a model or create a model
model = Model(...)

# first you print out your model summary
model.summary()

# you will get something like this
''' 
Model: "sequential_2"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
inception_resnet_v2 (Model)  (None, 2, 2, 1536)        54336736  
_________________________________________________________________
flatten_2 (Flatten)          (None, 6144)              0         
_________________________________________________________________
dropout_2 (Dropout)          (None, 6144)              0         
_________________________________________________________________
dense_8 (Dense)              (None, 2048)              12584960  
_________________________________________________________________
dense_9 (Dense)              (None, 1024)              2098176   
_________________________________________________________________
dense_10 (Dense)             (None, 512)               524800    
_________________________________________________________________
dense_11 (Dense)             (None, 17)                8721      
=================================================================
'''

# here is a loop for freezing a particular layer (dense_10 in this example)
for layer in model.layers:
    # selecting layer by name
    if layer.name == 'dense_10':
        layer.trainable = False
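
# note: in Keras, a change to layer.trainable only takes effect once the
# model is (re)compiled, so recompile after freezing. This is a minimal
# sketch; the optimizer, loss and metrics below are assumed placeholders,
# not values taken from the question:
model.compile(optimizer='adam',
              loss='binary_crossentropy',
              metrics=['accuracy'])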

# for the hub layer, create the backbone outside your model so its sub-layers are easy to access

# my inception layer (used here in place of the hub layer)
from tensorflow import keras

inception_layer = keras.applications.InceptionResNetV2(weights='imagenet', include_top=False, input_shape=(128, 128, 3))

# create the model and add the backbone as its first layer
model = keras.Sequential()
model.add(inception_layer)

# same trick: print the backbone's summary to find the layer names
inception_layer.summary()

# here is same loop from upper example
for layer in inception_layer.layers:
    # selecting layer by name
    if layer.name == 'block8_10_conv':
        layer.trainable = False
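
Since the question asks to freeze the earlier layers and fine-tune only the higher-level ones, the same loop can be turned into a cutoff: everything before a chosen layer is frozen, and everything from that layer on stays trainable. A minimal sketch, assuming 'block8_10_conv' is the chosen boundary (any layer name from the summary works):

# freeze all layers before the cutoff; fine-tune the cutoff layer and everything after it
set_trainable = False
for layer in inception_layer.layers:
    if layer.name == 'block8_10_conv':
        set_trainable = True
    layer.trainable = set_trainable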

Ronakrit W.