I would like to remove the first N layers from a pretrained Keras model — for example, an EfficientNetB0, whose first 3 layers are responsible only for preprocessing:
import tensorflow as tf
efinet = tf.keras.applications.EfficientNetB0(weights=None, include_top=True)
print(efinet.layers[:3])
# [<tensorflow.python.keras.engine.input_layer.InputLayer at 0x7fa9a870e4d0>,
# <tensorflow.python.keras.layers.preprocessing.image_preprocessing.Rescaling at 0x7fa9a61343d0>,
# <tensorflow.python.keras.layers.preprocessing.normalization.Normalization at 0x7fa9a60d21d0>]
As M.Innat mentioned, the first layer is an InputLayer, which should either be spared or re-attached. I would like to remove those layers, but a simple approach like this throws an error:
cut_input_model = tf.keras.Model(
    inputs=[efinet.layers[3].input],
    outputs=efinet.outputs,
)
This will result in:
ValueError: Graph disconnected: cannot obtain value for tensor KerasTensor(...)
What would be the recommended way to do this?
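For reference, here is a sketch of one workaround I have been experimenting with (I am not sure it is the canonical approach): instead of pointing tf.keras.Model at an intermediate tensor, re-wire the remaining layers onto a fresh Input. A tensor map keyed by tensor name handles the multi-input layers (the Add/Multiply skip connections inside EfficientNet); the hard-coded skip index of 3 assumes the layer listing above.

```python
import tensorflow as tf

efinet = tf.keras.applications.EfficientNetB0(weights=None, include_top=True)

skip_until = 3  # index of the first layer to keep, per the listing above
new_input = tf.keras.Input(shape=(224, 224, 3))

# Seed the map: wherever the old graph consumed the last dropped layer's
# output, the rebuilt graph consumes new_input instead.
tensor_map = {efinet.layers[skip_until - 1].output.name: new_input}

for layer in efinet.layers[skip_until:]:
    # Read the old input/output tensors *before* calling the layer again;
    # once a layer has two inbound nodes, .input/.output raise an error.
    old_inputs = layer.input if isinstance(layer.input, list) else [layer.input]
    old_output_name = layer.output.name
    # Call the existing layer (weights are shared) on the rebuilt tensors.
    rebuilt = [tensor_map[t.name] for t in old_inputs]
    out = layer(rebuilt[0] if len(rebuilt) == 1 else rebuilt)
    tensor_map[old_output_name] = out

cut_model = tf.keras.Model(new_input, out)
```

This relies on model.layers being in topological order, so every layer's inputs are already in the map when it is reached; it avoids the graph-disconnected error because the new model is built from a genuine Input rather than an intermediate KerasTensor.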