I am trying to freeze a transferred (pre-trained) model, clone it, remove its last layer, add a new layer, and train the result on my data, but an error keeps emerging. This is my function:

def transfered_model_prep(new_model):
  new_model.trainable=False
  cloned_model=keras.models.clone_model(new_model)
  cloned_model.set_weights(new_model.get_weights())
  cloned_model.trainable=False
  cloned_model1 = keras.models.Sequential(cloned_model.layers[:-1])
  
  model2=cloned_model1.add((keras.layers.Dense(1, activation='sigmoid')(cloned_model.layers[0])))
  
  model2.compile(loss="binary_crossentropy",
                     optimizer="adam",
                     metrics=["accuracy"])
  return model2

and this error keeps popping up when I call the function:

ValueError: Exception encountered when calling layer "add" (type Add).

A merge layer should be called on a list of inputs. Received: inputs=Tensor("Placeholder:0", shape=(None, 32, 32, 128), dtype=float32) (not a list of tensors)

Call arguments received:
  • inputs=tf.Tensor(shape=(None, 32, 32, 128), dtype=float32)
  • You are mixing the use of Functional and Sequential APIs in a way that makes no sense. – Dr. Snoopy Jul 17 '22 at 22:43
  • 2
    Particularly this line makes no sense: cloned_model1.add((keras.layers.Dense(1, activation='sigmoid')(cloned_model.layers[0]))) Also note that not all pre-trained models are Sequential, for example ResNet and DenseNet cannot be implemented as Sequential models. – Dr. Snoopy Jul 17 '22 at 22:55
  • @Dr.Snoopy, yes, the line doesn't make sense; it was a desperate attempt to fix the problem and add a layer to the model. Do you have an idea how I can add a layer to the model? – Baraa najjar Jul 18 '22 at 08:26
  • First you should tell us what model is this? – Dr. Snoopy Jul 18 '22 at 08:32
  • @Dr.Snoopy, I know for sure that it's not a famous pre-trained model like Xception or EfficientNet, but looking at its shape, and taking into consideration that there are wide paths, it's a functional network. – Baraa najjar Jul 18 '22 at 08:39
  • Then my second comment applies, you cannot use the Sequential API for this, use the Functional one. – Dr. Snoopy Jul 18 '22 at 08:45
  • @Dr.Snoopy, but how can I add layers to a functional network? I know that you can use a functional network as part of a Sequential network, but thanks anyway for answering. – Baraa najjar Jul 18 '22 at 08:46
  • See the duplicate answer, it is always best to search this site, it is very likely that the question has already been answered. – Dr. Snoopy Jul 18 '22 at 09:08
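
Following the direction of the comments, here is a minimal sketch of the Functional-API approach they suggest (the actual model in the question is unknown, so a generic `base_model` is assumed): instead of building a Sequential model from the cloned layers and calling `.add()`, take the output tensor of the second-to-last layer and wire a new Dense head onto it with `keras.Model`.

```python
import keras

def transferred_model_prep(base_model):
    # Freeze the pre-trained base so only the new head will train.
    base_model.trainable = False

    # Functional API: reuse the graph from the base model's input up to
    # its second-to-last layer, then attach the new classification head.
    x = base_model.layers[-2].output
    outputs = keras.layers.Dense(1, activation="sigmoid")(x)
    model = keras.Model(inputs=base_model.input, outputs=outputs)

    model.compile(loss="binary_crossentropy",
                  optimizer="adam",
                  metrics=["accuracy"])
    return model
```

This works for non-Sequential (branching) architectures, which is exactly why the comments recommend it over `Sequential.add`. If you still want an independent copy of the weights, clone first with `keras.models.clone_model` plus `set_weights`, then apply the same surgery to the clone.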

0 Answers