
I combined two VGG nets in Keras for a classification task. When I run the program, it shows an error:

RuntimeError: The name "predictions" is used 2 times in the model. All layer names should be unique.

I was confused, because I use the prediction layer only once in my code:

from keras.layers import Dense
import keras
from keras.models import Model
model1 = keras.applications.vgg16.VGG16(include_top=True, weights='imagenet',
                                input_tensor=None, input_shape=None,
                                pooling=None,
                                classes=1000)
model1.layers.pop()

model2 = keras.applications.vgg16.VGG16(include_top=True, weights='imagenet',
                                input_tensor=None, input_shape=None,
                                pooling=None,
                                classes=1000)
model2.layers.pop()
for layer in model2.layers:
    layer.name = layer.name + str("two")
model1.summary()
model2.summary()
featureLayer1 = model1.output
featureLayer2 = model2.output
combineFeatureLayer = keras.layers.concatenate([featureLayer1, featureLayer2])
prediction = Dense(1, activation='sigmoid', name='main_output')(combineFeatureLayer)

model = Model(inputs=[model1.input, model2.input], outputs= prediction)
model.summary()

Update: Thanks to @putonspectacles' help, I followed his instructions and found an interesting detail. If you use `model2.layers.pop()` and then combine the last layers of the two models with `keras.layers.concatenate([model1.output, model2.output])`, you will find that the popped layer is still shown by `model.summary()`, even though it no longer exists in the structure. So instead, use `keras.layers.concatenate([model1.layers[-1].output, model2.layers[-1].output])`. It looks tricky, but it works. I think it is a synchronization problem between the log and the structure.

stop-cran
dashenswen

4 Answers


First, based on the code you posted, you have no layer with the name attribute 'predictions', so this error has nothing to do with your Dense layer `prediction`, i.e.:

prediction = Dense(1, activation='sigmoid', 
             name='main_output')(combineFeatureLayer)

The VGG16 model itself has a Dense layer named `predictions`. In particular, this line:

x = Dense(classes, activation='softmax', name='predictions')(x)

And since you're using two of these models you have layers with duplicate names.
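The uniqueness check that triggers this error can be illustrated with a small pure-Python stand-in (a sketch of the idea only; `check_unique_layer_names` is a hypothetical helper, not the actual Keras source):

```python
def check_unique_layer_names(layer_names):
    """Raise, as Keras does, when two layers in one model share a name."""
    counts = {}
    for name in layer_names:
        counts[name] = counts.get(name, 0) + 1
    for name, count in counts.items():
        if count > 1:
            raise RuntimeError(
                'The name "%s" is used %d times in the model. '
                "All layer names should be unique." % (name, count)
            )

# Two stock VGG16 copies each contribute a layer named "predictions":
try:
    check_unique_layer_names(["fc1", "fc2", "predictions", "fc1_two", "predictions"])
except RuntimeError as err:
    print(err)  # the duplicate name is reported
```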

What you could do is rename the layer in the second model to something other than predictions, maybe predictions_1, like so:

model2 = keras.applications.vgg16.VGG16(include_top=True, weights='imagenet',
                                input_tensor=None, input_shape=None,
                                pooling=None,
                                classes=1000)

# now change the name of the layer inplace.
model2.get_layer(name='predictions').name='predictions_1'
parsethis
  • Thank you for the explanation. I printed all the layers of both model1 and model2, but there is no layer named "predictions". I think you mean the last layer of VGG, but I have already popped the original last layer. Could you add an explanation for this? – dashenswen Apr 17 '17 at 13:59
  • if you're using the latest keras this link references the layer named 'predictions' https://github.com/fchollet/keras/blob/73bf06fb023a8b37ddf2e2a168bbf920c7a6c766/keras/applications/vgg16.py#L143 – parsethis Apr 17 '17 at 14:06
  • and based on the error you definitely have layers named predictions. Did you try my suggestion? – parsethis Apr 17 '17 at 14:07
  • Hi, your suggestion is right, but I popped the last layer before combining the two VGGs. I just updated the solution and edited my question; you will see it there. – dashenswen Apr 17 '17 at 14:14
  • AttributeError: Can't set the attribute "name", likely because it conflicts with an existing read-only @property of the object. Please choose a different name. – Geoffrey Anderson Feb 13 '20 at 20:42

You can change a layer's name in `keras`; don't use `tensorflow.python.keras`.

Here is my sample code:

from keras.layers import Dense, concatenate
from keras.models import Model
from keras.applications import vgg16

num_classes = 10

model = vgg16.VGG16(include_top=False, weights='imagenet', input_tensor=None, input_shape=(64,64,3), pooling='avg')
inp = model.input
out = model.output

model2 = vgg16.VGG16(include_top=False,weights='imagenet', input_tensor=None, input_shape=(64,64,3), pooling='avg')

for layer in model2.layers:
    layer.name = layer.name + str("_2")

inp2 = model2.input
out2 = model2.output

merged = concatenate([out, out2])
merged = Dense(1024, activation='relu')(merged)
merged = Dense(num_classes, activation='softmax')(merged)

model_fusion = Model([inp, inp2], merged)
model_fusion.summary()
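The renaming loop above boils down to suffixing every layer name so the two VGG16 copies no longer collide. A tiny pure-Python sketch of that invariant (`suffix_names` is a hypothetical helper, not part of Keras):

```python
def suffix_names(names, suffix="_2"):
    """Append a suffix to every layer name so a second model copy can be merged."""
    return [name + suffix for name in names]

# A few representative VGG16 layer names:
vgg_names = ["block1_conv1", "block5_conv3", "fc1", "predictions"]
renamed = suffix_names(vgg_names)

# No name collides between the original model and the renamed copy.
assert set(vgg_names).isdisjoint(renamed)
```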
  • I created a model like this, but when I add `layer.name = layer.name + str("_2")`, the performance of the second model changes. I don't know why. – N.IT Nov 04 '18 at 09:17
  • Confirmed that this approach fails for TF 2.0 when doing `model.get_config()` because the commensurate layer names aren't renamed in `model._network_nodes`. – Nicholas Leonard Dec 18 '19 at 20:58
  • @NicholasLeonard What if you use this method? https://stackoverflow.com/a/57794744/2794625 – Mehrshad Zandigohar Sep 03 '21 at 19:59

Example:

from keras.applications import MobileNet
from keras.layers import Input, MaxPooling2D, add
from keras.models import Model
from keras.optimizers import Adadelta

# Network for affine transform estimation
affine_transform_estimator = MobileNet(
                            input_tensor=None,
                            input_shape=(config.IMAGE_H // 2, config.IMAGE_W //2, config.N_CHANNELS),
                            alpha=1.0,
                            depth_multiplier=1,
                            include_top=False,
                            weights='imagenet'
                            )
affine_transform_estimator.name = 'affine_transform_estimator'
for layer in affine_transform_estimator.layers:
    layer.name = layer.name + str("_1")

# Network for landmarks regression
landmarks_regressor = MobileNet(
                        input_tensor=None,
                        input_shape=(config.IMAGE_H // 2, config.IMAGE_W // 2, config.N_CHANNELS),
                        alpha=1.0,
                        depth_multiplier=1,
                        include_top=False,
                        weights='imagenet'
                        )
landmarks_regressor.name = 'landmarks_regressor'
for layer in landmarks_regressor.layers:
    layer.name = layer.name + str("_2")

input_image = Input(shape=(config.IMAGE_H, config.IMAGE_W, config.N_CHANNELS))
downsampled_image = MaxPooling2D(pool_size=(2,2))(input_image)
x1 = affine_transform_estimator(downsampled_image)
x2 = landmarks_regressor(downsampled_image)
x3 = add([x1,x2])

model = Model(inputs=input_image, outputs=x3)
optimizer = Adadelta()
model.compile(optimizer=optimizer, loss=mae_loss_masked)
mrgloom

You can use `layer._name` instead of `layer.name`; this worked for me. (In recent tf.keras versions, `name` is a read-only property, but it is backed by the private `_name` attribute.)
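Why the underscore attribute works can be seen with a minimal stand-in for how tf.keras exposes `name` as a read-only property over `_name` (a pure-Python sketch; the real `Layer` class is far more involved):

```python
class Layer:
    """Minimal stand-in: `name` is a read-only property backed by `_name`."""

    def __init__(self, name):
        self._name = name

    @property
    def name(self):
        return self._name


layer = Layer("predictions")
try:
    layer.name = "predictions_2"  # read-only property: assignment raises
except AttributeError:
    pass
layer._name = "predictions_2"     # bypassing the property succeeds
```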

    As it’s currently written, your answer is unclear. Please [edit] to add additional details that will help others understand how this addresses the question asked. You can find more information on how to write good answers [in the help center](/help/how-to-answer). – Community Jun 07 '23 at 15:03