
I am trying to merge two Sequential models in Keras 2.0, using the following line:

merged_model.add(Merge([model1, model2], mode='concat'))

This still works fine, but gives a warning:

"The `Merge` layer is deprecated and will be removed after 08/2017. Use
instead layers from `keras.layers.merge`, e.g. `add`, `concatenate`, etc." 

However, studying the Keras documentation and trying `add` and `Add()` has not resulted in anything that works. I have read several posts from people with the same problem, but found no solution that works in my case below. Any suggestions?

model1 = Sequential()
model1.add(Dense(300, input_dim=40, activation='relu', name='layer_1'))
model2 = Sequential()
model2.add(Dense(300, input_dim=40, activation='relu', name='layer_2'))
merged_model = Sequential()

merged_model.add(Merge([model1, model2], mode='concat'))

merged_model.add(Dense(1, activation='softmax', name='output_layer'))
merged_model.compile(loss='binary_crossentropy', optimizer='adam',
                     metrics=['accuracy'])

checkpoint = ModelCheckpoint('weights.h5', monitor='val_acc',
                             save_best_only=True, verbose=2)
early_stopping = EarlyStopping(monitor="val_loss", patience=5)

merged_model.fit([x1, x2], y=y, batch_size=384, epochs=200,
                 verbose=1, validation_split=0.1, shuffle=True,
                 callbacks=[early_stopping, checkpoint])

EDIT: When I tried (as suggested below by Kent Sommer):

from keras.layers.merge import concatenate
merged_model.add(concatenate([model1, model2]))

This was the error message:

Traceback (most recent call last):
  File "/anaconda/lib/python3.6/site-packages/keras/engine/topology.py", line 425, in assert_input_compatibility
    K.is_keras_tensor(x)
  File "/anaconda/lib/python3.6/site-packages/keras/backend/tensorflow_backend.py", line 403, in is_keras_tensor
    raise ValueError('Unexpectedly found an instance of type `' + str(type(x)) + '`. '
ValueError: Unexpectedly found an instance of type `<class 'keras.models.Sequential'>`. Expected a symbolic tensor instance.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "quoradeeptest_simple1.py", line 78, in <module>
    merged_model.add(concatenate([model1, model2]))
  File "/anaconda/lib/python3.6/site-packages/keras/layers/merge.py", line 600, in concatenate
    return Concatenate(axis=axis, **kwargs)(inputs)
  File "/anaconda/lib/python3.6/site-packages/keras/engine/topology.py", line 558, in __call__
    self.assert_input_compatibility(inputs)
  File "/anaconda/lib/python3.6/site-packages/keras/engine/topology.py", line 431, in assert_input_compatibility
    str(inputs) + '. All inputs to the layer '
ValueError: Layer concatenate_1 was called with an input that isn't a
symbolic tensor. Received type: <class 'keras.models.Sequential'>.
Full input: [<keras.models.Sequential object at 0x140fa7ba8>,
<keras.models.Sequential object at 0x140fabdd8>]. All inputs to the
layer should be tensors.
twhale
    Possible duplicate of [How to merge keras sequential models with same input?](https://stackoverflow.com/questions/45930844/how-to-merge-keras-sequential-models-with-same-input) – Wilmar van Ommeren Sep 25 '17 at 09:13

3 Answers


What that warning is saying is that instead of a single Merge layer with a mode argument, each mode has now been split out into its own individual layer.

So Merge(mode='concat') is now concatenate(axis=-1).
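For example, the replacement layers operate on Keras tensors rather than on whole models (a minimal sketch, assuming Keras 2.x; the tensors a and b below are just for illustration):

from keras.layers import Input, add, concatenate

a = Input(shape=(16,))
b = Input(shape=(16,))
summed = add([a, b])          # element-wise sum, replaces Merge(mode='sum')
joined = concatenate([a, b])  # replaces Merge(mode='concat'); joins along axis=-1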

However, since you want to merge models not layers, this will not work in your case. What you will need to do is use the functional model since this behavior is no longer supported with the basic Sequential model type.

In your case that means the code should be changed to the following:

from keras.layers.merge import concatenate
from keras.models import Model
from keras.layers import Dense, Input
from keras.callbacks import ModelCheckpoint, EarlyStopping

model1_in = Input(shape=(27, 27, 1))
model1_out = Dense(300, input_dim=40, activation='relu', name='layer_1')(model1_in)
model1 = Model(model1_in, model1_out)

model2_in = Input(shape=(27, 27, 1))
model2_out = Dense(300, input_dim=40, activation='relu', name='layer_2')(model2_in)
model2 = Model(model2_in, model2_out)


concatenated = concatenate([model1_out, model2_out])
out = Dense(1, activation='softmax', name='output_layer')(concatenated)

merged_model = Model([model1_in, model2_in], out)
merged_model.compile(loss='binary_crossentropy', optimizer='adam',
                     metrics=['accuracy'])

checkpoint = ModelCheckpoint('weights.h5', monitor='val_acc',
                             save_best_only=True, verbose=2)
early_stopping = EarlyStopping(monitor="val_loss", patience=5)

merged_model.fit([x1, x2], y=y, batch_size=384, epochs=200,
                 verbose=1, validation_split=0.1, shuffle=True,
                 callbacks=[early_stopping, checkpoint])
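
As a quick sanity check (not part of the original answer), you can print the merged graph and confirm that both inputs feed into the concatenate layer:

merged_model.summary()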
Kent Sommer

Try this demo, tested with keras==2.2.4 and tensorflow==1.13.1:

from keras import Sequential, Model
from keras.layers import Embedding, GlobalAveragePooling1D, Dense, concatenate
import numpy as np

model1 = Sequential()
model1.add(Embedding(20, 10, trainable=True))
model1.add(GlobalAveragePooling1D())
model1.add(Dense(1, activation='sigmoid'))
model2 = Sequential()
model2.add(Embedding(20, 10, trainable=True))
model2.add(GlobalAveragePooling1D())
model2.add(Dense(1, activation='sigmoid'))

model_concat = concatenate([model1.output, model2.output], axis=-1)
model_concat = Dense(1, activation='softmax')(model_concat)
model = Model(inputs=[model1.input, model2.input], outputs=model_concat)

model.compile(loss='binary_crossentropy', optimizer='adam')

X_train_1 = np.random.randint(0, 20, (10000, 256))
X_train_2 = np.random.randint(0, 20, (10000, 256))
Y_train = np.random.randint(0, 2, 10000)

model.fit([X_train_1, X_train_2], Y_train, batch_size=1000, epochs=200,
          verbose=True)
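
Inference then works the same way: pass a list with one array per input. A hypothetical usage sketch (the X_new_* names are made up here):

X_new_1 = np.random.randint(0, 20, (5, 256))
X_new_2 = np.random.randint(0, 20, (5, 256))
preds = model.predict([X_new_1, X_new_2])  # shape (5, 1)
print(preds)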
James
    Thanks! I went through many solutions on stackoverflow, but this one worked for me. Starting off with an already working script made it easier to modify it to my situation. The best part about this demo is that it doesn't need me to install or download another dataset. – camelCase Oct 20 '19 at 23:31
  • When feeding multiple inputs to model.fit I face the following error. I was wondering if you also came across it: ValueError: Failed to find data adapter that can handle input: ( containing values of types {""}), – prb_cm Feb 21 '21 at 09:41
  • https://stackoverflow.com/questions/68330534/merge-multiple-models-in-keras-tensorflow?noredirect=1#comment120763705_68330534 can you help me with this? – Coder Jul 10 '21 at 21:14

Unless you have a good reason to keep the models separated, you can (and should) have the same topology in a single model. Something like:

from keras.layers import Input, Dense, concatenate
from keras.models import Model

input1 = Input(shape=(27, 27, 1))
dense1 = Dense(300, activation='relu', name='layer_1')(input1)
input2 = Input(shape=(27, 27, 1))
dense2 = Dense(300, activation='relu', name='layer_2')(input2)
merged = concatenate([dense1, dense2])
out = Dense(1, activation='softmax', name='output_layer')(merged)
model = Model(inputs=[input1, input2], outputs=[out])
ilan
  • Thank you. I tried it and got: ValueError: Error when checking input: expected input_1 to have 4 dimensions, but got array with shape (100, 40). Same as with Kent's suggestion. Have not figured out yet what I am doing wrong. – twhale Sep 25 '17 at 17:27
  • I changed the input shape to model1_in = Input(shape=(40,)) and now it works, thank you! – twhale Sep 26 '17 at 01:49