I am new to Python and Keras, and I hope your feedback can guide me to a solution.
I would like to concatenate several recurrent layers to train a multi-input neural network. More specifically, I would like to iteratively create multiple branches and merge all of them into a single layer.
To do that, I create a dictionary of training sets (in this case, two training sets):
trainX_dict[0].shape
(11,21,6)
trainX_dict[1].shape
(11,21,6)
trainY_dict[0].shape
(11,21)
trainY_dict[1].shape
(11,21)
Next, I create the recurrent and dense layers as follows:
branch_name = {}
for i in range(2):
    branch_name[i] = Sequential()
    # input_shape is (timesteps, features), i.e. shape[1] and shape[2] of the data
    branch_name[i].add(SimpleRNN(30, input_shape=(trainX_dict[i].shape[1], trainX_dict[i].shape[2])))
    branch_name[i].add(Dense(21))
where branch_name is a dictionary holding the two models:
{0: <keras.models.Sequential object at 0x7f287421c9e8>, 1: <keras.models.Sequential object at 0x7f2868e77940>}
Now, I would like to merge the two branches into a single layer. According to this answer, How to concatenate two layers in keras?, something like the following should work:
branch_name[i].add(SimpleRNN(21, input_shape=(trainX_dict[i].shape[1], trainX_dict[i].shape[2])))
branch_name[i].add(Dense(21))
# Concatenate is a layer: instantiate it, then call it on a list of tensors
merged = Concatenate()([branch_name[0].output, branch_name[1].output])
final_model = Sequential()
final_model.add(Dense(21, input_shape=(trainX_dict[i].shape[1], trainX_dict[i].shape[2])))
merge2 = Concatenate()([merged, final_model.output])
final_model.compile(optimizer='Adam', loss='binary_crossentropy', metrics=['accuracy'])
However, I am not sure this is the right way to combine Sequential models, and I would appreciate guidance on how to merge the branches correctly.
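If it helps, here is my sketch of what I think the equivalent model would look like using the functional API instead of Sequential (the Input shapes follow my data, 21 timesteps and 6 features; the final sigmoid Dense and the 'adam' optimizer are my own assumptions, not from the linked answer):

```python
import numpy as np
from tensorflow.keras.layers import Input, SimpleRNN, Dense, Concatenate
from tensorflow.keras.models import Model

inputs, branches = [], []
for i in range(2):
    inp = Input(shape=(21, 6))   # (timesteps, features)
    x = SimpleRNN(30)(inp)       # -> (batch, 30)
    x = Dense(21)(x)             # -> (batch, 21)
    inputs.append(inp)
    branches.append(x)

# Merge the two branch outputs along the feature axis: (batch, 42)
merged = Concatenate()(branches)
# Map back to one prediction per timestep label: (batch, 21)
out = Dense(21, activation='sigmoid')(merged)

model = Model(inputs=inputs, outputs=out)
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
```

With this layout, model.fit would take the list [trainX_dict[0], trainX_dict[1]] as input, but I am not sure whether this is the correct translation of my two Sequential branches.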