
I am a new user of Python and Keras, and I hope your feedback can guide me to a solution.

I would like to concatenate several recurrent layers to train a multi-input neural network. More specifically, I would like to create multiple branches in a loop and then merge all of them into a single model.

In order to do that, I create a dictionary of training sets (in this case, two training sets):

 trainX_dict[0].shape
 (11,21,6)

 trainX_dict[1].shape
 (11,21,6)

 trainY_dict[0].shape
 (11,21)

 trainY_dict[1].shape
 (11,21)

Next, I create the recurrent layers and dense layers as follows:

from keras.models import Sequential
from keras.layers import Dense, SimpleRNN

branch_name = {}
for i in range(2):  # one branch per training set
    branch_name[i] = Sequential()
    branch_name[i].add(SimpleRNN(30, input_shape=(trainX_dict[i].shape[0], trainX_dict[i].shape[2])))
    branch_name[i].add(Dense(21))

where:

 branch_name = [<keras.models.Sequential object at 0x7f287421c9e8>, <keras.models.Sequential object at 0x7f2868e77940>]

Now, I would like to merge the two branches into a single model. According to this answer, How to concatenate two layers in keras?, the following should work:

branch_name[i].add(SimpleRNN(21, input_shape=(trainX_dict[i].shape[0],trainX_dict[i].shape[2])))
branch_name[i].add(Dense(21))
merged=Concatenate(branch_name) 
final_model=Sequential()
final_model.add(Dense(21,input_shape=(trainX_dict[i].shape[0],trainX_dict[i].shape[2])))
merge2 = Concatenate([merged, final_model])
final_model.compile(optimizer='Adam', loss='binary_crossentropy', metrics=['accuracy'])
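For reference, here is a minimal sketch of what I am trying to automate, using the legacy Sequential Merge layer with mode='concat' (the two-branch version of this already works for me when I type the branches out by hand); the layer sizes and the training call are only illustrative:

from keras.models import Sequential
from keras.layers import Dense, Merge

n_branches = 2  # one branch per training set

# branch_name[0] and branch_name[1] are the Sequential branches built in the loop above
branch_list = [branch_name[i] for i in range(n_branches)]

final_model = Sequential()
final_model.add(Merge(branch_list, mode='concat'))  # concatenate the branch outputs
final_model.add(Dense(21))
final_model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])

# Training would then take one input array per branch, e.g.:
# final_model.fit([trainX_dict[0], trainX_dict[1]], trainY_dict[0])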
  • It looks like you are tackling multiple challenges at the same time. Why not start with two simple models, solving one challenge and then build on that? To merge two (sequential) models use "merge". The Keras documentation provides a basic example. Then take it from there. – jdelange Mar 07 '17 at 17:38
  • I understand your point and I agree this is something I should improve. However, please note that this code already works if I type the two layers: final_model.add(Merge([layer_0,layer_1], mode='concat')). So I am already able to merge two sequential models using "merge". The example was in fact taken from the Keras documentation: [link](https://keras.io/getting-started/sequential-model-guide/) – seli Mar 07 '17 at 18:07
  • Your subject title does not match the multiple questions you are asking. It says you need help to merge dense layers. – jdelange Mar 07 '17 at 18:27
  • Right. I am interested in how to combine multiple inputs in Keras. This post [link](http://datascience.stackexchange.com/questions/13428/what-is-the-significance-of-model-merging-in-keras) says: "[Merge] is used to join multiple networks together." Here [link](http://www.picnet.com.au/blogs/guido/post/2016/05/16/review-of-keras-deep-learning-core-layers/) it says that it merges the output of multiple layers. Based on this, the appropriate question could be "How to automatically combine the output of multiple sequential layers into one in Keras". I will check the documentation. Thanks! – seli Mar 07 '17 at 20:00
  • I don't get it, why don't you simply create your layer list with a for loop and then pass it to merge? Something like l=[] ; for i in range(10): l.append(Dense(10)) and then Merge(l,mode='concat') ? – maz Mar 08 '17 at 04:54
  • Many thanks for your suggestion. It made me think about my overall problem. I am starting to recognize each specific problem I have. I will update the question accordingly. – seli Mar 10 '17 at 10:50

0 Answers