
My model is a simple fully connected network like this:

inp=Input(shape=(10,))
d=Dense(64, activation='relu')(inp)
d=Dense(128,activation='relu')(d)
d=Dense(256,activation='relu')(d)     #want to give input here, layer3
d=Dense(512,activation='relu')(d)
d=Dense(1024,activation='relu')(d)
d=Dense(128,activation='linear')(d)

So, after saving the model I want to give input to layer 3. What I am doing right now is this:

model=load_model('blah.h5')    #above described network
print(temp_input.shape)        #(16,256), which is equal to what I want to give

index=3
intermediate_layer_model = Model(inputs=temp_input,
                                 outputs=model.output)

End_output = intermediate_layer_model.predict(temp_input)

But it isn't working, i.e. I am getting errors such as "incompatible input" and "inputs should be a tuple". The error message is:

raise TypeError('`inputs` should be a list or tuple.') 
TypeError: `inputs` should be a list or tuple.

Is there any way I can pass my own inputs in middle of network and get the output instead of giving an input at the start and getting output from the end? Any help will be highly appreciated.

today
Asim
  • Please include the error messages about incompatible input. – Dr. Snoopy Oct 14 '18 at 06:28
  • I have edited the question now, kindly tell me a way around this problem. – Asim Oct 14 '18 at 06:40
  • @Asim The question title and the description are different things: you mentioned in the title that you want to give input to an intermediate layer and get the output of **the model**, whereas in the question description you are trying to get the output of an **intermediate layer** of the model. Decide which one is desired and then please edit your question accordingly. – today Oct 14 '18 at 09:42
  • I was doing it wrongly; I have edited the question. I want to give input to an intermediate layer and get the output from the end. – Asim Oct 14 '18 at 11:30

4 Answers


First, you must understand that in Keras, when you apply a layer to an input, a new node is created inside that layer which connects the input and output tensors. Each layer may have multiple nodes connecting different input tensors to their corresponding output tensors. To build a model, these nodes are traversed and a new graph of the model is created, consisting of all the nodes needed to reach the output tensors from the input tensors (i.e. the ones you specify when creating a model: model = Model(inputs=[...], outputs=[...])).

Now you would like to feed an intermediate layer of a model and get the output of the model. Since this is a new data-flow path, we need to create new nodes for each layer corresponding to this new computational graph. We can do it like this:

idx = 3  # index of desired layer
input_shape = model.layers[idx].get_input_shape_at(0)[1:]  # input shape of the desired layer, without the batch dimension
layer_input = Input(shape=input_shape)  # a new input tensor to be able to feed the desired layer

# create the new nodes for each layer in the path
x = layer_input
for layer in model.layers[idx:]:
    x = layer(x)

# create the model
new_model = Model(layer_input, x)

Fortunately, your model consists of a single branch, so we could simply use a for loop to construct the new model. However, for more complex models it may not be as easy, and you may need to write more code.
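To make this concrete, here is an end-to-end sketch of the approach. The network from the question is rebuilt inline (as a stand-in for load_model('blah.h5')), and the feeding layer's input size, 128, is read off the architecture rather than computed:

```python
import numpy as np
from tensorflow.keras.layers import Input, Dense
from tensorflow.keras.models import Model

# Rebuild the network from the question (stand-in for load_model('blah.h5'))
inp = Input(shape=(10,))
d = Dense(64, activation='relu')(inp)
d = Dense(128, activation='relu')(d)
d = Dense(256, activation='relu')(d)   # layer 3, the one we want to feed
d = Dense(512, activation='relu')(d)
d = Dense(1024, activation='relu')(d)
d = Dense(128, activation='linear')(d)
model = Model(inp, d)

idx = 3  # model.layers[3] is the Dense(256) layer
layer_input = Input(shape=(128,))  # Dense(256) receives 128-dim vectors

# Re-apply every layer from idx onward to the new input,
# creating new nodes for the new data-flow path
x = layer_input
for layer in model.layers[idx:]:
    x = layer(x)
new_model = Model(layer_input, x)

temp_input = np.random.rand(16, 128).astype('float32')
out = new_model.predict(temp_input)
print(out.shape)  # (16, 128)
```

Since the new model reuses the original layer objects, it shares their weights; no copying or reloading is needed.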

today

Here is another method for achieving the same result. First create a new input layer, and then connect it to the lower layers (with their trained weights).

For this purpose, re-initialize these layers (with the same names) and reload the corresponding weights from the parent model using

new_model.load_weights("parent_model.hdf5", by_name=True)

This will load the required weights from the parent model. Just make sure you name your layers properly beforehand.

idx = 3
input_shape = model.layers[idx].get_input_shape_at(0)[1:]  # input shape of the desired layer, without the batch dimension

new_input = Input(shape=input_shape)

d=Dense(256,activation='relu', name='layer_3')(new_input)
d=Dense(512,activation='relu', name='layer_4')(d)
d=Dense(1024,activation='relu', name='layer_5')(d)
d=Dense(128,activation='linear', name='layer_6')(d)

new_model = Model(new_input, d)
new_model.load_weights("parent_model.hdf5", by_name=True)

This method will work for complex models with multiple inputs or branches. You just need to copy the same code for the required layers, connect the new input, and finally load the corresponding weights.
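Putting the pieces together, here is a minimal sketch of the whole round trip. The file name parent_model.hdf5 and the layer names are illustrative, and by_name loading assumes the tf.keras 2.x HDF5 weights format:

```python
import numpy as np
from tensorflow.keras.layers import Input, Dense
from tensorflow.keras.models import Model

# Parent model with explicitly named layers
inp = Input(shape=(10,))
d = Dense(64, activation='relu', name='layer_1')(inp)
d = Dense(128, activation='relu', name='layer_2')(d)
d = Dense(256, activation='relu', name='layer_3')(d)
d = Dense(512, activation='relu', name='layer_4')(d)
d = Dense(1024, activation='relu', name='layer_5')(d)
d = Dense(128, activation='linear', name='layer_6')(d)
model = Model(inp, d)
model.save_weights('parent_model.hdf5')

# New model starting at layer_3; names must match the parent's
new_input = Input(shape=(128,))  # layer_3 receives 128-dim vectors
d = Dense(256, activation='relu', name='layer_3')(new_input)
d = Dense(512, activation='relu', name='layer_4')(d)
d = Dense(1024, activation='relu', name='layer_5')(d)
d = Dense(128, activation='linear', name='layer_6')(d)
new_model = Model(new_input, d)
new_model.load_weights('parent_model.hdf5', by_name=True)

# The re-created layers now carry the parent's weights
w_parent = model.get_layer('layer_3').get_weights()[0]
w_child = new_model.get_layer('layer_3').get_weights()[0]
print(np.allclose(w_parent, w_child))  # True
```

Unlike the previous answer, the layers here are fresh objects, so the weight transfer happens entirely through the named HDF5 file.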

anilsathyan7

You can easily use keras.backend.function for this purpose:

import numpy as np
from tensorflow.keras.layers import Input, Dense
from tensorflow.keras.models import Model
from tensorflow.keras import backend as K

inp=Input(shape=(10,))
d=Dense(64, activation='relu')(inp)
d=Dense(128,activation='relu')(d)
d=Dense(256,activation='relu')(d)     #want to give input here, layer3
d=Dense(512,activation='relu')(d)
d=Dense(1024,activation='relu')(d)
d=Dense(128,activation='linear')(d)

model = Model(inp, d)


foo1 = K.function(
    [inp],
    model.layers[2].output
)

foo2 = K.function(
    [model.layers[2].output],
    model.output
)


X = np.random.rand(1, 10)
X_intermediate = foo1([X])
print(np.allclose(foo2([X_intermediate]), model.predict(X)))

Sorry for the ugly function naming; choose better names in real code. Note that K.function operates on the backend computation graph, so this approach may not work in TensorFlow 2 unless eager execution is disabled.


I was having the same problem and the proposed solutions worked for me, but I was looking for something more explicit, so here it is for future reference:

import tensorflow as tf
from tensorflow.keras.layers import Input, Dense

d1 = Dense(64, activation='relu')
d2 = Dense(128,activation='relu')
d3 = Dense(256,activation='relu')
d4 = Dense(512,activation='relu')
d5 = Dense(1024,activation='relu')
d6 = Dense(128,activation='linear')

inp = Input(shape=(10,))

x = d1(inp)
x = d2(x)
x = d3(x)
x = d4(x)
x = d5(x)
x = d6(x)

full_model = tf.keras.Model(inp, x)
full_model.summary()

intermediate_input = Input(shape=d3.get_input_shape_at(0)[1:]) # get shape at node 0, without the batch dimension
x = d3(intermediate_input)
x = d4(x)
x = d5(x)
x = d6(x)
partial_model = tf.keras.Model(intermediate_input, x)
partial_model.summary()

Reference: https://keras.io/guides/functional_api/#shared-layers
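To double-check that the shared layers really carry over, you can compute d3's input with a small helper model (head_model below is introduced here just for the check) and confirm that the partial model reproduces the full model's output:

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras.layers import Input, Dense

# Shared layer objects, as in the answer above
d1 = Dense(64, activation='relu')
d2 = Dense(128, activation='relu')
d3 = Dense(256, activation='relu')
d4 = Dense(512, activation='relu')
d5 = Dense(1024, activation='relu')
d6 = Dense(128, activation='linear')

inp = Input(shape=(10,))
x = d1(inp)
h = d2(x)          # the activation that feeds d3
x = d3(h)
x = d4(x)
x = d5(x)
x = d6(x)
full_model = tf.keras.Model(inp, x)
head_model = tf.keras.Model(inp, h)  # computes d3's input for a given X

# Partial model reusing the same layer objects (and hence their weights)
intermediate_input = Input(shape=(128,))  # d3 receives 128-dim vectors
x = d3(intermediate_input)
x = d4(x)
x = d5(x)
x = d6(x)
partial_model = tf.keras.Model(intermediate_input, x)

# Feeding d3's input into the partial model reproduces the full model's output
X = np.random.rand(4, 10).astype('float32')
full_out = full_model.predict(X)
partial_out = partial_model.predict(head_model.predict(X))
print(np.allclose(full_out, partial_out, atol=1e-5))  # True
```

Because both models run the identical layer objects on the same tensors, the outputs agree to within floating-point tolerance.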

Mo_