
I can't get keras.backend.function to work properly. I'm trying to follow this post:

How to calculate prediction uncertainty using Keras?

In this post they create a function f:

f = K.function([model.layers[0].input], [model.layers[-1].output])  # (I actually simplified the function a little bit.)

In my neural network I have 3 inputs. When I try to compute f([[3], [23], [0.0]]) I get this error:

InvalidArgumentError: You must feed a value for placeholder tensor 'input_3' with dtype float and shape [?,1]
 [[{{node input_3}} = Placeholder[dtype=DT_FLOAT, shape=[?,1], _device="/job:localhost/replica:0/task:0/device:CPU:0"]

Now I know using [[3], [23], [0.0]] as an input in my model doesn't give me an error during the testing phase. Can anyone tell me where I'm going wrong?

This is what my model looks like if it matters:

home_in = Input(shape=(1,))
away_in = Input(shape=(1,))
time_in = Input(shape = (1,))
embed_home = Embedding(input_dim = in_dim, output_dim = out_dim, input_length = 1)
embed_away = Embedding(input_dim = in_dim, output_dim = out_dim, input_length = 1)
embedding_home = Flatten()(embed_home(home_in))
embedding_away = Flatten()(embed_away(away_in))
keras.backend.set_learning_phase(1) #this will keep dropout on during the testing phase
x = concatenate([embedding_home, embedding_away, time_in])
x = Dense(units=64, activation="relu")(x)
x = Dropout(0.3)(x)
x = Dense(units=64, activation="relu")(x)
x = Dropout(0.3)(x)
x = Dense(units=64, activation="relu")(x)
x = Dropout(0.3)(x)
model_layers = Dense(units=2)(x)
model = Model(inputs=[home_in, away_in, time_in], outputs=model_layers)
  • Thanks for your updated response @today, my code works now. I would also like to mention that I switched from keras to the Keras library within TensorFlow (`tf.keras`) and my code worked immediately without any errors. – ablanch5 Dec 17 '18 at 14:04

1 Answer


The function you have defined uses only one of the input layers (i.e. `model.layers[0].input`) as its input. Instead, it must take all of the inputs so that the model can be run. The model has `inputs` and `outputs` attributes which you can use to include all the inputs and outputs with less verbosity:

f = K.function(model.inputs, model.outputs)

Update: Each of the input arrays must have shape (num_samples, 1). Therefore, for each input you need to pass a list of lists (e.g. [[3]]) instead of a flat list (e.g. [3]):

outs = f([[[3]], [[23]], [[0.0]]])
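The shape requirement can be checked with plain NumPy, independent of Keras: each of the three arrays in the list fed to `f` is a single-sample column of shape `(1, 1)`.

```python
import numpy as np

# Raw scalar inputs from the question, one per input head.
home, away, time_val = 3, 23, 0.0

# Each array fed to the backend function must have shape
# (num_samples, 1), so wrap each scalar as its own row:
# [[3]], [[23]], [[0.0]].
feed = [np.array([[v]], dtype="float32") for v in (home, away, time_val)]

print([a.shape for a in feed])  # [(1, 1), (1, 1), (1, 1)]
```

Passing flat lists like `[3]` instead yields rank-1 tensors, which is what triggers the `ConcatOp : Ranks of all input tensors should match` error from the comment below.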
today
  • Thanks for your response! Unfortunately now I get the error: `InvalidArgumentError: ConcatOp : Ranks of all input tensors should match: shape[0] = [1,16] vs. shape[2] = [1] [[{{node concatenate_1/concat}} = ConcatV2[N=3, T=DT_FLOAT, Tidx=DT_INT32, _device="/job:localhost/replica:0/task:0/device:CPU:0"](flatten_1/Reshape, flatten_2/Reshape, _arg_input_3_0_2, concatenate_1/concat/axis)]]` – ablanch5 Dec 13 '18 at 03:23