
InvalidArgumentError: Incompatible shapes: [32,50,1] vs. [32,1]
[[{{node training_7/Adam/gradients/loss_8/time_distributed_5_loss/mean_squared_error/SquaredDifference_grad/BroadcastGradientArgs}}]] [Op:__inference_keras_scratch_graph_53470]

I am using Keras to practise. I run into this error after changing the last layer of the model from `Dense` to `TimeDistributed(Dense)`.

This is my code:

import numpy as np
from tensorflow import keras

def generate_time_series(batch_size, n_steps):
    freq1, freq2, offsets1, offsets2 = np.random.rand(4, batch_size, 1)
    time = np.linspace(0, 1, n_steps)
    series = 0.5 * np.sin((time - offsets1) * (freq1 * 10 + 10))  #   wave 1
    series += 0.2 * np.sin((time - offsets2) * (freq2 * 20 + 20)) # + wave 2
    series += 0.1 * (np.random.rand(batch_size, n_steps) - 0.5)   # + noise
    return series[..., np.newaxis].astype(np.float32)

n_steps = 50
series = generate_time_series(10000, n_steps + 1)
X_train, y_train = series[:7000, :n_steps], series[:7000, -1]
X_valid, y_valid = series[7000:9000, :n_steps], series[7000:9000, -1]
X_test, y_test = series[9000:, :n_steps], series[9000:, -1]

model_0 = keras.models.Sequential([
    keras.layers.SimpleRNN(20, return_sequences=True, input_shape=[None, 1]),
    keras.layers.SimpleRNN(20, return_sequences=True),
    keras.layers.TimeDistributed(keras.layers.Dense(1))
])

model_0.compile(loss="mse", optimizer="adam")

history_0 = model_0.fit(X_train, y_train, epochs=20,
                validation_data=(X_valid, y_valid))
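
For reference, the shape mismatch reported in the error can be checked directly; a minimal sketch, assuming the code above has already been run:

model_0.summary()     # last (TimeDistributed) layer outputs (None, None, 1): one value per time step
print(X_train.shape)  # (7000, 50, 1)
print(y_train.shape)  # (7000, 1) -- only the final value of each series, hence the 3-D vs 2-D clash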
  • The expected output of your model is 3-dimensional, but the `y_train` data you feed it is 2-dimensional. – giser_yugang Jun 28 '19 at 03:31
  • Thanks @giser_yugang. The code works before I change the last layer of model_0 from `keras.layers.Dense(1)` to `keras.layers.TimeDistributed(keras.layers.Dense(1))`. Could you please explain why? – PF32 Jun 28 '19 at 04:21
  • You need to know [What is the role of TimeDistributed layer in Keras?](https://stackoverflow.com/questions/47305618/what-is-the-role-of-timedistributed-layer-in-keras) and [TimeDistributed(Dense) vs Dense in Keras - Same number of parameters](https://stackoverflow.com/questions/44611006/timedistributeddense-vs-dense-in-keras-same-number-of-parameters). I am not sure about your Keras version; in the latest Keras there is no difference between the two usages. You can use `print(model_0.summary())` to look up the model structure. – giser_yugang Jun 28 '19 at 06:19
  • Thanks so much for the help. I understand it well now. – PF32 Jun 28 '19 at 07:05
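
Building on the comments above: with `return_sequences=True` on the last recurrent layer and `TimeDistributed(Dense(1))`, the model predicts one value per time step, so the targets also need one value per time step. A sketch of that sequence-to-sequence setup, reusing the `generate_time_series` helper from the question (the shifted-by-one targets are an assumption about the intended task, not something stated in the question):

import numpy as np
from tensorflow import keras

n_steps = 50
series = generate_time_series(10000, n_steps + 1)  # shape (10000, 51, 1)

# Targets are the value one step ahead at every time step, so Y is rank 3
# and matches the (batch, 50, 1) output of TimeDistributed(Dense(1)).
X_train, Y_train = series[:7000, :n_steps], series[:7000, 1:]
X_valid, Y_valid = series[7000:9000, :n_steps], series[7000:9000, 1:]

model = keras.models.Sequential([
    keras.layers.SimpleRNN(20, return_sequences=True, input_shape=[None, 1]),
    keras.layers.SimpleRNN(20, return_sequences=True),
    keras.layers.TimeDistributed(keras.layers.Dense(1))
])
model.compile(loss="mse", optimizer="adam")
model.fit(X_train, Y_train, epochs=20, validation_data=(X_valid, Y_valid))

The other direction is to keep the existing 2-D `y_train` and go back to a sequence-to-vector model: drop `return_sequences=True` from the second `SimpleRNN` and use a plain `keras.layers.Dense(1)`, which outputs shape `(batch, 1)`.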

0 Answers