I have a 3-dimensional dataset of audio files where X.shape is (329, 20, 85). I want a simple, bare-bones model running, so please don't nitpick and address only the issue at hand. Here is the code:
model = tf.keras.models.Sequential()
model.add(tf.keras.layers.LSTM(32, return_sequences=True, stateful=False, input_shape=(20, 85)))
model.add(tf.keras.layers.LSTM(20))
model.add(tf.keras.layers.Dense(nb_classes, activation='softmax'))
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=["accuracy"])
model.summary()
print("Train...")
model.fit(X_train, y_train, batch_size=batch_size, epochs=50, validation_data=(X_test, y_test))
But then I had the error mentioned in the title:
ValueError: Shapes (None, 1) and (None, 3) are incompatible
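The error says the loss got targets of shape (None, 1) (integer labels) while the softmax layer outputs (None, 3). A minimal sketch of the two standard fixes, assuming y holds integer class labels 0-2 (the array below is made up for illustration):

```python
import numpy as np
import tensorflow as tf

# Hypothetical integer labels for 3 classes
y = np.array([0, 2, 1, 2])

# Fix A: one-hot encode so targets match the (None, 3) softmax output
y_onehot = tf.keras.utils.to_categorical(y, num_classes=3)
print(y_onehot.shape)  # (4, 3)

# Fix B: keep the integer labels and switch the loss instead:
# model.compile(loss='sparse_categorical_crossentropy', ...)
```

Fix A matches the existing compile() call; Fix B avoids the encoding step entirely.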
Here is the model.summary() output:
Model: "sequential_13"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
lstm_21 (LSTM) (None, 20, 32) 15104
_________________________________________________________________
lstm_22 (LSTM) (None, 20) 4240
_________________________________________________________________
dense_8 (Dense) (None, 3) 63
=================================================================
Total params: 19,407
Trainable params: 19,407
Non-trainable params: 0
_________________________________________________________________
Train...
To fix this, I followed this post and updated TensorFlow to the latest version, but the issue persists. This other post is completely unrelated and unreliable. This post, although somewhat related, has been unanswered for a while now.
Update 1.0:
I strongly suspect the problem has something to do with the final Dense layer, where I pass nb_classes as 3, since I am classifying 3 categories in y.
So I changed the Dense layer's nb_classes to 1, which let the model run but gives this output, which I am positive is wrong.
Train...
9/9 [==============================] - 2s 177ms/step - loss: 0.0000e+00 - accuracy: 0.1520 - val_loss: 0.0000e+00 - val_accuracy: 0.3418
<tensorflow.python.keras.callbacks.History at 0x7f50f1dcebe0>
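The zero loss above is expected rather than a training result: a softmax over a single unit normalizes one value against itself, so it always outputs exactly 1.0, and categorical_crossentropy against that is (up to clipping) -y * log(1) = 0 for every sample. A small sketch demonstrating this:

```python
import numpy as np
import tensorflow as tf

# Softmax over a single unit: each value is normalized against only itself,
# so the output is exactly 1.0 regardless of the input logit.
logits = tf.constant([[5.0], [-3.0], [0.0]])
probs = tf.nn.softmax(logits, axis=-1)
print(probs.numpy())  # all ones

# categorical_crossentropy then computes -y * log(1.0), which is
# effectively zero for every sample -- hence loss: 0.0000e+00.
y_true = tf.constant([[1.0], [2.0], [0.0]])
loss = tf.keras.losses.categorical_crossentropy(y_true, probs)
print(loss.numpy())  # effectively zero
```

So Dense(1, activation='softmax') can never learn anything here; the layer needs one unit per class.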
Update 2.0:
I one-hot encoded the ys, which resolved the shape issue. But the output above, ending in <tensorflow.python.keras.callbacks.History at 0x7f50f1dcebe0>, persists. Any help with this? Or should I post a new question for it? Thanks for all the help.
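For what it's worth, that History line is not an error: model.fit() returns a tf.keras.callbacks.History object, and a notebook simply displays the repr of the last expression in a cell. A minimal sketch with a toy stand-in model (shapes here are hypothetical, not the original audio dataset):

```python
import numpy as np
import tensorflow as tf

# Tiny stand-in model and data, just to show what fit() returns.
X = np.random.rand(8, 4).astype("float32")
y = tf.keras.utils.to_categorical(np.random.randint(0, 3, size=8), 3)

model = tf.keras.models.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(3, activation="softmax"),
])
model.compile(loss="categorical_crossentropy", optimizer="adam",
              metrics=["accuracy"])

# fit() returns a History object; the "<...History at 0x...>" line is just
# the notebook printing that return value. Capture it to inspect training:
history = model.fit(X, y, epochs=2, verbose=0)
print(history.history.keys())  # e.g. dict_keys(['loss', 'accuracy'])
```

Assigning the return value (or ending the cell with a semicolon) suppresses the printed repr.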
How should I proceed, or what should I be changing?