
Hi, I have the following code:

import tensorflow as tf
from tensorflow.keras.regularizers import l2
from tensorflow.keras.optimizers import Adam

model = tf.keras.Sequential()
model.add(tf.keras.layers.LSTM(64, return_sequences=True, recurrent_regularizer=l2(0.0015), input_shape=(timesteps, input_dim)))
model.add(tf.keras.layers.LSTM(64, return_sequences=True, recurrent_regularizer=l2(0.0015), input_shape=(timesteps, input_dim)))
model.add(tf.keras.layers.Dense(64, activation='relu'))
model.add(tf.keras.layers.Dense(n_classes, activation='softmax'))

model.summary()

model.compile(optimizer=Adam(learning_rate=0.0025), loss='sparse_categorical_crossentropy', metrics=['accuracy'])

model.fit(X_train, y_train, batch_size=64, epochs=10)

Why do I get this error?

InvalidArgumentError:  assertion failed: [Condition x == y did not hold element-wise:] [x 
(sparse_categorical_crossentropy/SparseSoftmaxCrossEntropyWithLogits/Shape_1:0) = ] [64 1] [y 
(sparse_categorical_crossentropy/SparseSoftmaxCrossEntropyWithLogits/strided_slice:0) = ] [64 100]
 [[node 
sparse_categorical_crossentropy/SparseSoftmaxCrossEntropyWithLogits/assert_equal_1/Assert/Assert 
(defined at <ipython-input-37-7217e69c04b0>:15) ]] [Op:__inference_train_function_22603]

 Function call stack:
   train_function

When I run the following, the code executes fine:

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dropout, Dense
from tensorflow.keras.regularizers import l2
from tensorflow.keras.optimizers import Adam

model = Sequential()
model.add(LSTM(64, return_sequences=True, recurrent_regularizer=l2(0.0015), input_shape=(timesteps, input_dim)))
model.add(Dropout(0.5))
model.add(LSTM(64, recurrent_regularizer=l2(0.0015), input_shape=(timesteps, input_dim)))

model.add(Dense(64, activation='relu'))
model.add(Dense(64, activation='relu'))

model.add(Dense(n_classes, activation='softmax'))
model.summary()

model.compile(optimizer=Adam(learning_rate=0.0025), loss='sparse_categorical_crossentropy', metrics=['accuracy'])

model.fit(X_train, y_train, batch_size=32, epochs=1)

I am using TensorFlow 2.2.0. Why is the error raised? Does it have to do with the batch size?

  • Does this answer your question? [TensorFlow 2.0 \[Condition x == y did not hold element-wise:\]](https://stackoverflow.com/questions/58609967/tensorflow-2-0-condition-x-y-did-not-hold-element-wise) – Anunay Sep 17 '20 at 10:44
  • no i will edit my question – loutsi Sep 17 '20 at 11:14
  • No it has nothing to do with batch size, the output shape of your model changes if you set return_sequences in the last LSTM to True or False, you can see this difference by looking at the output of model.summary() – Dr. Snoopy Sep 17 '20 at 12:31
  • @Dr.Snoopy you are right thank you – loutsi Sep 17 '20 at 12:43
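
To make Dr. Snoopy's comment concrete, here is a minimal sketch of the two variants. The shape values below (timesteps = 100, input_dim = 9, n_classes = 6) are made up for illustration; 100 matches the [64 100] in the assertion, but the real values may differ. The sketch only builds both models and prints their output shapes, which is where the mismatch with the one-label-per-sample y_train comes from.

import tensorflow as tf
from tensorflow.keras.regularizers import l2

# Hypothetical shapes, for illustration only.
timesteps, input_dim, n_classes = 100, 9, 6

# Variant from the question: the last LSTM keeps return_sequences=True,
# so the Dense layers are applied per time step.
seq_model = tf.keras.Sequential([
    tf.keras.layers.LSTM(64, return_sequences=True, recurrent_regularizer=l2(0.0015),
                         input_shape=(timesteps, input_dim)),
    tf.keras.layers.LSTM(64, return_sequences=True, recurrent_regularizer=l2(0.0015)),
    tf.keras.layers.Dense(64, activation='relu'),
    tf.keras.layers.Dense(n_classes, activation='softmax'),
])
print(seq_model.output_shape)   # (None, 100, 6) -> 100 predictions per sample

# Working variant: the last LSTM returns only its final state, so the
# classifier head sees one vector per sample.
vec_model = tf.keras.Sequential([
    tf.keras.layers.LSTM(64, return_sequences=True, recurrent_regularizer=l2(0.0015),
                         input_shape=(timesteps, input_dim)),
    tf.keras.layers.LSTM(64, recurrent_regularizer=l2(0.0015)),
    tf.keras.layers.Dense(64, activation='relu'),
    tf.keras.layers.Dense(n_classes, activation='softmax'),
])
print(vec_model.output_shape)   # (None, 6) -> one prediction per sample

With sparse_categorical_crossentropy, the (None, 100, 6) output would only be valid if y_train held one label per time step (shape (batch, timesteps)); with a single label per sequence, the last LSTM has to drop return_sequences=True (or the sequence output has to be reduced, e.g. with GlobalAveragePooling1D) so that the output becomes (None, n_classes).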
