
I am trying to fit a sequential model with an LSTM layer on the ECG dataset from MITDB. The model should categorize sequences of single values into 6 categories, but it won't reach acceptable results: the accuracy stays at about 23 % and does not increase. I think there is something wrong with my data or with the shapes. My labels have 6 different values, are shaped (28154, 6), and look like this:

[[0. 0. 1. 0. 0. 0.]
 [0. 0. 0. 0. 0. 1.]
 [0. 0. 0. 0. 1. 0.]
 ...
 [0. 0. 0. 0. 0. 1.]
 [0. 0. 1. 0. 0. 0.]
 [0. 0. 0. 0. 0. 1.]]
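(These labels look like one-hot encodings of 6 integer class ids; a minimal sketch of how such an array can be produced, with the integer ids here purely illustrative — the real ones come from the MITDB annotations:)

```python
import numpy as np

# Illustrative integer class ids (0..5); not the real MITDB labels
y_int = np.array([2, 5, 4, 5, 2, 5])

# One-hot encode to shape (n_samples, 6), matching the label array above
y_onehot = np.eye(6)[y_int]
print(y_onehot[0])  # [0. 0. 1. 0. 0. 0.]
```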

My input data (train_x) is shaped (batches: 28154, timesteps: 130, features: 1):

[[[ 987.17507661]
  [ 981.62460783]
  [ 980.96795945]
  ...
  [1014.70828142]
  [1005.45570561]
  [ 998.81239015]]

 ...

 [[1033.07894773]
  [1041.05725021]
  [1039.21188431]
  ...
  [1026.80583104]
  [1025.28301436]
  [1017.54051951]]]
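(One thing visible above: the raw values sit around 1000 rather than near zero. A per-sequence standardization sketch, with the array name and shape assumed to match train_x, in case scaling plays a role:)

```python
import numpy as np

# Dummy stand-in shaped like train_x: (n_sequences, 130, 1), values near 1000
train_x = 1000.0 + np.random.default_rng(0).normal(0.0, 20.0, size=(100, 130, 1))

# Standardize each sequence to zero mean / unit variance along the time axis;
# LSTMs generally train better on inputs centred near zero
mean = train_x.mean(axis=1, keepdims=True)
std = train_x.std(axis=1, keepdims=True)
train_x_norm = (train_x - mean) / std
print(train_x_norm.shape)  # (100, 130, 1)
```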

and the model looks like this:

import tensorflow as tf
from tensorflow import keras
from tensorflow.keras.layers import LSTM

LSTMModel = keras.Sequential()
LSTMLayer = LSTM(100, dropout=0.2, implementation=1, input_shape=(130, 1))

LSTMModel.add(LSTMLayer)
LSTMModel.add(tf.keras.layers.Dense(6, activation='softmax'))
LSTMModel.compile(optimizer='adam', loss=tf.keras.losses.BinaryCrossentropy(from_logits=True), metrics=['accuracy'])
history = LSTMModel.fit(train_x, train_y, batch_size=1, epochs=1)

The time-dependencies are inside the sequences of 130 values, so I have to reset the LSTM states after each of the 28154 sequences, hence batch_size = 1. I have only one feature. The sequences are independent of each other in time, because they belong to different people. I know this model is not very optimized, but it won't learn anything at all.

What am I doing wrong?

Edit: As suggested, I tried loss='categorical_crossentropy', but with no better result. The accuracy is 22.2 %, so the model does not learn at all. When picking labels at random, I would expect an accuracy of 1/6 ≈ 16.6 %, so I don't think it's the loss function; the model simply doesn't work. Maybe the little learning above chance comes from the Dense layer.

>>> print(history.history['accuracy'])
[0.22206436097621918]
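(A quick sanity check worth doing here: compare that accuracy against the majority-class frequency of the labels, since a model that always predicts the most frequent class reaches exactly that frequency without learning anything. A sketch, with a synthetic label matrix standing in for train_y:)

```python
import numpy as np

# Synthetic one-hot labels shaped like train_y: (n_samples, 6); not the real data
rng = np.random.default_rng(0)
train_y = np.eye(6)[rng.integers(0, 6, size=28154)]

# max(class frequency) is the accuracy of always predicting the majority class;
# anything at or below this baseline has effectively learned nothing
freq = train_y.mean(axis=0)
print(freq.max())
```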
  • Can you try using loss=tf.keras.losses.CategoricalCrossentropy() instead? You seem to have 6 classes, not 2. Also, if you're providing a softmax it has to be `from_logits=False` (True is for raw logits). – runDOSrun Sep 11 '20 at 10:43
  • Thanks for the hint and the quick answer! There you have a point. But I also tried squared hinge and other loss functions, and unfortunately that didn't fix the problem. BinaryCrossentropy was only my last try. – DerHelge Sep 11 '20 at 13:01
  • There can be other reasons behind it not working well, but you should first make sure that the loss you're using is the correct one. It's not an arbitrary choice. See e.g. [this](https://stackoverflow.com/questions/45741878/using-binary-crossentropy-loss-in-keras-tensorflow-backend) or [this](https://stackoverflow.com/questions/57253841/from-logits-true-and-from-logits-false-get-different-training-result-for-tf-loss) – runDOSrun Sep 11 '20 at 13:14
