
I recently started learning about neural networks at my university. From another project I have sensor data that is very noisy, and I thought NNs, and especially LSTMs, should be able to do some signal post-processing.

I got it working, and it does a better job than a simple averaging sliding window. But it shares a similar weakness with the sliding window: after sudden changes in the signal it takes some time to settle.
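For reference, the baseline I am comparing against is a plain causal moving average like this (the `window` size is whatever I tune by hand; this sketch pads with the first sample so the output has the same length as the input):

```python
import numpy as np

def sliding_window_avg(signal, window=10):
    """Causal moving-average baseline: each output sample is the mean
    of the last `window` inputs, so it inevitably lags on sudden jumps."""
    signal = np.asarray(signal, dtype=float)
    kernel = np.ones(window) / window
    # repeat the first sample so the filter stays causal and same-length
    padded = np.concatenate([np.repeat(signal[0], window - 1), signal])
    return np.convolve(padded, kernel, mode='valid')
```

The lag after a step change comes directly from the averaging: right after a jump, most samples in the window still hold the old level.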

For example (green: ground truth, blue: measured signal, orange: NN output):

[plot of the three signals]

My model looks like:

    import tensorflow as tf
    from tensorflow.keras.layers import LSTM, Dense

    model = tf.keras.models.Sequential()
    model.add(LSTM(32, return_sequences=True,
                   input_shape=(TIMESTEPS, 1),
                   batch_size=BATCH_SIZE,  # stateful=True requires a fixed batch size
                   stateful=True))
    model.add(LSTM(32, return_sequences=True, stateful=True))
    model.add(LSTM(32, stateful=True))
    model.add(Dense(1))

    model.compile(loss='mean_squared_error', optimizer='adam',
                  metrics=['mean_absolute_error', 'mean_squared_error'])

Edit: In my current approach TIMESTEPS is 1, so I can feed data to it incrementally: put a raw value in, get a smoothed value out. From my understanding, stateful=True should make it remember previous values.
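Concretely, my incremental filtering loop looks roughly like this (a sketch assuming BATCH_SIZE=1 at inference time; `filter_stream` is just an illustrative name, and the state is what carries the history between calls):

```python
import numpy as np

def filter_stream(model, raw_values):
    """Feed one raw sample at a time through a stateful model with
    TIMESTEPS=1; the LSTM state links the calls together."""
    model.reset_states()  # start each independent sequence from a clean state
    smoothed = []
    for x in raw_values:
        # input shape is (batch, timesteps, features) = (1, 1, 1)
        y = model.predict(np.array(x, dtype=float).reshape(1, 1, 1), verbose=0)
        smoothed.append(float(y[0, 0]))
    return smoothed
```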

Is there a way to encourage the LSTM to react to those changes?

Also any tips where I can learn more about signal processing with NNs?

WaeCo
  • The TIMESTEPS value is important in this case. If it's too low, then you are making it very hard for the net to give accurate predictions. If you look at the problem carefully you can see that it depends just as much on future values as well as past ones. See what happens if you make it higher. – gerwin May 09 '19 at 21:28
  • Also you can try to assign higher penalties to false predictions right after the signal change. Doubt it'll work but worth the try. – gerwin May 09 '19 at 21:29
  • @gerwin I tried with TIMESTEPS=1 and stateful LSTM incremental learning. But also with TIMESTEPS=100 and not stateful, which was a little bit worse, but trained a lot faster. – WaeCo May 10 '19 at 07:28
  • And the penalty thing? Does it have effect? – gerwin May 10 '19 at 07:38
  • @gerwin I was still researching how to do that. Is it just changing the sample weight? Any advice on that? – WaeCo May 10 '19 at 07:44
  • 1
    You will need a custom loss function, here's an example: https://stackoverflow.com/questions/45961428/make-a-custom-loss-function-in-keras – gerwin May 10 '19 at 07:47
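One way to sketch gerwin's penalty idea without a custom loss function is Keras's built-in `sample_weight` argument to `fit`: up-weight the samples right after a jump in the ground truth so errors there cost more. The `boost`, `decay`, and `threshold` values below are made-up illustrations, not tuned numbers:

```python
import numpy as np

def change_weights(truth, boost=5.0, decay=0.8, threshold=0.5):
    """Hypothetical per-sample loss weights: samples right after a sudden
    jump in the ground truth get a boosted weight that decays back toward
    1, pushing the model to react faster to step changes."""
    truth = np.asarray(truth, dtype=float)
    weights = np.ones(len(truth))
    bonus = 0.0
    for i in range(1, len(truth)):
        if abs(truth[i] - truth[i - 1]) > threshold:
            bonus = boost        # a jump happened: boost subsequent weights
        weights[i] += bonus
        bonus *= decay           # fade the boost over time
    return weights

# usage sketch: model.fit(x, y, sample_weight=change_weights(y), ...)
```

With TIMESTEPS=1 and stateful training this would weight each single-step batch; with windowed training it weights each window by its target sample.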

0 Answers