
I'm working on time-series sequence prediction using an LSTM. My goal is to use a window of the 25 past values to generate a prediction for the next 25 values. I'm doing that recursively: I use the 25 known values to predict the next value, append that prediction as a known value, then shift the 25-value window and predict the next one again, until I have 25 newly generated values (or more).
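To be concrete, the recursive scheme above can be sketched as follows. This is a minimal illustration, not my exact code: `predict_fn` is a hypothetical stand-in for `regressor.predict`, and the helper name `predict_recursive` is made up for this example.

```python
import numpy as np

def predict_recursive(predict_fn, history, window_size=25, n_steps=25):
    """Roll the window forward one step at a time.

    predict_fn: maps an array of shape (1, window_size, 1) to the next value
                (stands in for regressor.predict).
    history:    1-D array holding at least `window_size` known values.
    """
    window = list(history)
    preds = []
    for _ in range(n_steps):
        # Take the most recent `window_size` values as model input
        x = np.array(window[-window_size:]).reshape(1, window_size, 1)
        next_val = float(predict_fn(x))
        preds.append(next_val)
        # Append the prediction as if it were a known value
        window.append(next_val)
    return np.array(preds)
```

With a toy `predict_fn` that just returns "last value + 1", feeding in `np.arange(25.0)` produces the continuation 25, 26, ..., 49.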

I'm using Keras to implement the RNN architecture:

from keras.models import Sequential
from keras.layers import LSTM, Dropout, Dense

regressor = Sequential()
# Three stacked LSTM layers of 50 units each; the first two return full
# sequences so the next LSTM layer receives one vector per timestep.
regressor.add(LSTM(units = 50, return_sequences = True, input_shape = (X_train.shape[1], 1)))
regressor.add(Dropout(0.1))
regressor.add(LSTM(units = 50, return_sequences = True))
regressor.add(Dropout(0.1))
regressor.add(LSTM(units = 50))
regressor.add(Dropout(0.1))
# Single output: the predicted next value of the series
regressor.add(Dense(units = 1))
regressor.compile(optimizer = 'rmsprop', loss = 'mean_squared_error')
regressor.fit(X_train, y_train, epochs = 10, batch_size = 32)

Problem: the recursive prediction always converges to the same value, no matter what sequence comes before.

This is definitely not what I want; I was expecting the generated sequence to differ depending on what came before. I'm wondering if anyone has an idea about this behavior and how to avoid it. Maybe I'm doing something wrong...

Example plots: exemple1, exemple2

I tried different numbers of epochs, which didn't help much; actually, more epochs made it worse. Changing the batch size, number of units, number of layers, and window size didn't help avoid this issue either.

I'm using MinMaxScaler for the data.

Edit:

Scaling new inputs for testing:

dataset_test = sc.transform(dataset_test.reshape(-1, 1))
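For clarity, here is what I understand the scaling to be doing. This is a toy sketch with made-up values: `lo`/`hi` reproduce by hand what a `MinMaxScaler` fitted on the training data would apply via `sc.transform` to the test data, namely reusing the *training* min/max rather than refitting on the test set.

```python
import numpy as np

# Hypothetical toy series standing in for the real data
train = np.arange(0.0, 100.0).reshape(-1, 1)
test = np.array([[105.0], [110.0]])

# Equivalent of sc = MinMaxScaler(); sc.fit_transform(train):
lo, hi = train.min(), train.max()
train_scaled = (train - lo) / (hi - lo)

# Equivalent of sc.transform(test): reuse the TRAINING min/max,
# so test values outside the training range can fall outside [0, 1]
test_scaled = (test - lo) / (hi - lo)
```

Note that a test value above the training maximum scales to a number greater than 1, which the network never saw during training.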