
I am new to deep learning and LSTMs (with Keras). I am trying to solve a multistep-ahead time series prediction problem. I have 3 time series: A, B and C, and I want to predict the values of C. I am training an LSTM that is fed the 3 previous data points to predict the next 3 steps in the future. The input data looks like:

X = [[[A0, B0, C0],[A1, B1, C1],[A2, B2, C2]],[[ ...]]]

with dimensions: (1000, 3, 3). The output is:

y = [[C3, C4, C5],[C4, C5, C6],...]

with dimensions: (1000, 3).
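To make the shapes concrete, here is a small sketch of how I build the windows (the random array is just a stand-in for my real series; columns are A, B, C):

```python
import numpy as np

# Toy stand-in for the real data: 1000 windows, 3 lag steps, 3 series (A, B, C)
n_samples, n_steps, n_features = 1000, 3, 3
series = np.random.rand(n_samples + 2 * n_steps, n_features)

# X[i] holds time steps i..i+2 of all three series;
# y[i] holds the next 3 values of C (column index 2)
X = np.stack([series[i:i + n_steps] for i in range(n_samples)])
y = np.stack([series[i + n_steps:i + 2 * n_steps, 2] for i in range(n_samples)])

print(X.shape)  # (1000, 3, 3)
print(y.shape)  # (1000, 3)
```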

I am using a simple LSTM with a single hidden layer of 50 neurons, set up in Keras as:

from keras.models import Sequential
from keras.layers import LSTM, Dense, Dropout, Activation

n_features = 3  # A, B and C
neurons = 50
ahead = 3       # steps to predict

model = Sequential()
model.add(LSTM(neurons, input_shape=(3, n_features)))
model.add(Dropout(0.2))
model.add(Dense(ahead))
model.add(Activation('linear'))
model.compile(loss='mae', optimizer='adam')
model.fit(X, y, epochs=50)
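For reference, inspecting the output shape of this architecture (a sketch using TensorFlow's bundled Keras; layer sizes as above) shows the network ends in a 2D tensor, which is why the (1000, 3) target fits:

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense, Dropout, Activation

model = Sequential()
model.add(LSTM(50, input_shape=(3, 3)))  # consumes (samples, 3 steps, 3 features)
model.add(Dropout(0.2))
model.add(Dense(3))
model.add(Activation('linear'))

# Without return_sequences the LSTM collapses the time axis,
# so the model emits one 3-vector per sample
print(model.output_shape)  # (None, 3)
```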

This model works fine. Now I'd like to predict the values of B as well (using the same input). So I tried to reshape the output the same way I reshaped the input, which has multiple features:

 y = [[[B3, C3],[B4, C4],[B5, C5]],[[ ...]]]

so that it has dimensions: (1000, 3, 2). However, this gives me an error:

Error when checking target: expected activation_5 to have 2 dimensions, 
but got array with shape (1000, 3, 2)

I guess the structure of the network needs to change. I tried modifying the final Dense layer, with no success. Should I reshape y differently? Is the structure of the network wrong?

NCL
  • I think the idea is correct but the code is wrong. An LSTM layer takes a 3D tensor as input and, by default, returns a 2D tensor, as stated [here](https://keras.io/layers/recurrent). To make it return a 3D tensor you need to set `return_sequences=True`. – gionni Dec 01 '17 at 17:24
  • By setting the final Dense layer's output size to 3, aren't you essentially asking it to predict the output as a single time step with 3 different features instead of 3 separate time steps with a single feature? – Chris Wang Aug 12 '18 at 14:10
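Following the first comment, a minimal sketch of a network whose output matches a (1000, 3, 2) target (untested on the real data; assuming TensorFlow's bundled Keras): `return_sequences=True` keeps one LSTM output per time step, and wrapping the Dense layer in `TimeDistributed` applies the same 2-unit projection at every step:

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dropout, TimeDistributed, Dense

n_features = 3   # A, B and C as inputs
n_steps = 3      # look-back window
n_targets = 2    # predicting B and C

model = Sequential()
# return_sequences=True keeps the per-timestep outputs (a 3D tensor)
model.add(LSTM(50, input_shape=(n_steps, n_features), return_sequences=True))
model.add(Dropout(0.2))
# TimeDistributed applies the same Dense layer to every time step
model.add(TimeDistributed(Dense(n_targets)))
model.compile(loss='mae', optimizer='adam')

print(model.output_shape)  # (None, 3, 2)
```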

0 Answers