
I'm trying to figure out how to build a many-to-many LSTM/GRU model that only produces a prediction for every nth (7 in my case) input. For example, my input data has one timestep per day for a whole year, but I only want to predict the output at the end of each week, not each day.

The only information I was able to find is this answer: Many to one and many to many LSTM examples in Keras

It says: "Many-to-many when number of steps differ from input/output length: this is freaky hard in Keras. There are no easy code snippets to code that."

In PyTorch it seems like you can set ignore_index in the loss function, which I think would do the trick.
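For reference, this is roughly what that PyTorch idea looks like - a minimal sketch, assuming integer class targets and the default ignore_index value of -100 (the shapes here are just placeholders for my setup):

import torch
import torch.nn as nn

loss_fn = nn.CrossEntropyLoss(ignore_index=-100)  # ignored positions add nothing to the loss
logits = torch.randn(4, 2, 365)                            # (batch, num_classes, timesteps)
targets = torch.full((4, 365), -100, dtype=torch.long)     # mark every timestep as "ignore"
targets[:, 6::7] = torch.randint(2, (4, 52))               # keep only the weekly labels
loss = loss_fn(logits, targets)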

Is there a solution for Keras?


1 Answer


I think I found the answer. Since I'm trying to predict every nth value, we can keep only the LSTM outputs at the timesteps we want to predict and discard the rest. I created a Lambda layer that does exactly that - it just reads every 7th value from the LSTM output. This is the code:

import numpy as np
from keras.models import Sequential
from keras.layers import LSTM, Lambda, TimeDistributed, Dense

# 100 samples, 365 daily timesteps, 5 features per day
X = np.random.normal(0, 1, size=(100, 365, 5))
# one binary label per week (52 weeks)
y = np.random.randint(2, size=(100, 52, 1))

model = Sequential()
model.add(LSTM(1, input_shape=(365, 5), return_sequences=True))
# keep only every 7th timestep (days 7, 14, ..., 364), leaving 52 outputs
model.add(Lambda(lambda x: x[:, 6::7, :]))
model.add(TimeDistributed(Dense(1, activation='sigmoid')))
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
model.fit(X, y, epochs=3, verbose=1)
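As a quick sanity check on the slicing, the model's output should have 52 timesteps, one per week:

print(model.output_shape)  # expected: (None, 52, 1)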