
I'm studying the LSTM model.

Does one memory cell of a hidden layer in an LSTM correspond to one timestep?

Example code: `model.add(LSTM(128, input_shape=(4, 1)))`

When implementing an LSTM in Keras, you can set the number of memory cells regardless of the number of timesteps, as in the example code, where it is 128.

But a typical LSTM diagram shows a 1:1 correspondence between the number of timesteps and the number of memory cells. Which is correct?
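To make the example concrete, here is a minimal sketch (assuming TensorFlow/Keras is installed) of the model above, showing that the 128 is the size of the hidden state, set independently of the 4 timesteps:

```python
import tensorflow as tf

# 4 timesteps, 1 feature per timestep
inputs = tf.keras.Input(shape=(4, 1))
# 128 units: the dimensionality of the hidden state / cell state
outputs = tf.keras.layers.LSTM(128)(inputs)
model = tf.keras.Model(inputs, outputs)

# One 128-dimensional vector per sample, produced after processing all 4 timesteps
print(model.output_shape)  # (None, 128)
```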


phyzik

2 Answers


As I understand it, the timestep is the length of the sequence processed at each pass (i.e. the window size). Depending on the parameter `return_sequences=True/False`, the layer returns either one output per timestep or a single output for the whole sequence, as explained and shown here.

The explanation here seems to be better.

Concerning the memory cell: "A part of a NN that preserves some state across time steps is called a memory cell." This makes me think of a memory cell as a kind of container, holding temporary state for the variables in the window series until it is updated during further backpropagation (when `stateful=True`).

Better to see once: a picture of a memory cell and the logic of how it works here.

Also note the usage of the whole input shape here: time_steps matters for backpropagation.
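The `return_sequences` behavior can be checked directly (a minimal sketch, assuming TensorFlow/Keras; the layer size 8 is arbitrary):

```python
import numpy as np
import tensorflow as tf

# (samples, timesteps, features) = (2, 4, 1)
x = np.random.rand(2, 4, 1).astype("float32")

# One output vector per timestep
seq = tf.keras.layers.LSTM(8, return_sequences=True)(x)
# Only the output at the last timestep
last = tf.keras.layers.LSTM(8, return_sequences=False)(x)

print(seq.shape)   # (2, 4, 8)
print(last.shape)  # (2, 8)
```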

JeeyCi

In an LSTM, we supply input in the following shape: [samples, timesteps, features].

- samples is the number of training examples you want to feed at a time.
- timesteps is how many past values are used. Say you set timesteps=3: then the values at t, t-1, and t-2 are used to predict the value at t+1.
- features is how many dimensions you supply at each timestep.

An LSTM has memory cells, but I am only explaining the code part so as not to confuse you. I hope this helps.
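The windowing described above can be sketched with plain NumPy (a toy 1-D series; the names `series`, `X`, `y` are illustrative):

```python
import numpy as np

series = np.arange(10, dtype="float32")   # toy 1-D series: 0, 1, ..., 9
timesteps = 3

X, y = [], []
for t in range(len(series) - timesteps):
    X.append(series[t:t + timesteps])     # window of values at t, t+1, t+2
    y.append(series[t + timesteps])       # target: the next value

X = np.array(X).reshape(-1, timesteps, 1) # (samples, timesteps, features)
y = np.array(y)

print(X.shape, y.shape)  # (7, 3, 1) (7,)
```

`X` is now in the [samples, timesteps, features] layout that `model.fit(X, y)` expects.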

Raj