
Is there any sort of communication/sharing happening between units inside an RNN/LSTM layer?

The figure below is cropped from the accepted answer to How to interpret clearly the meaning of the units parameter in Keras? There, the author draws connections between the units in the RNN/LSTM layer (marked in red).

I am aware that in an RNN/LSTM, parameter sharing happens across timesteps, but does it also happen between units?

Example code:

from keras.models import Sequential
from keras.layers import Dense, Dropout, Activation
from keras.layers import Embedding
from keras.layers import LSTM
model = Sequential()
model.add(LSTM(4, input_shape=(None, 3)))  # 4 units, 3 features per timestep
model.summary()
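As a sanity check on the summary above, the trainable-parameter count can be derived by hand. Assuming the usual Keras accounting for an LSTM layer (four gate blocks, each with an input kernel, a recurrent kernel, and a bias), the numbers work out as follows:

```python
# Hand-computed parameter count for LSTM(4) on 3 input features,
# assuming the standard Keras LSTM layout: 4 gate blocks (input, forget,
# candidate, output), each with an input kernel, a recurrent kernel, and a bias.
units, input_dim = 4, 3
kernel = input_dim * 4 * units        # W: input features -> 4 gates
recurrent_kernel = units * 4 * units  # U: previous hidden state -> 4 gates
bias = 4 * units                      # b: one bias per gate per unit
total = kernel + recurrent_kernel + bias
print(total)  # 128
```

Note that the recurrent kernel `U` is the only place the units' outputs feed back in, and it applies across timesteps (previous hidden state to current step), which is the "parameter sharing across timesteps" mentioned above.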


I came across this lecture, where the professor makes it very clear that there is no communication/sharing between units in the same Keras RNN/LSTM layer.

https://www.youtube.com/watch?v=7nnSjZBJVDs&list=PLQflnv_s49v_i1OVqE0DENBk-QJt9THjE&index=10&t=396s

I think the communication/sharing doesn't happen between cells in the same layer. Can someone please clarify, or argue otherwise?


1 Answer


I think the communication/sharing doesn't happen between cells in the same layer. Can someone please clarify, or argue otherwise?

There is no communication between units in a standard LSTM layer (at least in every standard implementation I know of), in the same way that there are no connections between units within a Dense layer.
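To make this concrete, here is a minimal NumPy sketch of a single LSTM timestep (a common textbook formulation; gate ordering and naming are a convention, not the exact Keras internals). The only place the units' values mix is the matrix product with the previous hidden state `h_prev`, i.e. across timesteps; within the step itself, the cell and hidden-state updates are purely elementwise, so unit k only ever touches its own `c_prev[k]`:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM timestep for a layer of `units` units.

    Shapes: W (input_dim, 4*units), U (units, 4*units), b (4*units,).
    """
    # The only cross-unit mixing: h_prev (all units' previous outputs)
    # enters every gate via U -- a connection across timesteps.
    z = x @ W + h_prev @ U + b
    i, f, o, g = np.split(z, 4)  # input, forget, output, candidate gates

    # Elementwise updates: within this timestep, unit k reads only its
    # own gate values and its own previous cell state c_prev[k].
    c = sigmoid(f) * c_prev + sigmoid(i) * np.tanh(g)
    h = sigmoid(o) * np.tanh(c)
    return h, c

# Toy dimensions matching the question: LSTM(4) on 3 input features.
rng = np.random.default_rng(0)
input_dim, units = 3, 4
x = rng.standard_normal(input_dim)
h0, c0 = np.zeros(units), np.zeros(units)
W = rng.standard_normal((input_dim, 4 * units))
U = rng.standard_normal((units, 4 * units))
b = np.zeros(4 * units)

h1, c1 = lstm_step(x, h0, c0, W, U, b)
```

So the units do see each other's *previous-timestep* outputs through `U`, but there is no lateral connection between units computed at the same timestep.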

I'm not saying such connections would be infeasible or bad, but they are not part of a standard RNN/LSTM unit.

Yoan B. M.Sc