Is there any sort of communication/sharing happening between units inside an RNN/LSTM layer?
The figure below is cropped from the accepted answer to "How to interpret clearly the meaning of the units parameter in Keras?". There the author connects the units in the RNN/LSTM layer (marked in red).
I am aware that in an RNN/LSTM, parameter sharing happens across timesteps, but does it also happen between units?
Example code:
from keras.models import Sequential
from keras.layers import LSTM

model = Sequential()
# 4 units, 3 input features per timestep
# (input_dim=3 is the legacy spelling; input_shape is the current API)
model.add(LSTM(4, input_shape=(None, 3)))
model.summary()
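To make the question concrete, here is a minimal NumPy sketch of a single LSTM timestep, assuming the standard Keras gate ordering (input, forget, candidate, output) and the same shapes as the `LSTM(4, input_shape=(None, 3))` layer above. The names `lstm_step`, `W`, `U`, `b` are my own illustration, not Keras internals. My question is essentially whether the `h_prev @ U` term, where the previous hidden state of all units enters each unit's update, counts as communication between units in the same layer:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x_t, h_prev, c_prev, W, U, b):
    """One LSTM timestep; gates stacked as [i, f, c, o] (Keras ordering)."""
    units = h_prev.shape[0]
    # h_prev of ALL units is multiplied by U here, so every unit's update
    # reads the previous hidden state of every other unit in the layer.
    z = x_t @ W + h_prev @ U + b
    i = sigmoid(z[0 * units:1 * units])  # input gate
    f = sigmoid(z[1 * units:2 * units])  # forget gate
    g = np.tanh(z[2 * units:3 * units])  # candidate cell state
    o = sigmoid(z[3 * units:4 * units])  # output gate
    c_t = f * c_prev + i * g             # new cell state
    h_t = o * np.tanh(c_t)               # new hidden state
    return h_t, c_t

# Shapes matching LSTM(4, input_shape=(None, 3))
rng = np.random.default_rng(0)
units, input_dim = 4, 3
W = rng.normal(size=(input_dim, 4 * units))  # kernel
U = rng.normal(size=(units, 4 * units))      # recurrent kernel
b = np.zeros(4 * units)                      # bias
h, c = np.zeros(units), np.zeros(units)
for x_t in rng.normal(size=(5, input_dim)):  # run 5 timesteps
    h, c = lstm_step(x_t, h, c, W, U, b)
print(h.shape)  # (4,)
```

Note that the parameter count of this sketch, `3*16 + 4*16 + 16 = 128`, matches what `model.summary()` reports for the layer above.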
I also came across this lecture, where the professor makes it very clear that there is no communication/sharing between units in the same Keras RNN/LSTM layer:
https://www.youtube.com/watch?v=7nnSjZBJVDs&list=PLQflnv_s49v_i1OVqE0DENBk-QJt9THjE&index=10&t=396s
I think the communication/sharing doesn't happen between cells in the same layer. Can someone please confirm this, or explain why it is otherwise?