
Recently, I tried to use `tf.contrib.rnn.LayerNormBasicLSTMCell`, but I don't know what the argument `dropout_keep_prob` means.

Then I looked at the documentation provided by Google. Their explanation is: "unit Tensor or float between 0 and 1 representing the recurrent dropout probability value. If float and 1.0, no dropout will be applied."

But I don't know the difference between "recurrent dropout" and "dropout".
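
For context, here is roughly how I am constructing the cell. This is a minimal sketch assuming TensorFlow 1.x (where `tf.contrib` is available); the layer size and input shapes are just placeholders:

```python
import tensorflow as tf  # TensorFlow 1.x

# dropout_keep_prob=0.8 means each unit of the recurrent update is
# kept with probability 0.8 (so the default 1.0 disables dropout).
cell = tf.contrib.rnn.LayerNormBasicLSTMCell(
    num_units=128,
    dropout_keep_prob=0.8,
)

# [batch, time, features] -- arbitrary shapes for illustration
inputs = tf.placeholder(tf.float32, shape=[None, 20, 32])
outputs, state = tf.nn.dynamic_rnn(cell, inputs, dtype=tf.float32)
```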

Does this answer your question? [Keras: the difference between LSTM dropout and LSTM recurrent dropout](https://stackoverflow.com/questions/44924690/keras-the-difference-between-lstm-dropout-and-lstm-recurrent-dropout) – Innat May 08 '21 at 10:05

1 Answer


Recurrent dropout is a regularization method for recurrent neural networks. Dropout is applied to the updates of the LSTM memory cells rather than to the layer's inputs, i.e. it drops out the input/update gate of the LSTM. For more information you can refer here.
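
By contrast, "ordinary" dropout masks the inputs x_t fed into the layer, while recurrent dropout masks quantities inside the recurrent computation itself. As an illustrative sketch of the same distinction in Keras terms (see the question linked in the comments), the two show up as separate arguments; the sizes below are arbitrary:

```python
import tensorflow as tf  # TensorFlow 2.x

layer = tf.keras.layers.LSTM(
    units=128,
    dropout=0.2,            # ordinary dropout, applied to the inputs x_t
    recurrent_dropout=0.2,  # recurrent dropout, applied to the hidden state h_{t-1}
)

x = tf.random.normal([4, 20, 32])  # [batch, time, features], arbitrary
y = layer(x, training=True)        # dropout masks are only active in training mode
```

Note that Keras applies the same recurrent dropout mask at every time step, following the variational scheme of Gal & Ghahramani (2016).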