I am working on a custom `Recurrent` layer in Keras, and I need to reset a specific weight after each sequence.
For example:
- The input has shape `(nb_sequences, sequence_size, value_size)`
- My network has 2 sets of weights, say `self.A` and `self.B`
- `self.A` is trainable, `self.B` is not
- The output is computed with both `self.A` and `self.B`
- I would like my model to have a cleanly reset `self.B` at the beginning of each sequence, while still training `self.A` like the rest of the model
In this model, `self.A` acts as the controller and `self.B` acts as a readable/writable memory. During the sequence, `self.A` writes to and reads from `self.B`, but I want the memory to be empty at the beginning of each sequence.
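To make this concrete, here is a stripped-down sketch of the kind of cell I have in mind (the names `units` and `mem_size` and the actual read/write logic are placeholders, and I'm using the Keras 2 RNN-cell API, my real code differs):

```python
import keras.backend as K
from keras.layers import Layer

class ControllerMemoryCell(Layer):
    """Placeholder sketch: A is the trainable controller, B the non-trainable memory."""

    def __init__(self, units, mem_size, **kwargs):
        self.units = units
        self.mem_size = mem_size
        self.state_size = units
        super(ControllerMemoryCell, self).__init__(**kwargs)

    def build(self, input_shape):
        # self.A: trainable controller weights
        self.A = self.add_weight(name='A',
                                 shape=(input_shape[-1], self.units),
                                 initializer='glorot_uniform',
                                 trainable=True)
        # self.B: non-trainable memory, written/read during the sequence
        self.B = self.add_weight(name='B',
                                 shape=(self.units, self.mem_size),
                                 initializer='zeros',
                                 trainable=False)
        self.built = True

    def call(self, inputs, states):
        h = K.tanh(K.dot(inputs, self.A))   # controller step
        read = K.dot(h, self.B)             # output uses both A and B
        # ... the "write" into self.B would happen here, and this is the part
        # I want to start from an empty memory for every new sequence
        return read, [h]
```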
I saw that you can reset a whole model with `save_weights` and `load_weights`, as presented in this question, and I think I can adapt this to reset a specific weight in a layer, but the hard part is doing it at the end of each sequence.
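For example, I imagine something along these lines to reset only one weight instead of the whole model (the index of `self.B` inside `get_weights()` is an assumption on my part):

```python
import numpy as np

def reset_memory(layer, b_index=1):
    # Overwrite only the B weight with zeros, leaving A untouched.
    # b_index is whatever position B has in layer.get_weights().
    weights = layer.get_weights()
    weights[b_index] = np.zeros_like(weights[b_index])
    layer.set_weights(weights)
```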
This Keras documentation explains how to perform specific actions at the beginning/end of training, of each epoch, or of each batch, but I can't find how to do something at the beginning of each sequence...
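The closest I got with callbacks is something like the sketch below, but it only fires once per batch, not once per sequence inside a batch (and it assumes B's index in `get_weights()` and the position of my layer in the model):

```python
import numpy as np
from keras.callbacks import Callback

class ResetMemoryCallback(Callback):
    """Zero out B before every batch -- the closest hook I found,
    but it does not fire once per sequence inside a batch."""

    def __init__(self, layer, b_index=1):
        super(ResetMemoryCallback, self).__init__()
        self.layer = layer
        self.b_index = b_index      # assumed position of B in get_weights()

    def on_batch_begin(self, batch, logs=None):
        weights = self.layer.get_weights()
        weights[self.b_index] = np.zeros_like(weights[self.b_index])
        self.layer.set_weights(weights)

# e.g. model.fit(x, y, callbacks=[ResetMemoryCallback(model.layers[1])])
```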
I also thought of using the `states` variables passed to `step` to reset `self.B` at the beginning of each sequence, but I can't figure out how to use them...
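What I'm imagining is something like the cell below, where B travels in the `states` list instead of being a weight, so the zero initial state would give an empty memory for every sequence -- but I'm not sure this is the right way to do it (the write/read rules here are toy placeholders):

```python
import keras.backend as K
from keras.layers import Layer

class StateMemoryCell(Layer):
    """Sketch: keep B in `states` so it starts at zeros for each sequence."""

    def __init__(self, units, **kwargs):
        self.units = units
        # states[0] = controller output, states[1] = memory B (per sample)
        self.state_size = [units, units]
        super(StateMemoryCell, self).__init__(**kwargs)

    def build(self, input_shape):
        # self.A: the only trainable weights
        self.A = self.add_weight(name='A',
                                 shape=(input_shape[-1], self.units),
                                 initializer='glorot_uniform',
                                 trainable=True)
        self.built = True

    def call(self, inputs, states):
        h_prev, B = states                   # B is zeros at the first step
        h = K.tanh(K.dot(inputs, self.A))    # controller step
        B_new = B + h                        # toy "write" into the memory
        out = h * B_new                      # toy "read" combining A and B
        return out, [h, B_new]

# used with the RNN wrapper (recent Keras 2):
# from keras.layers import Input, RNN
# x = Input(shape=(sequence_size, value_size))
# y = RNN(StateMemoryCell(64))(x)
```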
Any ideas?