
I'm implementing a custom Layer with the Keras API (working with TF 2.0-beta). I want to use the epoch number in my calculation (i.e., inside the call() method) in order to decay a parameter over time.

I'm used to tf.get_global_step(), but I understand that TF deprecated all global scopes, and for good reason.

If I had the model instance, I could use model.optimizer.iterations, but I'm not sure how I get the instance of my parent model when I'm implementing a Layer.

Is there any way to do that, or is the only option to have the layer expose a Callback that updates the parameter I want to decay? Any other ideas? Ideally I'd like something that wouldn't make the user of the layer aware of this inner detail (that's why I don't like the Callback approach: the user has to add it to the model).
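For reference, one self-contained workaround (not from the question itself, and the class name and decay schedule below are hypothetical) is to have the layer maintain its own non-trainable step counter and increment it in call() during training. This counts training batches rather than epochs, and it requires the training flag to be a Python boolean (as in eager execution), but it keeps the bookkeeping entirely inside the layer:

```python
import tensorflow as tf

class DecayingLayer(tf.keras.layers.Layer):
    """Hypothetical layer that decays a scaling factor using its own step counter."""

    def __init__(self, decay_rate=0.99, **kwargs):
        super().__init__(**kwargs)
        self.decay_rate = decay_rate

    def build(self, input_shape):
        # Non-trainable counter, incremented once per training-time call().
        self.step = self.add_weight(
            name="step", shape=(), dtype=tf.int64,
            initializer="zeros", trainable=False)

    def call(self, inputs, training=None):
        if training:
            self.step.assign_add(1)
        # decay_rate ** step, cast to the input dtype
        factor = tf.pow(
            tf.cast(self.decay_rate, inputs.dtype),
            tf.cast(self.step, inputs.dtype))
        return inputs * factor
```

At inference time (training=False) the counter is frozen, so the layer keeps applying the last decay factor it reached during training.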

Zach Moshe
  • Have you solved this issue? If yes, how? I asked the same question here: https://stackoverflow.com/q/61329343/3924118. – nbro Apr 20 '20 at 18:56
  • As far as I understand this - if you write a general Layer (which can be used by others), you shouldn't rely on the "step" as every optimizer has a different notion of it and your layer could end up being used by many of them. If you're the only one using your layer, I guess you could add a variable and increment it with a Keras callback or a similar trick. – Zach Moshe Apr 24 '20 at 19:11
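The variable-plus-callback trick Zach mentions in the comment could be sketched as follows (the class names and the decay formula are illustrative, not from the question). The layer stores the epoch in a non-trainable variable, and a callback pushes the current epoch number into it at the start of each epoch:

```python
import tensorflow as tf

class StepLayer(tf.keras.layers.Layer):
    """Hypothetical layer exposing an `epoch` variable that a callback updates."""

    def build(self, input_shape):
        self.epoch = self.add_weight(
            name="epoch", shape=(), dtype=tf.float32,
            initializer="zeros", trainable=False)

    def call(self, inputs):
        # Example: exponentially decay a scaling factor with the stored epoch.
        return inputs * tf.exp(-0.1 * self.epoch)

class EpochUpdater(tf.keras.callbacks.Callback):
    """Writes the current epoch number into every StepLayer in the model."""

    def on_epoch_begin(self, epoch, logs=None):
        for layer in self.model.layers:
            if isinstance(layer, StepLayer):
                layer.epoch.assign(float(epoch))
```

The user would pass EpochUpdater() in the callbacks list to model.fit(), which is exactly the extra step the question was hoping to avoid.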

0 Answers