
Some layers have regularisation losses. I would like to have a "metric" that keeps track of only the regularisation losses, so that I can see in the progress bar how the regularisation losses evolve during training, separately from the total loss. Furthermore, given that metrics are saved in the history object (returned by fit), I could easily plot the regularisation losses and save them to a file at the end of training.

How can I write a custom metric only for the regularisation losses?

I know I can have a callback that iterates over the layers of a model and sums the regularisation losses (see e.g. https://stackoverflow.com/a/48489090/3924118), but I would like to have a metric (rather than a callback) because the metric will be saved in the history object.

Ultimately, I would like the regularisation losses to be present in the history object and displayed in the progress bar (separately from the total loss), even without implementing a custom metric, but I don't know if there's such an option in tf.keras. Also, it would be nice if I could have two options: 1. see the sum of all regularisation losses across all layers and 2. see the regularisation losses for each layer separately.

nbro
  • How do you add regularization to your model? Using the `kernel_regularizer` argument of layers, or using the `add_loss` method in a custom layer/model? – today Apr 25 '20 at 19:17
  • @today In my specific case, I am using `DenseFlipout` from TFP. They add the losses to the field `losses` with `add_loss`, but I don't think that makes any difference. – nbro Apr 25 '20 at 19:26
  • I see. And is this your only reason for not using a callback: "because the metric will be saved in the history object"? In other words, if it's stored in the history object, you are fine with it, whether it's a metric or a callback, right? – today Apr 25 '20 at 19:50
  • @today Well, yes, it would be nice to have access to the regularisation losses from the history object. Right now, I am trying to do this: https://stackoverflow.com/a/48489090/3924118, but if I could add the regularisation losses to the history, my code would be more unified (because I am already plotting the evolution of the other losses at the end of training by accessing the history object). – nbro Apr 25 '20 at 19:54

2 Answers


I think I found a solution. You can modify the `logs` dictionary that Keras passes to a callback to add the information that you want. Your history object will then automatically contain this new information.

import tensorflow as tf

def get_cb(model):
    def on_epoch_end(epoch, logs):
        # model.losses collects the regularisation losses of all layers
        # (including those added via add_loss), so summing them gives
        # the total regularisation loss.
        logs["reg_losses"] = tf.add_n(model.losses).numpy()

    return tf.keras.callbacks.LambdaCallback(on_epoch_end=on_epoch_end)

...
history = model.fit(..., callbacks=[get_cb(model)])
do_something(history) # history contains "reg_losses"
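
If you also want the second option mentioned in the question (the regularisation losses of each layer separately), the same idea extends naturally. Here is a minimal sketch, assuming each regularised layer exposes its losses via `layer.losses`; the `reg_loss_<layer name>` keys are my own naming, not a Keras convention:

import tensorflow as tf

def get_per_layer_cb(model):
    def on_epoch_end(epoch, logs):
        # Record each layer's regularisation losses under a separate key,
        # so the history object contains one entry per regularised layer.
        for layer in model.layers:
            if layer.losses:
                logs[f"reg_loss_{layer.name}"] = tf.add_n(layer.losses).numpy()

    return tf.keras.callbacks.LambdaCallback(on_epoch_end=on_epoch_end)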

(Feel free to provide alternative solutions that make use of custom metrics!)

nbro

To get it in the progress bar you can do something like this:

import tensorflow as tf

def get_reg_loss(reg_layers):
    # y_true and y_pred are required by the Keras metric signature,
    # but are not used here.
    def reg_loss_term(y_true, y_pred):
        # Sum the (first) regularization loss of each regularized layer.
        return tf.add_n([r.losses[0] for r in reg_layers])

    return reg_loss_term

model.compile(..., metrics=[..., get_reg_loss(reg_layers)])

This adds the sum of all regularization loss terms to the model as a metric. The downside of this approach is that if you want the progress bar to show the loss term of each individual layer, you have to define a uniquely named function for each regularized layer (a sketch follows below).
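
A hedged sketch of that per-layer variant: the helper `make_layer_metric` and the `reg_loss_<layer name>` names are illustrative choices, not part of the Keras API, and it assumes each regularized layer exposes its losses via `layer.losses`. Keras uses a metric function's `__name__` as its key in the progress bar and in the history object, so each closure is given a unique name:

import tensorflow as tf

def make_layer_metric(layer):
    def metric(y_true, y_pred):  # required metric signature; arguments unused
        # Sum this layer's regularization losses.
        return tf.add_n(layer.losses)

    # A unique __name__ per layer, so each metric gets its own key
    # in the progress bar and in the history object.
    metric.__name__ = f"reg_loss_{layer.name}"
    return metric

model.compile(..., metrics=[make_layer_metric(l) for l in reg_layers])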

Bas Krahmer