
I have a three-output network with three custom loss functions defined. During training, Keras returns the three loss values I expect, but also an additional value that I suspect is a combined loss. How is it defined, or what does it represent? I didn't find anything in the documentation; clarification is appreciated.

Also, if it really is a combined loss, does it just serve as an indicator, or does it affect the gradients in any way?

Implementation example:

losses = [my_loss(config1), my_loss(config2), my_loss(config3)]

model.compile(optimizer=optimizer, loss=losses, run_eagerly=False)
model.fit(...)  # training returns 4 loss values - 'loss', 'my_loss1', 'my_loss2' and 'my_loss3'
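For context, here is a minimal, self-contained sketch of the kind of setup meant above (the toy model, data, and the `my_loss` factory are placeholders, not the actual code):

import numpy as np
import tensorflow as tf
from tensorflow import keras

def my_loss(scale):
    # hypothetical factory standing in for the real configurable loss
    def loss_fn(y_true, y_pred):
        return scale * tf.reduce_mean(tf.square(y_true - y_pred), axis=-1)
    return loss_fn

inputs = keras.Input(shape=(8,))
hidden = keras.layers.Dense(16, activation="relu")(inputs)
outputs = [keras.layers.Dense(1, name=f"out{i}")(hidden) for i in (1, 2, 3)]
model = keras.Model(inputs, outputs)

losses = [my_loss(1.0), my_loss(2.0), my_loss(3.0)]
model.compile(optimizer="adam", loss=losses, run_eagerly=False)

x = np.random.rand(64, 8)
y = [np.random.rand(64, 1) for _ in range(3)]
history = model.fit(x, y, epochs=1, verbose=0)

# Four entries show up: the combined 'loss' plus one loss per output
# (named after the outputs here, e.g. 'out1_loss', 'out2_loss', 'out3_loss').
print(history.history.keys())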

EDIT:

Example loss curves during training are shown below. It's clear that the sum of my individual losses is not the combined loss, and I do not pass any weights to the compile method.

[Image: losses at epochs]

Dominik Ficek
  • Is this what you are looking for: https://stackoverflow.com/a/65475046/10375049 – Marco Cerliani Feb 03 '21 at 23:53
  • Yes, thanks for the answer, this looks like what I'm looking for, but my losses don't seem to add up and I don't use any weights. E.g. my losses are 3, 5 and 7 and loss is 25. Is this something in my code? – Dominik Ficek Feb 03 '21 at 23:59
  • The `loss` is the weighted sum of the individual losses computed for the model's outputs. If no `loss_weights` are provided, the `loss` is simply the sum of `my_loss_1`, `my_loss_2` and `my_loss_3`. This [answer](https://stackoverflow.com/questions/49404309/how-does-keras-handle-multiple-losses) might be helpful. – Shubham Panchal Feb 04 '21 at 01:49
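Continuing the sketch above, a hedged illustration of the weighted-sum behaviour the last comment describes (the `loss_weights` values are made-up assumptions, not taken from the question):

# Illustrative only: when loss_weights are passed to compile(), the reported
# 'loss' is the weighted sum of the per-output losses; without them it is
# their plain sum (plus any extra terms the model adds via add_loss(),
# such as layer regularizers).
model.compile(
    optimizer="adam",
    loss=losses,
    loss_weights=[1.0, 0.5, 0.25],  # hypothetical weights
)
# reported loss ≈ 1.0 * out1_loss + 0.5 * out2_loss + 0.25 * out3_loss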
