
I have a custom loss class and a callback to update its weights, which I got from here and here. The second link doesn't quite match my scenario, because I need to access the loss history and accuracy in order to update the weights, so I think the callback from the first link is the best way to do it.

Here is the code I have:

class AdaptiveLossCallback(tf.keras.callbacks.Callback):
    def __init__(self):
        super(AdaptiveLossCallback, self).__init__()
        self.weight1 = tf.Variable(1.0, trainable=False, name='weight1', dtype=tf.float32)
        self.weight2 = tf.Variable(0.0, trainable=False, name='weight2', dtype=tf.float32)

    def on_epoch_end(self, epoch, logs=None):
        if epoch == 49:
            self.weight1 = tf.assign(self.weight1 , tf.constant(0.5))
            self.weight2 = tf.assign(self.weight2 , tf.constant(0.5))
        elif epoch == 74:
            self.weight1 = tf.assign(self.weight1 , tf.constant(0.0))
            self.weight2 = tf.assign(self.weight2 , tf.constant(1.0))


class CustomLoss(tf.keras.losses.Loss):
    def __init__(self,
                 adaptive_loss=None,
                 from_logits=False,
                 reduction=losses_utils.ReductionV2.AUTO,
                 name=None):
        super(CustomLoss, self).__init__(reduction=reduction, name=name)
        self.from_logits = from_logits
        self.adaptive_loss = adaptive_loss

    def call(self, y_true, y_pred):
        ...
        weight1 = self.adaptive_loss.weight1
        weight2 = self.adaptive_loss.weight2
        return weight1 * loss1 + weight2 * loss2

But I can't seem to make it work. When I run this, it says:

Attempting to use uninitialized value weight1

Then I tried this:

session = tf.keras.backend.get_session()
session.run(tf.global_variables_initializer())
model.fit(...)

That seems to work, but the weight values are not updating at all.

What am I doing wrong, and how can I fix it? Is there a better way to add a changeable variable to a Keras model?

Thanks

PS. I can't use the Keras model's `loss_weights` because I have only one output.

SaintTail
  • Do the weights need to be tensors? Can't they just be floats? – rvinas Jul 27 '19 at 17:15
  • Because I need to update the weight during training like on some epoch. I try float before with eager execution but the `def call(self, y_true, y_pred)` is only called once during model compile so I need to send it as a tensor. – SaintTail Jul 27 '19 at 17:25
  • 1
    Alright, thanks, now I get it. Could you try using `K.set_value` instead of `tf.assign`? I believe that with `tf.assign` the tensor references in the loss function are not updated, since the weights are not modified in place. – rvinas Jul 27 '19 at 18:37
  • 1
    @rvinas Thanks that is working!. So I need to call session.run() that assign op. – SaintTail Jul 28 '19 at 02:57
  • I'm glad it works. I'll wrap it up in an answer – rvinas Jul 28 '19 at 07:40

1 Answer


The problem is that the weight references in the loss function are not being updated with just tf.assign. To appropriately update the loss coefficients, you could do the following:

a) K.set_value(self.weightX, update_value)

or

b) sess.run(self.weightX.assign(update_tensor))
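As a minimal sketch (assuming the same 50/75-epoch schedule as in the question), the callback could be rewritten with `K.set_value` like this, so the variables are modified in place and the tensors captured by the loss function see the new values:

```python
import tensorflow as tf
from tensorflow.keras import backend as K

class AdaptiveLossCallback(tf.keras.callbacks.Callback):
    def __init__(self):
        super(AdaptiveLossCallback, self).__init__()
        # K.variable creates backend variables that the loss can reference.
        self.weight1 = K.variable(1.0, name='weight1')
        self.weight2 = K.variable(0.0, name='weight2')

    def on_epoch_end(self, epoch, logs=None):
        # K.set_value updates the variable in place (it runs the assign op
        # for you in graph mode), so no rebinding or manual session.run
        # is needed.
        if epoch == 49:
            K.set_value(self.weight1, 0.5)
            K.set_value(self.weight2, 0.5)
        elif epoch == 74:
            K.set_value(self.weight1, 0.0)
            K.set_value(self.weight2, 1.0)
```

The same instance would then be shared with the loss and the fit call, e.g. `adaptive = AdaptiveLossCallback()`, `model.compile(loss=CustomLoss(adaptive_loss=adaptive))`, `model.fit(..., callbacks=[adaptive])`.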

rvinas