Previously, in another post (Keras multioutput custom loss with intermediate layers output), I discussed the problem I was having. In the end, that problem was fixed like this:
import tensorflow as tf
from tensorflow.keras.layers import Input
from tensorflow.keras.models import Model

def MyLoss(true1, true2, out1, out2, out3):
    loss1 = tf.keras.losses.someloss1(out1, true1)
    loss2 = tf.keras.losses.someloss2(out2, true2)
    loss3 = tf.keras.losses.someloss3(out2, out3)
    loss = loss1 + loss2 + loss3
    return loss

input1 = Input(shape=input1_shape)
input2 = Input(shape=input2_shape)

# do not take into account the notation, only the idea
output1 = Submodel1()([input1, input2])
output2 = Submodel2()(output1)
output3 = Submodel3()(output1)

# the ground truths are passed in as extra inputs so that add_loss can see them
true1 = Input(shape=true1_shape)
true2 = Input(shape=true2_shape)

model = Model([input1, input2, true1, true2], [output1, output2, output3])
model.add_loss(MyLoss(true1, true2, output1, output2, output3))
model.compile(optimizer='adam', loss=None)  # loss=None: the only loss is the one attached via add_loss
# here input1, input2, true1, true2 stand for the corresponding data arrays, not the Input layers
model.fit(x=[input1, input2, true1, true2], y=None, epochs=n_epochs)
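Just to make the training call concrete (the shapes here are made up for illustration), the four entries passed to fit are the actual data arrays for the two inputs and the two ground truths:
import numpy as np

# made-up example data, only to illustrate how the model is fed
x1 = np.random.rand(32, 128, 128, 1).astype("float32")
x2 = np.random.rand(32, 128, 128, 1).astype("float32")
gt1 = np.random.rand(32, 128, 128, 1).astype("float32")
gt2 = np.random.rand(32, 128, 128, 1).astype("float32")

# the ground truths go in through the extra Input layers; y=None because the loss
# was already attached with add_loss
model.fit(x=[x1, x2, gt1, gt2], y=None, epochs=10)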
In that problem, all the losses I used were Keras losses (i.e. tf.keras.losses.someloss), but now I want to add a couple more losses, combining custom losses with Keras losses. That is, now I have this scheme:
To add these two new losses, which are SSIM losses, I have tried this:
def SSIMLoss(y_true, y_pred):
    return 1 - tf.reduce_mean(tf.image.ssim(y_true, y_pred, 1.0))

def MyLoss(true1, true2, out1, out2, out3):
    loss1 = tf.keras.losses.someloss1(out1, true1)
    customloss1 = SSIMLoss(out1, true1)
    loss2 = tf.keras.losses.someloss2(out2, true2)
    loss3 = tf.keras.losses.someloss3(out2, out3)
    customloss2 = SSIMLoss(out2, out3)
    loss = loss1 + loss2 + loss3 + customloss1 + customloss2
    return loss
But I get this error:
OperatorNotAllowedInGraphError: using a `tf.Tensor` as a Python `bool` is not allowed in Graph execution. Use Eager execution or decorate this function with @tf.function.
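As far as I can tell, the SSIM function itself is not the problem when it is called on plain eager tensors; a quick standalone check with made-up data runs fine:
import numpy as np
import tensorflow as tf

# random "images" with values in [0, 1], matching the max_val=1.0 used in SSIMLoss
a = tf.constant(np.random.rand(4, 128, 128, 1), dtype=tf.float32)
b = tf.constant(np.random.rand(4, 128, 128, 1), dtype=tf.float32)
print(SSIMLoss(a, b))  # prints a scalar tensor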
I have tried decorating the function with @tf.function, but then I get this error:
_SymbolicException: Inputs to eager execution function cannot be Keras symbolic tensors, but found [<tf.Tensor 'input_43:0' shape=(None, 128, 128, 1) dtype=float32>, <tf.Tensor 'conv2d_109/Sigmoid:0' shape=(None, 128, 128, 1) dtype=float32>]
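To be clear about what I mean by decorating: I just put the decorator on top of the combined loss function (same placeholder losses as above), like this:
@tf.function
def MyLoss(true1, true2, out1, out2, out3):
    loss1 = tf.keras.losses.someloss1(out1, true1)
    customloss1 = SSIMLoss(out1, true1)
    loss2 = tf.keras.losses.someloss2(out2, true2)
    loss3 = tf.keras.losses.someloss3(out2, out3)
    customloss2 = SSIMLoss(out2, out3)
    return loss1 + loss2 + loss3 + customloss1 + customloss2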
I have found this GitHub issue (https://github.com/tensorflow/tensorflow/issues/32127) about combining Keras losses with add_loss, so maybe that is the problem, but I don't know how to fix it.