
I wanted to add a custom loss to DeepLab v3 that works with soft labels for saliency prediction rather than with one-hot encoded labels. So instead of the DeepLab loss implementation that you see below:

# Standard DeepLab loss: threshold the saliency map into hard labels,
# one-hot encode them, and apply softmax cross-entropy.
label = tf.to_int32(label > 0.2)
one_hot_labels = slim.one_hot_encoding(label, num_classes, on_value=1.0, off_value=0.0)
tf.losses.softmax_cross_entropy(one_hot_labels, logits)

I used this implementation:

# Custom loss for soft (saliency) labels: cross-entropy between the label
# distribution and the predicted distribution, summed over the class axis.
# tf.nn.log_softmax is the numerically stable form of tf.log(tf.nn.softmax()).
log_softmax = tf.nn.log_softmax(logits)
cross_entropy = -tf.reduce_sum(label * log_softmax, axis=[1])
tf.losses.add_loss(tf.reduce_mean(cross_entropy))
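
For comparison, TensorFlow 1.x also ships a fused op that computes the same soft-label cross-entropy in a single numerically stable call. A minimal sketch, assuming logits and label are already flattened to [num_pixels, num_classes] (DeepLab reshapes them this way before computing the loss) and that label forms a valid per-pixel probability distribution, e.g. a two-class [saliency, 1 - saliency] pair for a single-channel saliency map:

import tensorflow as tf

# Sketch only, not DeepLab's own code: the v2 op accepts soft labels
# directly; stop_gradient prevents backprop into the labels, which the
# v2 op would otherwise allow.
cross_entropy = tf.nn.softmax_cross_entropy_with_logits_v2(
    labels=tf.stop_gradient(label), logits=logits)
tf.losses.add_loss(tf.reduce_mean(cross_entropy))

The fused op avoids the -inf values that a manual tf.log(tf.nn.softmax(...)) can produce when the softmax underflows, which is one common reason a hand-rolled cross-entropy trains poorly.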

I trained for ~1000 epochs on 5 images and got this result:

I also tried several learning rates, but that doesn't change the result with the custom loss.

Eva
  • When you say the result for the custom loss is unchanged, what do you mean: the image, or some log you are tracking? – prosti Dec 12 '18 at 02:16
  • The predicted labels are still messy even if I change the learning rate with the custom loss. The loss value that DeepLab outputs to the console decreases slowly (too slowly) up to a certain point, no matter which learning rate I choose (with the custom loss). I guess I don't see a difference because only the 3rd or 4th digit after the decimal point changes. – Eva Dec 13 '18 at 08:34

0 Answers