
There is a cross-entropy loss in one of the answers here: nolearn for multi-label classification, namely:

# custom loss: multi-label cross-entropy
import numpy as np
import theano.tensor as T

def multilabel_objective(predictions, targets):
    epsilon = np.float32(1.0e-6)
    one = np.float32(1.0)
    # clip predictions away from 0 and 1 so the logs stay finite
    pred = T.clip(predictions, epsilon, one - epsilon)
    # binary cross-entropy for each label, summed over the label axis
    return -T.sum(targets * T.log(pred) + (one - targets) * T.log(one - pred), axis=1)

Why is this specifically multi-label? It looks a lot like the ordinary log-loss for binary (single-label) classification. I found this loss in the literature, in Nam et al., 2014: http://arxiv.org/pdf/1312.5419v3.pdf.
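To make the comparison concrete, here is a plain-NumPy transcription of the same expression (multilabel_xent_np is just a name I made up, not part of nolearn). For L labels it gives exactly the sum of L independent single-label log-losses:

import numpy as np

def multilabel_xent_np(predictions, targets, epsilon=1e-6):
    # same expression as the Theano version above, in plain NumPy:
    # per-label binary cross-entropy, summed over the label axis
    pred = np.clip(predictions, epsilon, 1.0 - epsilon)
    return -np.sum(targets * np.log(pred) + (1.0 - targets) * np.log(1.0 - pred), axis=1)

# one sample with three labels
y_true = np.array([[1.0, 0.0, 1.0]])
y_pred = np.array([[0.9, 0.2, 0.6]])

joint = multilabel_xent_np(y_pred, y_true)

# summing three separate single-label log-losses gives the same number
per_label = sum(
    multilabel_xent_np(y_pred[:, i:i+1], y_true[:, i:i+1]) for i in range(3)
)
print(joint, per_label)  # identical up to float rounding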

  • It considers all the labels, but computes the cross-entropy for each label individually and sums over them; see Eq. 6 in Nam et al., 2014, the paper you linked. – phoxis Feb 17 '23 at 16:06
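For reference, writing out what the code computes for one sample (which, if I read the comment right, is the form of Eq. 6 in Nam et al., 2014):

$$\ell(\mathbf{y}, \hat{\mathbf{y}}) = -\sum_{l=1}^{L}\left[\, y_l \log \hat{y}_l + (1 - y_l)\log(1 - \hat{y}_l) \,\right]$$

with $y_l \in \{0, 1\}$ the target for label $l$ and $\hat{y}_l$ the clipped prediction. Each summand is an ordinary binary log-loss, one per label, which is why the expression looks so familiar.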
