
Background

I have a multi-label classification problem with 5 labels (e.g. [1 0 1 1 0]). Therefore, I want my model to improve at metrics such as recall at a fixed precision, precision-recall AUC, or ROC AUC.

It doesn't make sense to use a loss function (e.g. binary_crossentropy) that is not directly related to the performance measure I want to optimize. Therefore, I want to use TensorFlow's global_objectives.recall_at_precision_loss() or similar as the loss function.

Not a metric

I'm not looking to implement a tf.metrics function. I already succeeded in that by following: https://stackoverflow.com/a/50566908/3399066

Problem

I think my issue can be divided into 2 problems:

  1. How to use global_objectives.recall_at_precision_loss() or similar?
  2. How to use it in a Keras model with TF backend?

Problem 1

There is a file called loss_layers_example.py on the Global Objectives GitHub page (same as above). However, since I don't have much experience with TF, I don't really understand how to use it. Also, googling for "TensorFlow recall_at_precision_loss example" or "TensorFlow Global Objectives example" doesn't turn up anything clearer.

How do I use global_objectives.recall_at_precision_loss() in a simple TF example?

Problem 2

Would something like (in Keras): model.compile(loss=??.recall_at_precision_loss, ...) be enough? My feeling is that it is more complex than that, due to the global variables used in loss_layers_example.py.

How to use loss functions similar to global_objectives.recall_at_precision_loss() in Keras?
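For context: model.compile accepts any callable with the signature (y_true, y_pred) as its loss argument, and extra parameters (like a target recall) are typically bound with a closure. A minimal sketch of that wrapper pattern, using a hypothetical NumPy stand-in loss rather than the actual global_objectives function:

```python
import numpy as np

def get_weighted_mse_loss(weight):
    # The closure captures the extra parameter; the inner function has the
    # (y_true, y_pred) signature that Keras expects for `loss=`.
    def weighted_mse(y_true, y_pred):
        return weight * np.mean((y_true - y_pred) ** 2)
    return weighted_mse

loss_fn = get_weighted_mse_loss(2.0)
print(loss_fn(np.array([1.0, 0.0]), np.array([0.5, 0.5])))  # 0.5
```

The same shape of wrapper works for any parameterized loss; only the body of the inner function changes.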

NumesSanguis
  • Hi! I am also trying to integrate the loss functions from Global Objectives into Keras. Did you find a solution to your problem? – Pierre Lison Apr 11 '19 at 08:39
  • I don't think I found the solution, but with TensorFlow 2.0 integrating Keras, it might be easier? – NumesSanguis Apr 16 '19 at 02:42
  • The original link is now broken because TF is not maintaining it any more. [New link pointing to archive location](https://github.com/tensorflow/models/tree/archive/research/global_objectives) – J Trana Sep 25 '20 at 02:32

2 Answers


Similar to Martino's answer, but this version infers the shape from the input (setting it to a fixed batch size did not work for me).

The outer function isn't strictly necessary, but it feels more natural to pass parameters when you configure the loss function, especially when your wrapper is defined in an external module.

import keras.backend as K
from global_objectives.loss_layers import precision_at_recall_loss

def get_precision_at_recall_loss(target_recall):
    def precision_at_recall_loss_wrapper(y_true, y_pred):
        # The global_objectives losses expect 2-D (num_examples, num_labels)
        # tensors; -1 lets the first dimension be inferred from the input.
        y_true = K.reshape(y_true, (-1, 1))
        y_pred = K.reshape(y_pred, (-1, 1))
        # The loss function returns a tuple; element [0] is the loss tensor.
        return precision_at_recall_loss(y_true, y_pred, target_recall)[0]
    return precision_at_recall_loss_wrapper

Then, when compiling the model:

TARGET_RECALL = 0.9
model.compile(optimizer='adam', loss=get_precision_at_recall_loss(TARGET_RECALL))
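Note that reshaping to (-1, 1) flattens a multi-label batch of shape (batch, num_labels) into (batch * num_labels, 1), so each label is treated as an independent binary example. A quick NumPy check of that shape behaviour (NumPy standing in for K.reshape, which follows the same rules):

```python
import numpy as np

# A batch of 4 examples with 5 labels each, as in the question.
y_true = np.zeros((4, 5))

# reshape(-1, 1) infers the first dimension, putting every label in its
# own row -- each label becomes an independent binary example.
flat = y_true.reshape(-1, 1)
print(flat.shape)  # (20, 1)
```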
mes
  • Just in case someone else runs into this: this does not work with TF 1.14.x/1.15.x if you import Keras via `tensorflow.keras`. If you `pip3 install keras==2.2.5` and import `keras` (not `tensorflow.keras`), it works! – stringCode Jul 18 '20 at 07:33

I managed to make it work by:

  • Explicitly reshaping tensors to BATCH_SIZE length (see code below)
  • Cutting the dataset size to a multiple of BATCH_SIZE

    def precision_recall_auc_loss(y_true, y_pred):
        y_true = keras.backend.reshape(y_true, (BATCH_SIZE, 1))
        y_pred = keras.backend.reshape(y_pred, (BATCH_SIZE, 1))
        # Monkey-patch util.get_num_labels so the loss treats the
        # flattened input as a single binary label per row
        util.get_num_labels = lambda labels: 1
        return loss_layers.precision_recall_auc_loss(y_true, y_pred)[0]
  • Thank you for your answer. Could you provide a bit more code, so that people can execute a minimal example? From where are you calling your function `precision_recall_auc_loss()`? From which library is `util`? Are you using Keras with TensorFlow 2.0? – NumesSanguis Jul 21 '19 at 15:08
  • I tried using this today with TF 2.x. Unfortunately, loss_layers.precision_recall_auc_loss fails due to TF 1.x usage patterns. Specifically, it's not clear to me how one should proceed regarding the use of model_variable after doing an automated upgrade using tf_upgrade_v2. However, the upgraded loss_layers.roc_auc_loss did work using this exact same pattern out of the box and provided improved results for my use case! – J Trana Jan 18 '20 at 06:45
  • Just in case someone else runs into this: this does not work with TF 1.14.x/1.15.x if you import Keras via `tensorflow.keras`. If you `pip3 install keras==2.2.5` and import `keras` (not `tensorflow.keras`), it works! (Haven't tried with TF 2.) – stringCode Jul 18 '20 at 07:33