
I'm doing binary classification using Keras (with the TensorFlow backend) and I get about 76% precision and 70% recall. Now I want to experiment with the decision threshold. As far as I know, Keras uses a decision threshold of 0.5. Is there a way in Keras to use a custom threshold when computing precision and recall?

Thank you for your time!

– nabroyan
  • For completeness' sake: Keras now natively includes a threshold(s) parameter for metrics based on True/False Positives/Negatives. – JS Lavertu Nov 08 '21 at 16:13
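For reference, a minimal sketch of that built-in route, assuming TensorFlow 2.x, where tf.keras.metrics.Precision and tf.keras.metrics.Recall accept a thresholds argument (the tiny model here is only for illustration):

import tensorflow as tf

# Minimal binary classifier, just to have something to compile.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation='relu', input_shape=(20,)),
    tf.keras.layers.Dense(1, activation='sigmoid'),
])

# `thresholds` may be a single float or a list of floats.
model.compile(
    optimizer='adam',
    loss='binary_crossentropy',
    metrics=[
        tf.keras.metrics.Precision(thresholds=0.7, name='precision_at_0_7'),
        tf.keras.metrics.Recall(thresholds=0.7, name='recall_at_0_7'),
    ],
)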

1 Answer


Create custom metrics like this:

Edited thanks to @Marcin: create functions that return the desired metric, taking the threshold value as an argument.

from keras import backend as K

def precision_threshold(threshold=0.5):
    def precision(y_true, y_pred):
        """Precision metric.
        Computes the precision over the whole batch using threshold_value.
        """
        threshold_value = threshold
        # Adaptation of the round() used before to get the predictions;
        # clip to make sure the predicted raw values are between 0 and 1.
        y_pred = K.cast(K.greater(K.clip(y_pred, 0, 1), threshold_value), K.floatx())
        # Count the true positives. Rounding as a precaution so the sum is an integer.
        true_positives = K.round(K.sum(K.clip(y_true * y_pred, 0, 1)))
        # Count the predicted positives.
        predicted_positives = K.sum(y_pred)
        # Precision = TP / predicted positives; epsilon avoids division by zero.
        precision_ratio = true_positives / (predicted_positives + K.epsilon())
        return precision_ratio
    return precision

def recall_threshold(threshold=0.5):
    def recall(y_true, y_pred):
        """Recall metric.
        Computes the recall over the whole batch using threshold_value.
        """
        threshold_value = threshold
        # Same thresholding as above: clip, then compare against the threshold.
        y_pred = K.cast(K.greater(K.clip(y_pred, 0, 1), threshold_value), K.floatx())
        # Count the true positives. Rounding as a precaution so the sum is an integer.
        true_positives = K.round(K.sum(K.clip(y_true * y_pred, 0, 1)))
        # Count the positive targets.
        possible_positives = K.sum(K.clip(y_true, 0, 1))
        # Recall = TP / actual positives; epsilon avoids division by zero.
        recall_ratio = true_positives / (possible_positives + K.epsilon())
        return recall_ratio
    return recall

Now you can use them in model.compile:

model.compile(..., metrics=[precision_threshold(0.1), precision_threshold(0.2), precision_threshold(0.8), recall_threshold(0.2), ...])
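To sanity-check the functions outside of training, you can evaluate them directly with the same backend API; a quick sketch with toy numbers:

import numpy as np
from keras import backend as K

# Threshold 0.5 turns y_pred into [1, 1, 0, 1]: 2 true positives,
# 3 predicted positives, 3 actual positives.
y_true = K.variable(np.array([1.0, 0.0, 1.0, 1.0]))
y_pred = K.variable(np.array([0.9, 0.6, 0.4, 0.7]))

print(K.eval(precision_threshold(0.5)(y_true, y_pred)))  # 2/3 ~ 0.667
print(K.eval(recall_threshold(0.5)(y_true, y_pred)))     # 2/3 ~ 0.667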

I hope this helps :)

– Nassim Ben
  • @NassimBen nice solution. I would like to do something quite similar, but dynamically calculate the `threshold_value` based on the `kth` largest value in `y_pred`. I've asked the question here: https://stackoverflow.com/questions/45720458/keras-custom-recall-metric-based-on-predicted-values – notconfusing Aug 16 '17 at 18:18
  • If I give it different threshold values and save the model, at which threshold's precision or recall value will the model be saved? (see the checkpoint sketch after these comments) – Mohsin Nov 08 '17 at 06:21
  • Here is [another method](https://stackoverflow.com/questions/52041931/is-there-an-optimizer-in-keras-based-on-precision-or-recall-instead-of-loss). I don't know why these two pieces of code produce different results for the same threshold, and both differ from the value I compute from the predict result (while keras_metrics.precision() returns the correct answer for a 0.5 threshold). – yang Mar 20 '19 at 04:14
  • @Mohsin I believe the model won't save the precision or recall value at all. They are for evaluation only. After training, you may save the weights; those weights aim at reducing the loss value. – yang Mar 20 '19 at 04:18
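On the checkpointing question above: the metric value itself isn't stored inside the model, but you can monitor one of these metrics with a ModelCheckpoint callback. A minimal sketch, assuming Keras logs the metric under the inner function's name (precision here; a second precision_threshold metric would get a suffixed name like precision_1), reusing the compiled model from the answer and placeholder data names (x_train, y_train, x_val, y_val):

from keras.callbacks import ModelCheckpoint

# Keep only the weights that maximize validation precision at the chosen threshold.
checkpoint = ModelCheckpoint(
    'best_model.h5',
    monitor='val_precision',   # 'val_' + the inner function's name
    mode='max',
    save_best_only=True,
)

model.fit(x_train, y_train,
          validation_data=(x_val, y_val),
          epochs=10,
          callbacks=[checkpoint])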