
I built a multiclass classification model (5 classes in the target) in Python, and its confusion matrix looks like this:

confusion_matrix(y_test, model.predict(X_test))

[[2006  114   80  312  257]
 [ 567  197   87  102  155]
 [ 256   84  316   39  380]
 [ 565   30   67  592  546]
 [ 363   71  186  301 1402]]

How can I calculate based on confusion matrix above, the following values:

  1. True Negative
  2. False Positive
  3. False Negative
  4. True Positive
  5. Accuracy
  6. True Positive Rate
  7. False Positive Rate
  8. True Negative Rate
  9. False Negative Rate

I have the following function that calculates these values for a binary target, but how can I modify it to work with my 5-class target?

from sklearn.metrics import confusion_matrix

def xx(model, X_test, y_test):
    y_pred = model.predict(X_test)  # predict once and reuse
    CM = confusion_matrix(y_test, y_pred)
    print(CM)

    print("-" * 40)

    TN = CM[0][0]
    FP = CM[0][1]
    FN = CM[1][0]
    TP = CM[1][1]
    sensitivity = TP / float(TP + FN)
    specificity = TN / float(TN + FP)

    print("True Negative:", TN)
    print("False Positive:", FP)
    print("False Negative:", FN)
    print("True Positive:", TP)
    print("Accuracy:", round((TN + TP) / len(y_pred) * 100, 2), "%")
    print("True Positive rate:", round(TP / (TP + FN) * 100, 2), "%")
    print("False Positive rate:", round(FP / (FP + TN) * 100, 2), "%")
    print("True Negative rate:", round(TN / (FP + TN) * 100, 2), "%")
    print("False Negative rate:", round(FN / (FN + TP) * 100, 2), "%")
– dingaro
1 Answer


You have to compute N binary confusion matrices (where N is the number of classes), one per class (one class vs. the others):

import numpy as np

def confusion_matrix_for(cls, cm):
    TP = cm[cls, cls]
    FN = cm[cls].sum() - TP       # rest of the row
    FP = cm[:, cls].sum() - TP    # rest of the column
    TN = cm.sum() - TP - FN - FP  # everything else
    return np.array([[TP, FN], [FP, TN]])

Usage:

# Confusion matrix for class 0
>>> confusion_matrix_for(0, CM)
array([[2006,  763],    # TP | FN
       [1751, 4555]])   # FP | TN

# Flatten confusion matrix for class 0
>>> confusion_matrix_for(0, CM).ravel()
array([2006,  763, 1751, 4555])  # TP, FN, FP, TN

Use a loop and your code to compute your metrics for each class.
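As a side note, scikit-learn also ships a helper, `sklearn.metrics.multilabel_confusion_matrix`, that computes the same one-vs-rest 2x2 matrices in a single call. Mind the layout: it returns `[[TN, FP], [FN, TP]]`, i.e. the reverse of the `confusion_matrix_for` function above. A small sketch with toy labels:

```python
from sklearn.metrics import multilabel_confusion_matrix

# Toy labels, just for illustration
y_true = [0, 1, 2, 2, 0, 1]
y_pred = [0, 2, 2, 2, 0, 1]

# One 2x2 matrix per class; layout is [[TN, FP], [FN, TP]],
# the reverse order of confusion_matrix_for() above
mcm = multilabel_confusion_matrix(y_true, y_pred)
tn, fp, fn, tp = mcm[2].ravel()
print(f"class 2 -> TP: {tp}, FN: {fn}, FP: {fp}, TN: {tn}")
```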

Update

CM = confusion_matrix(y_test, model.predict(X_test))

for cls in range(CM.shape[0]):
    print(f'[Class {cls} vs others]')
    TP, FN, FP, TN = confusion_matrix_for(cls, CM).ravel()
    print(f'TP: {TP}, FN: {FN}, FP: {FP}, TN: {TN}')
    # compute your metrics (your code in the question)
    print()

# Output
[Class 0 vs others]
TP: 2006, FN: 763, FP: 1751, TN: 4555

[Class 1 vs others]
TP: 197, FN: 911, FP: 299, TN: 7668

[Class 2 vs others]
TP: 316, FN: 759, FP: 420, TN: 7580

[Class 3 vs others]
TP: 592, FN: 1208, FP: 754, TN: 6521

[Class 4 vs others]
TP: 1402, FN: 921, FP: 1338, TN: 5414
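Once TP, FN, FP and TN are known for a class, the remaining values from the question follow directly from the standard formulas. A sketch for class 0, plugging in the counts from the output above (the "accuracy" here is the per-class, one-vs-others accuracy):

```python
# Counts for class 0, taken from the output above
TP, FN, FP, TN = 2006, 763, 1751, 4555

accuracy = (TP + TN) / (TP + TN + FP + FN)  # one-vs-others accuracy
tpr = TP / (TP + FN)                        # True Positive Rate (sensitivity/recall)
fpr = FP / (FP + TN)                        # False Positive Rate
tnr = TN / (TN + FP)                        # True Negative Rate (specificity)
fnr = FN / (FN + TP)                        # False Negative Rate

print(f"Accuracy: {accuracy:.2%}, TPR: {tpr:.2%}, FPR: {fpr:.2%}, "
      f"TNR: {tnr:.2%}, FNR: {fnr:.2%}")
```

Note that `tpr + fnr == 1` and `fpr + tnr == 1`, which is a quick sanity check for the loop.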
– Corralien
  • Corralien, in your function def confusion_matrix_for(cls, cm): is "cls" my model? And what is "cm" — which value has to go in that parameter of your function? Could you tell me, please? – dingaro Feb 17 '23 at 07:39
  • Sorry for the confusion :-) The arguments are "cls": the class number (0 to 4 here, you have 5 classes). "cm" is your "CM" confusion matrix. – Corralien Feb 17 '23 at 07:41
  • 1
    For one class (0 for the example), the function returns the binary confusion matrix from your full confusion matrix. Is it clear? – Corralien Feb 17 '23 at 07:42
  • It looks clear, I will test it, give me a moment 10-15 minutes, and I will be back to inform you and give best answer :) – dingaro Feb 17 '23 at 07:51
  • Everything is right, but one question: my 5-class target is not numerical, it is a categorical target like "A", "B", "C", "D", "E", but in your output I have 0, 1, 2, 3, 4. How can I get the nominal values from my target in your output, Corralien — could you tell me only this one thing, please? :) – dingaro Feb 17 '23 at 08:01
  • 1
    How your labels are stored? (CM is a numpy array so the indexing is numeric) – Corralien Feb 17 '23 at 08:11
  • Corralien, ok it can stay as numeric in output, but how can I recognize which class (A/B/C/D/E) is 0/1/2/3/4 ? – dingaro Feb 17 '23 at 08:12
  • 1
    Let us [continue this discussion in chat](https://chat.stackoverflow.com/rooms/251944/discussion-between-corralien-and-dingaro). – Corralien Feb 17 '23 at 08:14
  • ok, we can :))) – dingaro Feb 17 '23 at 08:17
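For the label-mapping question at the end of the thread: `confusion_matrix` orders its rows and columns by sorted label, so index i of CM corresponds to the i-th label in sorted order. A hedged sketch with hypothetical "A"/"B"/"C" labels standing in for the asker's target:

```python
import numpy as np

# Hypothetical categorical target, standing in for the asker's "A".."E" classes
y_test = np.array(["B", "A", "C", "A", "B"])

# confusion_matrix() sorts the labels, so row/column i of CM corresponds to
# the i-th sorted label (here via np.unique on y_test; strictly, sklearn sorts
# the union of labels seen in y_true and y_pred)
labels = np.unique(y_test).tolist()
print(dict(enumerate(labels)))  # {0: 'A', 1: 'B', 2: 'C'}

# The per-class loop can then print the original label instead of the index:
for cls, label in enumerate(labels):
    print(f"[Class {label} vs others]")
```

Passing `labels=["A", "B", "C", "D", "E"]` explicitly to `confusion_matrix` makes the ordering unambiguous.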