Using the function classification_report
from scikit-learn, you get important metrics such as accuracy, precision, recall, and F1 score (micro and macro) of a model. The function takes the vector of true labels and the vector of predicted labels as input arguments to calculate the metrics. Is there a function that calculates the same metrics as classification_report
when an already existing confusion matrix is used as the input argument instead, like:
[[3955 62 610]
[ 319 2982 117]
[ 584 52 1439]]
For certain reasons, the vectors with the true and predicted labels can no longer be recovered. Such a function is needed for an automated calculation of the metrics of several confusion matrices.
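As far as I know, scikit-learn has no built-in function that accepts a confusion matrix directly. One workaround is to reconstruct dummy label vectors from the matrix and pass them to classification_report; since the metrics depend only on the counts in the matrix, the result is identical to the report computed from the original labels. A minimal sketch (the helper name report_from_confusion_matrix is my own, not part of scikit-learn):

```python
import numpy as np
from sklearn.metrics import classification_report

def report_from_confusion_matrix(cm, **kwargs):
    """Rebuild dummy y_true/y_pred vectors from a confusion matrix
    (rows = true labels, columns = predicted labels, as in
    sklearn.metrics.confusion_matrix) and feed them to
    classification_report."""
    cm = np.asarray(cm)
    y_true, y_pred = [], []
    for true_label, row in enumerate(cm):
        for pred_label, count in enumerate(row):
            # Emit `count` dummy samples with this (true, predicted) pair.
            y_true.extend([true_label] * count)
            y_pred.extend([pred_label] * count)
    return classification_report(y_true, y_pred, **kwargs)

cm = [[3955, 62, 610],
      [319, 2982, 117],
      [584, 52, 1439]]
print(report_from_confusion_matrix(cm))
```

Any keyword arguments (e.g. output_dict=True or target_names=...) are forwarded to classification_report, so the helper can be applied in a loop over several confusion matrices.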