
I'm working on interpreting my XGBClassifier, which predicts the three classes "low", "medium", and "high". I saw that there are some approaches and Python libraries (such as SHAP) for interpreting machine learning models. However, during my research I only found SHAP applications to regression, or to classification problems that output probabilities (whereas my model outputs 3 classes). Can I use SHAP on my problem?

My current approach is:

import shap
explainer = shap.TreeExplainer(model)  # model is my fitted XGBClassifier
shap_values = explainer.shap_values(X)
shap.summary_plot(shap_values[classindex], X.values, feature_names=X.columns, show=False)

classindex selects one of the model's 3 classes, and I set it to 0, 1, and 2 in turn to draw the summary plot for each class.

  • I don't know about SHAP, but if its outputs are probabilities then you can just classify them as high-medium-low with ranges – Fravadona Nov 21 '21 at 14:51
  • You may find interesting [this](https://stackoverflow.com/questions/65029216/how-to-interpret-base-value-of-multi-class-classification-problem-when-using-sha/65034362#65034362) – Sergey Bushmanov Nov 21 '21 at 20:13
  • You may also find [this](https://towardsdatascience.com/explainable-ai-xai-with-shap-multi-class-classification-problem-64dd30f97cea) interesting. It is a write up of how to use shap with multi-class. [Here is a link](https://github.com/Iditc/Posts-on-Medium/blob/main/Explainable%20AI/Explainable%20AI%20(XAI)%20with%20SHAP_MultiClass%20Classification%20Problem.ipynb) to the authors code. – pwb2103 Aug 14 '22 at 03:30

0 Answers