Questions tagged [lime]

LIME (local interpretable model-agnostic explanations) is an explainability method used to inspect machine learning models. Include a tag for Python, R, etc. depending on the LIME implementation.

LIME (local interpretable model-agnostic explanations) is an explainability method used to inspect machine learning models and debug their predictions. It was originally proposed in "Why Should I Trust You?": Explaining the Predictions of Any Classifier (Ribeiro et al., NAACL 2016) as a way to explain how models make predictions in natural language processing tasks. Since then, the approach has been implemented in several packages, and it inspired later techniques for "explainable machine learning," such as SHAP.
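At its core, LIME perturbs an instance, queries the black-box model on the perturbations, weights the samples by proximity to the original instance, and fits a weighted linear surrogate whose coefficients serve as the explanation. A minimal numpy-only sketch of that loop (not the `lime` package's actual implementation; the kernel and sampling scheme here are simplified assumptions):

```python
import numpy as np

def lime_tabular_sketch(predict_fn, instance, num_samples=5000,
                        kernel_width=0.75, rng=None):
    """Sketch of LIME's core idea for one tabular instance: perturb,
    weight by proximity, fit a weighted linear surrogate."""
    rng = np.random.default_rng(rng)
    d = instance.shape[0]
    # 1. Perturb: sample around the instance (Gaussian noise here).
    X = instance + rng.normal(scale=1.0, size=(num_samples, d))
    X[0] = instance  # keep the original point in the sample
    # 2. Query the black-box model for the class probability.
    y = predict_fn(X)
    # 3. Proximity weights via an exponential (RBF-style) kernel.
    dist = np.linalg.norm(X - instance, axis=1)
    w = np.exp(-(dist ** 2) / (kernel_width * np.sqrt(d)) ** 2)
    # 4. Weighted least squares: local linear coefficients.
    Xb = np.hstack([X, np.ones((num_samples, 1))])  # add intercept column
    sw = np.sqrt(w)[:, None]
    coef, *_ = np.linalg.lstsq(Xb * sw, y * sw.ravel(), rcond=None)
    return coef[:-1]  # per-feature local importance (intercept dropped)

# Toy black box: probability driven almost entirely by feature 0.
black_box = lambda X: 1 / (1 + np.exp(-(3 * X[:, 0] + 0.1 * X[:, 1])))
weights = lime_tabular_sketch(black_box, np.zeros(2), rng=0)
```

As expected for this toy model, the surrogate assigns feature 0 a much larger local weight than feature 1.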

Related Concepts

LIME Implementations

Implementations of this approach exist in several software packages.

Python

R

Further Reading

101 questions
6
votes
2 answers

Explaining CNN (Keras) outputs with LIME

I am trying to explain the outputs of my convolutional neural network built in Keras with LIME. My neural network is a multi-class text classifier where every class is independent; thus, a text can belong to classes 1 and 2, or only class 1, etc. A fifth "class"…
junkmaster
  • 141
  • 1
  • 11
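The multi-label setup described above hits a common mismatch: LIME's explainers expect each row of the classifier's output to be a probability distribution over classes, while independent sigmoid outputs do not sum to one. One workaround (a sketch under that assumption; `multi_label_predict` stands in for the Keras model's `predict`) is to explain one class at a time by recasting it as a two-class problem:

```python
import numpy as np

def per_class_predict(multi_label_predict, class_index):
    """Wrap an independent-sigmoid multi-label model so that a single
    class looks like a two-class problem: columns [P(not c), P(c)]."""
    def predict(inputs):
        probs = multi_label_predict(inputs)   # shape (n, num_classes)
        p = probs[:, class_index]
        return np.column_stack([1.0 - p, p])  # shape (n, 2), rows sum to 1
    return predict

# Toy stand-in for the model: three independent sigmoid outputs.
fake_model = lambda X: 1 / (1 + np.exp(-np.asarray(X)))
wrapped = per_class_predict(fake_model, class_index=1)
out = wrapped(np.zeros((4, 3)))
```

The wrapped function can then be passed as the classifier to a text explainer such as `lime.lime_text.LimeTextExplainer`, one class index per explanation.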
6
votes
1 answer

Lime in R: example is not working; Error in glmnet(x[shuffle_order, features]

With the Lime package in R, I want to explain my RF model. I saw an example on GitHub (scroll a bit down to the section "An Example"): https://github.com/thomasp85/lime I tried to run the exact same code, but with my own data (my data is added…
R overflow
  • 1,292
  • 2
  • 17
  • 37
4
votes
0 answers

Can the R version of lime explain xgboost models with count:poisson objective function?

I generated a model using xgb.train with the "count:poisson" objective function and I get the following error when trying to create the explainer: Error: Unsupported model type Lime works when I replace the objective by something else such as…
Zoltan
  • 760
  • 4
  • 15
4
votes
1 answer

Lime: Error in glmnet(x[, c(features, j), x should be a matrix with 2 or more columns

I am following this example to use lime on a supervised text model https://rdrr.io/github/thomasp85/lime/man/lime.html I have just changed the get_matrix function to create the dtm. This new function works on the data in the example in this link,…
Elly
  • 129
  • 2
  • 12
3
votes
0 answers

LIME Image classification interpretation for multi-input DNN

I am fairly new to Deep Learning, but I managed to build a multi-branch Image Classification architecture yielding quite satisfactory results. Not so important: I am working on KKBox customer churn…
3
votes
1 answer

Using LIME for predictions of a logit model in R?

So I am trying to use LIME to understand predictions from a logit model in R. I know I don't 'need' to, but I am trying to illustrate what it does with a model that one can simply understand as a starting point for a presentation. But I am having…
J Econ
  • 51
  • 1
2
votes
0 answers

Way to show not normalized data in LIME

Is there any way to show the true (unscaled) values in the lime package after rescaling the data? I used rescaling for the logistic regression algorithm, and now when I display the prediction, it shows scaled values. I could not use non-scaled…
2
votes
1 answer

Applying LIME interpretation on my fine-tuned BERT for sequence classification model?

I fine-tuned BERT For Sequence Classification on a specific task, and I want to apply LIME interpretation to see how each token contributes to the text being classified with a specific label, since LIME handles the classifier as a black box. I made a combined code from available…
Eliza William
  • 53
  • 2
  • 6
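For questions like the one above, the key contract is the shape of the prediction function: LIME's text explainer calls the classifier with a list of raw strings and expects an `(n_texts, n_classes)` probability array back. A hedged sketch of such an adapter (the `tokenize` and `model_logits` stand-ins below are dummies so the example runs without transformers installed):

```python
import numpy as np

def make_lime_predictor(tokenize, model_logits):
    """Adapt a tokenizer + logits model to the list-of-strings ->
    probability-matrix contract that LIME's text explainer assumes."""
    def predict(texts):
        batch = [tokenize(t) for t in texts]
        logits = model_logits(batch)                       # (n, n_classes)
        # Numerically stable softmax over the class axis.
        e = np.exp(logits - logits.max(axis=1, keepdims=True))
        return e / e.sum(axis=1, keepdims=True)
    return predict

# Dummy stand-ins for the real tokenizer and fine-tuned model:
tokenize = lambda t: len(t.split())
model_logits = lambda b: np.column_stack(
    [np.array(b, float), -np.array(b, float)])
predict = make_lime_predictor(tokenize, model_logits)
probs = predict(["short text", "a somewhat longer input text"])
```

With a real model, `tokenize` would run the huggingface tokenizer and `model_logits` a forward pass; the explainer only ever sees the probability matrix.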
2
votes
1 answer

KeyError: 1 in using SP_LIME with Lightgbm

I am using SP_LIME to explain results for churn prediction based on a Lightgbm model. Using LIME, explainer.explain_instance works OK. When I try SP_LIME on the same dataset, the first part sp_obj = submodular_pick.SubmodularPick(explainer,…
zdz
  • 307
  • 1
  • 2
  • 9
2
votes
1 answer

Using LIME for BERT transformer visualization results in memory error

Situation: I am currently working on visualizing the results of a huggingface transformers machine learning model I have been building using the LIME package following this tutorial. Complication: My code is set up and runs well until I create the…
Martin Reindl
  • 989
  • 2
  • 15
  • 33
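Memory errors like the one above often come from LIME handing the classifier thousands of perturbed texts in a single call; running a transformer on all of them at once can exhaust GPU or host memory. One common mitigation (a sketch; `predict_fn` stands in for the transformer pipeline) is to predict in fixed-size chunks inside the wrapper, which bounds peak memory at the cost of extra forward passes:

```python
import numpy as np

def chunked_predict(predict_fn, texts, batch_size=16):
    """Run the (expensive) predictor over fixed-size chunks of inputs
    and stack the per-chunk probability arrays back together."""
    out = [predict_fn(texts[i:i + batch_size])
           for i in range(0, len(texts), batch_size)]
    return np.vstack(out)

# Toy predictor standing in for the transformer model:
toy = lambda batch: np.tile([0.3, 0.7], (len(batch), 1))
probs = chunked_predict(toy, ["t"] * 50, batch_size=16)
```

In practice you would wrap this as `lambda texts: chunked_predict(real_predict, texts)` and pass that to the explainer.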
2
votes
2 answers

How to use a 1D-CNN model in Lime?

I have a numeric health record dataset. I used a 1D CNN Keras model for the classification step. I am giving a reproducible example in Python: import tensorflow as tf import keras from keras.models import Sequential from keras.layers import…
Noura
  • 474
  • 2
  • 11
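The usual stumbling block when pairing a 1D-CNN with LIME's tabular explainer is the input shape: the explainer perturbs flat 2-D rows, while the CNN expects `(n, timesteps, channels)`. A hedged sketch of a predictor wrapper that reshapes inside the call (`cnn_predict` stands in for `model.predict`):

```python
import numpy as np

def reshape_for_cnn(cnn_predict, timesteps, channels=1):
    """LIME's tabular explainer hands the model 2-D arrays of perturbed
    rows; reshape them to the 3-D layout the 1D-CNN expects so the
    explainer never needs to know about the extra axis."""
    def predict(flat_rows):
        X = np.asarray(flat_rows).reshape(-1, timesteps, channels)
        return cnn_predict(X)
    return predict

# Toy stand-in for model.predict of a two-class Keras 1D-CNN:
toy_cnn = lambda X: np.tile([0.4, 0.6], (X.shape[0], 1))
predict = reshape_for_cnn(toy_cnn, timesteps=8)
probs = predict(np.zeros((5, 8)))
```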
2
votes
0 answers

Is there a way to add methods to a function within an R package?

I am trying to use lime to add ML explanations in the output of a package I am developing. My solution uses a gradient boosting model from library gbm. This type of model is not supported by lime as is, so I would need to add a gbm method to…
paolo
  • 33
  • 5
2
votes
1 answer

Key Error when using lime tabular explainer with Keras

I am trying to list the feature importances of a Keras neural network regression model using Lime. I have tried a number of different variations of the code and keep getting some version of KeyError: 4, where the number varies. I have tried…
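One plausible cause (an assumption, since the excerpt is truncated): in regression mode LIME expects a 1-D array of predictions, but Keras `model.predict` returns shape `(n, 1)`, and the stray dimension can surface as a `KeyError` inside the explainer. A sketch of the fix is to flatten inside a wrapper and construct the tabular explainer with `mode="regression"` (`keras_predict` below is a stand-in for the real model):

```python
import numpy as np

def regression_predict(keras_predict):
    """Flatten Keras's (n, 1) regression output to the 1-D array that
    LIME's tabular explainer expects in regression mode."""
    return lambda X: np.asarray(keras_predict(X)).ravel()

toy = lambda X: np.full((X.shape[0], 1), 2.5)  # stand-in for model.predict
predict = regression_predict(toy)
y = predict(np.zeros((3, 4)))
```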
2
votes
4 answers

Feature names stored in `object` and `newdata` are different! when using LIME package to explain xgboost model in R

I'm trying to use LIME to explain a binary classification model that I've trained using XGboost. I run into an error when calling the explain() function from LIME, which implies that I have columns that aren't matching in my model (or explainer)…
Aidan Morrison
  • 118
  • 2
  • 6
2
votes
1 answer

LIME ImageExplanation - 'ImageExplanation' object has no attribute 'as_list'

I am trying to recover the weights used by LIME algorithm on the superpixels of an image. I am perfectly able to recover the map and the boundaries for the predictions, but not the weights. I have tried the command print(explanation.as_list()) but…
VGhidini
  • 31
  • 3