I trained a model with RBF-kernel support vector machine regression, and I want to know which features contribute most to it. I know there is a way to find the most contributing features for linear support vector regression: the learned weight vector, where the magnitude of each weight reflects the importance of the corresponding feature. However, with the RBF kernel the features are transformed into a new space, so I have no clue how to extract the most contributing features. I am using scikit-learn in Python. Is there a way to extract the most contributing features for RBF-kernel support vector regression, or for non-linear support vector regression in general?
from sklearn import svm
clf = svm.SVC(gamma=0.001, C=100., kernel='linear')  # renamed from svm to clf to avoid shadowing the module
In this case, the approach from Determining the most contributing features for SVM classifier in sklearn works very well, because a linear-kernel SVM exposes its learned weight vector through the coef_ attribute.
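For reference, a minimal sketch of that working linear-kernel approach; the iris data and its feature names are just placeholders standing in for my real dataset:

import numpy as np
from sklearn import svm
from sklearn.datasets import load_iris

# Placeholder data; in practice X, y and the feature names come from my own dataset.
data = load_iris()
X, y = data.data, data.target

clf = svm.SVC(gamma=0.001, C=100., kernel='linear')
clf.fit(X, y)

# With a linear kernel, coef_ holds one weight vector per class pair;
# the magnitude of each weight is that feature's contribution.
importance = np.abs(clf.coef_).sum(axis=0)
for idx in np.argsort(importance)[::-1]:
    print(data.feature_names[idx], importance[idx])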
However, if the kernel is changed to rbf:

from sklearn import svm
clf = svm.SVC(gamma=0.001, C=100., kernel='rbf')
then that approach no longer works: coef_ is only defined for linear kernels, and accessing it on an RBF model raises an AttributeError.
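A minimal reproduction of the failure, on the same placeholder data as above:

from sklearn import svm
from sklearn.datasets import load_iris

X, y = load_iris(return_X_y=True)

clf = svm.SVC(gamma=0.001, C=100., kernel='rbf')
clf.fit(X, y)

# coef_ is only defined for linear kernels, so this raises an AttributeError.
try:
    print(clf.coef_)
except AttributeError as err:
    print(err)  # "coef_ is only available when using a linear kernel"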