
I have a machine learning problem and want to optimize the hyperparameters of my SVC estimator as well as the feature selection.

For optimizing the SVC hyperparameters I use essentially the code from the docs. Now my question is: how can I combine this with recursive feature elimination and cross-validation (RFECV)? That is, for each hyperparameter combination I want to run RFECV in order to determine the best combination of hyperparameters and features.

I tried the solution from this thread, but it yields the following error:

ValueError: Invalid parameter C for estimator RFECV. Check the list of available parameters with `estimator.get_params().keys()`.

My code looks like this:

from sklearn.svm import SVC
from sklearn.feature_selection import RFECV
from sklearn.model_selection import GridSearchCV

tuned_parameters = [{'kernel': ['rbf'], 'gamma': [1e-4, 1e-3], 'C': [1, 10]},
                    {'kernel': ['linear'], 'C': [1, 10]}]

estimator = SVC(kernel="linear")
selector = RFECV(estimator, step=1, cv=3, scoring=None)
clf = GridSearchCV(selector, tuned_parameters, cv=3)
clf.fit(X_train, y_train)

The error appears at clf = GridSearchCV(selector, tuned_parameters, cv=3).
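
For reference, listing the parameters that the RFECV wrapper actually exposes (as the error message suggests) shows the SVC parameters nested under an estimator__ prefix rather than as plain C, gamma and kernel. A minimal check:

# Minimal check of the parameter names exposed by the RFECV wrapper.
from sklearn.svm import SVC
from sklearn.feature_selection import RFECV

selector = RFECV(SVC(kernel="linear"), step=1, cv=3)
print(sorted(selector.get_params().keys()))
# -> includes 'estimator__C', 'estimator__gamma', 'estimator__kernel', ...
#    but not 'C', 'gamma' or 'kernel' at the top level.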

1 Answer


I would use a Pipeline, but there is a more complete answer here:

Recursive feature elimination and grid search using scikit-learn
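
For completeness, a rough sketch of the Pipeline idea (untested; the step names 'feature_selection' and 'clf' are just assumptions, the grid values are taken from the question, and GridSearchCV is assumed to live in sklearn.model_selection, i.e. scikit-learn >= 0.18). RFECV needs an estimator that exposes coef_ or feature_importances_, so the selection step keeps a linear SVC, while the final SVC is tuned through step-prefixed parameter names:

# Sketch only: Pipeline with RFECV as a feature-selection step.
from sklearn.svm import SVC
from sklearn.feature_selection import RFECV
from sklearn.pipeline import Pipeline
from sklearn.model_selection import GridSearchCV

pipe = Pipeline([
    # RFECV requires coef_ / feature_importances_, hence the linear kernel here.
    ('feature_selection', RFECV(SVC(kernel="linear"), step=1, cv=3)),
    ('clf', SVC()),
])

# Parameters of a pipeline step are addressed as <step name>__<parameter>.
param_grid = [
    {'clf__kernel': ['rbf'], 'clf__gamma': [1e-4, 1e-3], 'clf__C': [1, 10]},
    {'clf__kernel': ['linear'], 'clf__C': [1, 10]},
]

grid = GridSearchCV(pipe, param_grid, cv=3)
grid.fit(X_train, y_train)  # X_train, y_train as in the question

With this setup the features are selected by the linear SVC inside the RFECV step, while the grid search tunes the final classifier on the reduced feature set.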

– Toni Piza
  • Thanks. I also got it to work with this tutorial: https://civisanalytics.com/blog/data-science/2016/01/06/workflows-python-using-pipeline-gridsearchcv-for-compact-code/ (it also uses pipelines). – beta Jul 26 '16 at 14:56