
I am using BayesSearchCV to find the best hyperparameters via Bayesian optimization. The syntax for using BayesSearchCV looks like the following:

from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_predict
from skopt import BayesSearchCV
from skopt.space import Categorical

clas_model = LogisticRegression(max_iter=5000)
search_space = {
    "penalty": Categorical(['l1', 'l2', 'elasticnet', 'none']),
    "solver": Categorical(['lbfgs', 'newton-cg', 'liblinear', 'sag', 'saga']),
    "fit_intercept": Categorical([True, False])
}

bayes_search = BayesSearchCV(clas_model, search_space, n_iter=12,
                             scoring="accuracy", n_jobs=-1, cv=5)
# on_step is a user-defined callback and folds is defined elsewhere
bayes_search.fit(X, y.values.ravel(), callback=on_step)
predictions_al = cross_val_predict(bayes_search, X, y.values.ravel(), cv=folds)

In this case, the solver 'newton-cg' does not accept penalty 'l1', so there is a dependency between hyperparameters. Is there any way to express this dependency using this library?

Stavros Koureas

1 Answer

By looking at other libraries like GridSearchCV and RandomizedSearchCV, I realized that we can provide several search spaces as a list of dicts. This is not documented for BayesSearchCV, but there are quick examples (without details) in the docs of the other classes.
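As a minimal sketch of that pattern using only scikit-learn (the dataset and grid values here are illustrative, not from the question): GridSearchCV accepts a list of parameter grids, and each dict is searched independently, so incompatible solver/penalty pairs are never combined.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV

X, y = make_classification(n_samples=200, n_features=10, random_state=0)

# Each dict is a self-contained sub-space; 'l1' is only ever
# paired with 'liblinear', never with 'lbfgs' or 'newton-cg'.
param_grid = [
    {"solver": ["liblinear"], "penalty": ["l1", "l2"]},
    {"solver": ["lbfgs", "newton-cg"], "penalty": ["l2"]},
]

grid = GridSearchCV(LogisticRegression(max_iter=5000), param_grid, cv=3)
grid.fit(X, y)
print(grid.best_params_)
```

BayesSearchCV supports the same list-of-dicts form for its search space, which is what the answer below relies on.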

Finally, because of the strong dependency between solvers, penalties, and other parameters (described at https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html), the syntax should look like the following:

from sklearn.linear_model import LogisticRegression
from skopt.space import Categorical, Real

clas_model = LogisticRegression(max_iter=5000)
search_space = [
    {
        "solver": Categorical(['liblinear']),
        "penalty": Categorical(['l1', 'l2']),
        "fit_intercept": Categorical([True, False]),
        #"warm_start": Categorical([True, False])
    },
    {
        "solver": Categorical(['lbfgs', 'newton-cg', 'sag']),
        "penalty": Categorical(['l2', 'none']),
        "fit_intercept": Categorical([True, False]),
        #"warm_start": Categorical([True, False])
    },
    {
        "solver": Categorical(['saga']),
        "penalty": Categorical(['l1', 'l2', 'none']),
        "fit_intercept": Categorical([True, False]),
        #"warm_start": Categorical([True, False])
    },
    {
        "solver": Categorical(['saga']),
        "penalty": Categorical(['elasticnet']),
        "fit_intercept": Categorical([True, False]),
        "l1_ratio": Real(0, 1, prior='uniform'),
        #"warm_start": Categorical([True, False])
    },
]

The ‘newton-cg’, ‘sag’, and ‘lbfgs’ solvers support only L2 regularization with primal formulation, or no regularization. The ‘liblinear’ solver supports both L1 and L2 regularization, with a dual formulation only for the L2 penalty. The Elastic-Net regularization is only supported by the ‘saga’ solver.
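A quick sanity check of these constraints using plain scikit-learn (the toy dataset and `l1_ratio` value are illustrative assumptions): 'saga' with 'elasticnet' fits fine once `l1_ratio` is supplied, while an unsupported combination such as 'newton-cg' with 'l1' is rejected at fit time.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=200, n_features=10, random_state=0)

# Valid sub-space: only 'saga' supports elasticnet, and it needs l1_ratio.
clf = LogisticRegression(solver="saga", penalty="elasticnet",
                         l1_ratio=0.5, max_iter=5000)
clf.fit(X, y)
print("saga + elasticnet fitted, train accuracy:", clf.score(X, y))

# Invalid combination: scikit-learn raises ValueError on fit.
try:
    LogisticRegression(solver="newton-cg", penalty="l1").fit(X, y)
except ValueError:
    print("newton-cg + l1 rejected, as expected")
```

Splitting the search space into one dict per compatible combination, as above, keeps the optimizer from ever proposing a pair that would raise this error.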
