I am running XGBRegressor with BayesSearchCV from skopt for hyperparameter tuning:

import xgboost
from skopt import BayesSearchCV

opt_xgb_model_tuned = xgboost.XGBRegressor()

hyper_space = {
'booster': ['gbtree'],
'objective': ['reg:squarederror'],
'learning_rate': [0.005, 0.01, 'log-uniform'],
'max_depth': [8, 12],
'min_child_weight': [0, 10],
'gamma': [0.01, 10, 'log-uniform'],
'subsample': [0.0001, 1, 'uniform'],
'colsample_bytree': [0.001, 1.0, 'uniform'],
'reg_lambda': [0.01, 50, 'log-uniform'],
'reg_alpha': [0.001, 1, 'log-uniform'],
'max_delta_step': [0, 20],
'n_estimators': [500, 2000],
}

gs = BayesSearchCV(opt_xgb_model_tuned, hyper_space, n_iter=32, random_state=0)
gs_res = gs.fit(X_train, y_train)

c:\users\joel thomas wilson\anaconda_python\py2020\envs\optimus_prime\lib\site-packages\skopt\utils.py in check_x_in_space(x, space)
    184     if is_2Dlistlike(x):
    185         if not np.all([p in space for p in x]):
--> 186             raise ValueError("Not all points are within the bounds of"
    187                              " the space.")
    188     if any([len(p) != len(space.dimensions) for p in x]):

ValueError: Not all points are within the bounds of the space.

Any clue on the rules/ranges to use for each of these parameters, and whether they depend on the ranges of X? For example, my model target is in the range [-1, 1]; does that have anything to do with this error?

1 Answer

Your model target range has nothing to do with this error.

The problem is more likely in how skopt parses the search space: plain lists and tuples are converted to dimensions by type-guessing heuristics, and a mixed int/float specification such as [0.01, 10, 'log-uniform'] can be misclassified. I would suggest declaring the type of each dimension in your hyper_space explicitly, using skopt's built-in space classes.

For example:

from skopt.space import Real, Categorical, Integer

hyper_space = {
    'booster': Categorical(['gbtree']),                 # single categorical choice
    'learning_rate': Real(0.005, 0.01, 'log-uniform'),  # continuous, log-uniform prior
    'max_depth': Integer(8, 12, 'uniform'),             # integer-valued, uniform prior
}
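Applied to your full search space, a typed version might look like the sketch below. The bounds are simply the ones from your original hyper_space; whether each dimension should be Real or Integer is my reading of your intent, so adjust as needed:

from skopt.space import Real, Categorical, Integer

hyper_space = {
    'booster': Categorical(['gbtree']),
    'objective': Categorical(['reg:squarederror']),
    'learning_rate': Real(0.005, 0.01, 'log-uniform'),
    'max_depth': Integer(8, 12),
    'min_child_weight': Integer(0, 10),
    'gamma': Real(0.01, 10, 'log-uniform'),
    'subsample': Real(0.0001, 1, 'uniform'),
    'colsample_bytree': Real(0.001, 1.0, 'uniform'),
    'reg_lambda': Real(0.01, 50, 'log-uniform'),
    'reg_alpha': Real(0.001, 1, 'log-uniform'),
    'max_delta_step': Integer(0, 20),
    'n_estimators': Integer(500, 2000),
}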

If you still get the same error, you can always comment out search dimensions one at a time (or bisect the dictionary) to find the culprit that throws it, as sketched below.
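A minimal sketch of that isolation loop, assuming X_train and y_train are already defined: it runs a very small BayesSearchCV on each dimension separately and reports which one raises.

from skopt import BayesSearchCV
import xgboost

# Fit a tiny search on each dimension in isolation; the dimension whose
# run raises ValueError is the one with a malformed specification.
for name, dim in hyper_space.items():
    try:
        gs = BayesSearchCV(xgboost.XGBRegressor(),
                           {name: dim}, n_iter=5, cv=3, random_state=0)
        gs.fit(X_train, y_train)
        print(name, 'OK')
    except ValueError as exc:
        print(name, 'FAILED:', exc)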
