I am running XGBRegressor with BayesSearchCV from skopt for hyperparameter tuning:
import xgboost
from skopt import BayesSearchCV

opt_xgb_model_tuned = xgboost.XGBRegressor()
hyper_space = {
    'booster': ['gbtree'],
    'objective': ['reg:squarederror'],
    'learning_rate': [0.005, 0.01, 'log-uniform'],
    'max_depth': [8, 12],
    'min_child_weight': [0, 10],
    'gamma': [0.01, 10, 'log-uniform'],
    'subsample': [0.0001, 1, 'uniform'],
    'colsample_bytree': [0.001, 1.0, 'uniform'],
    'reg_lambda': [0.01, 50, 'log-uniform'],
    'reg_alpha': [0.001, 1, 'log-uniform'],
    'max_delta_step': [0, 20],
    'n_estimators': [500, 2000],
}
gs = BayesSearchCV(opt_xgb_model_tuned, hyper_space, n_iter=32, random_state=0)
gs_res = gs.fit(X_train, y_train)
Calling fit raises the following error:

c:\users\joel thomas wilson\anaconda_python\py2020\envs\optimus_prime\lib\site-packages\skopt\utils.py in check_x_in_space(x, space)
    184     if is_2Dlistlike(x):
    185         if not np.all([p in space for p in x]):
--> 186             raise ValueError("Not all points are within the bounds of"
    187                              " the space.")
    188     if any([len(p) != len(space.dimensions) for p in x]):
ValueError: Not all points are within the bounds of the space.
What are the rules for the ranges we can search for each of these parameters, and do they depend on the range of X? For example, my model's target is in the range [-1, 1]; does that have anything to do with this error?
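To check my understanding of the space-definition rules, here is a simplified sketch (my own paraphrase based on the skopt docs, not skopt's actual source) of how I believe list-style entries are inferred as dimensions. If this inference is right, every entry in my hyper_space should become a Categorical, Integer, or Real dimension, so I don't see which point falls out of bounds:

```python
import numbers

def infer_dimension(spec):
    """My understanding of skopt's list-to-dimension heuristics (simplified):
       - all strings                    -> Categorical over those values
       - (low, high, 'uniform'/'log-uniform') -> Real with that prior
       - two integers                   -> Integer(low, high)
       - two numbers, at least one float -> Real(low, high)
    """
    if all(isinstance(v, str) for v in spec):
        return ("Categorical", tuple(spec))
    if len(spec) == 3 and spec[2] in ("uniform", "log-uniform"):
        return ("Real", float(spec[0]), float(spec[1]), spec[2])
    if len(spec) == 2 and all(isinstance(v, numbers.Integral) for v in spec):
        return ("Integer", spec[0], spec[1])
    if len(spec) == 2:
        return ("Real", float(spec[0]), float(spec[1]))
    raise ValueError(f"cannot infer a dimension from {spec!r}")

# Applying this to a few of my entries:
print(infer_dimension(['gbtree']))                    # single-value Categorical
print(infer_dimension([8, 12]))                       # Integer range
print(infer_dimension([0.005, 0.01, 'log-uniform']))  # Real with log-uniform prior
```

If any of these heuristics is wrong (e.g. a mixed int/float pair or a single-value list is handled differently), that could explain the error, but I have not been able to confirm it.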