I am not familiar with Optuna, but I ran into this issue using Python/lightgbm.
As of v3.3.2, the parameter-tuning page includes parameters that appear to be renamed, deprecated, or duplicative. If, however, you stick to setting/tuning only the parameters exposed on the model object itself, you can avoid this warning.
from lightgbm import LGBMRegressor

# inspect the parameter names the model object itself exposes
params = LGBMRegressor().get_params()
print(params)
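If you already have a dict of tuned parameters that may contain aliases, one option is to filter it down to only the names get_params() exposes. A minimal sketch, with valid_keys hardcoded to a small illustrative subset (in practice it would be set(LGBMRegressor().get_params())):

```python
# Stand-in for set(LGBMRegressor().get_params()) -- illustrative subset only
valid_keys = {'num_leaves', 'min_child_samples', 'colsample_bytree',
              'subsample', 'learning_rate'}

# hypothetical tuned-parameter dict mixing canonical names and aliases
tuned = {
    'num_leaves': 63,
    'min_data_in_leaf': 20,     # alias of min_child_samples -> dropped
    'colsample_bytree': 0.8,
    'feature_fraction': 0.8,    # alias of colsample_bytree -> dropped
}

# keep only the names the model object recognizes
filtered = {k: v for k, v in tuned.items() if k in valid_keys}
print(filtered)  # only canonical names survive
```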
These are the only parameters you want to set. If you want to be able to work with the aliased parameters as well, you could do something like the following.
import random

from lightgbm import LGBMRegressor

lgr = LGBMRegressor()
params = lgr.get_params()

# pairs of names that refer to the same underlying parameter
aliases = [
    {'min_child_weight', 'min_sum_hessian_in_leaf'},
    {'min_child_samples', 'min_data_in_leaf'},
    {'colsample_bytree', 'feature_fraction'},
    {'subsample', 'bagging_fraction'}
]

for alias in aliases:
    # if both names of a pair are present, clear one of them
    if len(alias & set(params)) == 2:
        arg = random.choice(sorted(alias))
        params[arg] = None

lgr = LGBMRegressor(**params)
The code keeps one name or the other in each parameter pair that appears to be duplicative. Now, when you call lgr.fit(X, y), you should not get the warning.
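To double-check that the de-duplication worked, a small sketch that asserts at most one name per alias pair still carries a value. It uses a hypothetical stand-in params dict rather than the real one, so it runs without lightgbm installed:

```python
aliases = [
    {'min_child_weight', 'min_sum_hessian_in_leaf'},
    {'min_child_samples', 'min_data_in_leaf'},
]

# stand-in for the params dict after the loop above has run
params = {
    'min_child_weight': 1e-3,
    'min_sum_hessian_in_leaf': None,  # cleared by the loop
    'min_child_samples': 20,
}

for alias in aliases:
    # names in this pair that still hold a value
    live = [a for a in alias if params.get(a) is not None]
    assert len(live) <= 1, f"duplicate aliases still set: {live}"

print("no duplicated alias parameters")
```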