Questions tagged [optuna]

Optuna is a hyperparameter optimization framework for Python (versions 2.7 and 3.*) that helps find the best parameters for machine-learning models by trying various combinations of parameter values. Site: https://optuna.org


191 questions
18 votes · 3 answers

Python: How to retrieve the best model from an Optuna LightGBM study?

I would like to get the best model to use later in the notebook to predict using a different test batch. Reproducible example (taken from the Optuna GitHub): import lightgbm as lgb import numpy as np import sklearn.datasets import sklearn.metrics from…
HarriS
10 votes · 1 answer

The default evaluation metric used with the objective 'binary:logistic' was changed from 'error' to 'logloss'

I am trying to fit XGBClassifier to my dataset after hyperparameter tuning using optuna and I keep getting this warning: the default evaluation metric used with the objective 'binary:logistic' was changed from 'error' to 'logloss' Below is my…
spectre
9 votes · 2 answers

Optuna suggest float log=True

How can I have optuna suggest float numeric values from this list: [1e-6, 1e-5, 1e-4, 1e-3, 1e-2, 1e-1, 1.0] I'm using this Python code snippet: trial.suggest_float("lambda", 1e-6, 1.0, log=True) It correctly suggests values between 1e-6 and 1.0,…
steve
8 votes · 1 answer

How to optimize for multiple metrics in Optuna

How do I optimize for multiple metrics simultaneously inside the objective function of Optuna? For example, I am training an LGBM classifier and want to find the best hyperparameter set for all common classification metrics like F1, precision,…
Bex T.
8 votes · 2 answers

How to set optuna's study.optimize verbosity to 0?

I want to set optuna's study.optimize verbosity to 0. I thought optuna.logging.set_verbosity(0) might do it, but I still get the Trial 0 finished with value ... updates for every trial. What is the correct way to do this? Unfortunately, extensive…
Olli
7 votes · 2 answers

Trial 1 failed, because the value None could not be cast to float

I am trying to tune an extra trees classifier with Optuna. I am getting this message for all my trials: [W 2022-02-10 12:13:12,501] Trial 2 failed, because the value None could not be cast to float. Below is my code. It happens for all my trials. Can…
Kyriakos
7 votes · 2 answers

Optuna Suggests the Same Parameter Values in a lot of Trials (Duplicate Trials that Waste Time and Budget)

Optuna's TPESampler and RandomSampler try the same suggested integer values (possibly floats and loguniforms as well) for a parameter more than once for some reason. I couldn't find a way to stop it from suggesting the same values over and over again. Out…
7 votes · 1 answer

How can I cross-validate with PyTorch and Optuna?

I want to use cross-validation with the official Optuna PyTorch-based sample code (https://github.com/optuna/optuna/blob/master/examples/pytorch_simple.py). I thought about splitting the data for cross-validation and trying parameter tuning…
sta
5 votes · 1 answer

Optuna LightGBM LightGBMPruningCallback

I am getting an error in my LightGBM modeling while searching for the optimal AUC. Any help would be appreciated. import optuna from sklearn.model_selection import StratifiedKFold from optuna.integration import LightGBMPruningCallback def…
Tinkinc
5 votes · 1 answer

Suppress LightGBM warnings in Optuna

I am getting the warnings below while using Optuna to tune my model. How can I suppress these warnings? [LightGBM] [Warning] feature_fraction is set=0.2, colsample_bytree=1.0 will be ignored. Current value:…
ffl
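That particular warning comes from setting two aliases of the same parameter (`feature_fraction` vs. `colsample_bytree`), so the cleanest fix is to pass only one of them; LightGBM's native log lines can additionally be silenced with `verbosity: -1`. A sketch under those assumptions (no training call included):

```python
import warnings

# Silence Python-side UserWarnings raised during tuning (broad; narrow as needed)
warnings.filterwarnings("ignore", category=UserWarning)

params = {
    "objective": "binary",
    "verbosity": -1,          # silences LightGBM's native [Warning]/[Info] logging
    "feature_fraction": 0.2,  # use only one alias, not colsample_bytree as well
}
# booster = lgb.train(params, dtrain)  # pass params to your real training call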
5 votes · 1 answer

Why is Optuna getting stuck after a certain number of trials?

I am trying to do hyperparameter tuning using Optuna. The dataset is MovieLens (1M). In one script I have Lasso, Ridge, and KNN. Optuna works fine for Lasso and Ridge but gets stuck for KNN. You can see the trials for the Ridge model…
0Knowledge
5 votes · 2 answers

What is the difference between alpha, lambda and gamma regularization parameters for xgboost?

I have a question: how exactly do the different L1 and L2 regularization terms act on the weights in the xgboost algorithm? As I understand it, L1 is used by LASSO and L2 by RIDGE regression, and L1 can shrink weights to 0 while L2 can't. I understand the mechanics…
Vojtech Stas
5 votes · 4 answers

Optuna pass dictionary of parameters from "outside"

I am using Optuna to optimize some objective functions. I would like to create my custom class that "wraps" the standard Optuna code. As an example, this is my class (it is still a work in progress!): class Optimizer(object): def…
Mattia Surricchio
5 votes · 3 answers

Is there a way to pass arguments to multiple jobs in optuna?

I am trying to use optuna for searching hyperparameter spaces. In one particular scenario I train a model on a machine with a few GPUs. The model and batch size allow me to run 1 training per GPU. So, ideally I would like to let optuna spread…
mRcSchwering
5 votes · 2 answers

How to sample parameters without duplicates in optuna?

I am using optuna for parameter optimization of my custom models. Is there any way to keep sampling parameters until the current parameter set has not been tested before? I mean, try sampling other parameters if there was a past trial with the same set of…
roseaysina