optuna

How to record each fold's validation loss during cross-validation in Optuna?

How to record each fold's validation loss during cross-validation in Optuna? Question: I am using Toshihiko Yanase's code for doing cross-validation on my hyperparameter optimizer with Optuna. Here is the code that I am using: def objective(trial, train_loader, valid_loader): # Remove the following line. # train_loader, valid_loader = get_mnist() … return accuracy def objective_cv(trial): …

Total answers: 1

saving trained models in optuna to a variable

saving trained models in optuna to a variable Question: I have a project in which I do several optuna studies, each having around 50 trials. The optuna documentation suggests saving each model to a file for later use in this FAQ section. What I want is to have all the best models of different studies …

Total answers: 1

Is there a way for Optuna `suggest_categorical` to return multiple choices from a list?

Is there a way for Optuna `suggest_categorical` to return multiple choices from a list? Question: I am using Optuna for hyperparameter tuning of my model, and I have a field where I want to test multiple combinations from a list. For example: I have ["lisa","adam","test"] and I want suggest_categorical to return not just one, but a random combination: …

Total answers: 1

Optuna hyperparameter search not reproducible with interrupted / resumed studies

Optuna hyperparameter search not reproducible with interrupted / resumed studies Question: For big ML models with many parameters, it is helpful if one can interrupt and resume the hyperparameter optimization search. Optuna allows doing that with the RDB backend, which stores the study in a SQLite database (https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/001_rdb.html#sphx-glr-tutorial-20-recipes-001-rdb-py). However, when interrupting and resuming a study, …

Total answers: 1

Store user attributes in Optuna Sweeper plugin for Hydra

Store user attributes in Optuna Sweeper plugin for Hydra Question: How can I store additional information in an optuna trial when using it via the Hydra sweep plugin? My use case is as follows: I want to optimize a bunch of hyperparameters. I am storing all reproducibility information of all experiments (i.e., trials) in a …

Total answers: 1

How to set hidden_layer_sizes in sklearn MLPRegressor using optuna trial

How to set hidden_layer_sizes in sklearn MLPRegressor using optuna trial Question: I would like to use Optuna with the sklearn MLPRegressor model. For almost all hyperparameters it is quite straightforward how to set OPTUNA for them. For example, to set the learning rate: learning_rate_init = trial.suggest_float('learning_rate_init', 0.0001, 0.1001, step=0.005) My problem is how to set it …

Total answers: 1

How to optimize for multiple metrics in Optuna

How to optimize for multiple metrics in Optuna Question: How do I optimize for multiple metrics simultaneously inside the objective function of Optuna? For example, I am training an LGBM classifier and want to find the best hyperparameter set for all common classification metrics like F1, precision, recall, accuracy, AUC, etc. def objective(trial): # Train …

Total answers: 1

How to suggest multivariate of ratio (with bound) in Optuna?

How to suggest multivariate of ratio (with bound) in Optuna? Question: I would like to suggest ratios in Optuna. The ratios are X_1, X_2, …, X_k, bounded by ∑X_i = 1 and 0 <= X_i <= 1 for all i. Optuna does not provide a Dirichlet distribution. I have tried the following code, but it …

Total answers: 1

Is there a way to pass arguments to multiple jobs in optuna?

Is there a way to pass arguments to multiple jobs in optuna? Question: I am trying to use optuna for searching hyperparameter spaces. In one particular scenario I train a model on a machine with a few GPUs. The model and batch size allow me to run one training per GPU. So, ideally …

Total answers: 3