I have a config tree such as:
config.yaml
model/
    model_a.yaml
    model_b.yaml
    model_c.yaml
Where config.yaml contains:
# @package _global_
defaults:
  - _self_
  - model: model_a
some_var: 42
I would like to access the name of the model…
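One hedged way to do this, assuming Hydra ≥ 1.1: the option selected for a config group is exposed through Hydra's `hydra` resolver, so the chosen model name can be interpolated directly in the config (the key `model_name` below is an illustrative assumption):

```yaml
# Sketch: pull the selected "model" group choice into the config
# (requires Hydra >= 1.1)
model_name: ${hydra:runtime.choices.model}
```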
I have a main config file, let's say config.yaml:
num_layers: 4
embedding_size: 512
learning_rate: 0.2
max_steps: 200000
I'd like to be able to override this, on the command-line, with another file, like say big_model.yaml, which I'd use…
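A common pattern for this (a sketch, not necessarily the asker's final setup): put the alternative settings in a config group and select it on the command line. The group name `experiment`, the directory layout, and the values below are assumptions.

```yaml
# conf/experiment/big_model.yaml (hypothetical layout)
# @package _global_
num_layers: 24
embedding_size: 1024
```

It could then be selected with something like `python train.py +experiment=big_model`, leaving the defaults in config.yaml untouched otherwise.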
This is the files directory:
|-configs
|----data_conf
|--------csv_images.yaml
|--------tf_ds.yaml
|----example.yaml
and example.yaml is:
data: csv_images
defaults:
- data_conf: "${data}"
and csv_images.yaml:
# @package _group_
a: test_a
b:…
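For reference, Hydra's defaults-list interpolation (Hydra ≥ 1.1) can only reference other config-group choices in the same defaults list, not a regular config value like `data` above. A working shape looks roughly like this (group names are illustrative):

```yaml
defaults:
  - data_conf: csv_images
  - transform: ${data_conf}   # interpolates another defaults-list choice
```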
I'm using hydra to log hyperparameters of experiments.
import hydra
from omegaconf import DictConfig, OmegaConf

@hydra.main(config_name="config", config_path="../conf")
def evaluate_experiment(cfg: DictConfig) -> None:
    print(OmegaConf.to_yaml(cfg))
    ...
Sometimes I want to do a dry run to check…
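One way to get such a dry run (guessing at the goal, though the flag itself is standard Hydra): pass `--cfg job` to print the composed config and exit without running the decorated function.

```shell
# Print the composed job config and exit, without calling evaluate_experiment
python evaluate_experiment.py --cfg job
```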
Is there any option to influence how lists are merged in OmegaConf? Ideally this could be controlled in the config file, but if there is some switch I can use in the code, I'm also interested.
Example:
from omegaconf import OmegaConf
conf1 =…
What I'm trying to do: use environment variables in a Hydra config.
I worked from the following links: OmegaConf: Environment variable interpolation and Hydra: Job Configuration.
This is my config.yaml:
hydra:
  job:
    env_copy:
      - EXPNAME
#…
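For reference, reading the variable back inside the config usually goes through OmegaConf's `oc.env` resolver (real OmegaConf syntax; the key names below are assumptions):

```yaml
experiment_name: ${oc.env:EXPNAME}           # errors if EXPNAME is unset
log_dir: logs/${oc.env:EXPNAME,default_run}  # with a fallback default
```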
I am trying to do basic hyperparameter tuning. By default, Hydra's sweeper runs the Cartesian product of the hyperparameter values.
hydra:
  mode: MULTIRUN
  sweeper:
    params:
      +n: 5,10,15
      +a_lower: 0.5,0.7,0.9
Here it will run the same…
I instantiate a hydra configuration from a python dataclass. For example
from dataclasses import dataclass
from typing import Any
from hydra.utils import instantiate
class Model:
    def __init__(self, x=1):
        self.x = x
@dataclass
class…
I have a configuration similar to this from a yaml file
training_variables:
  - var1
  - var2
I want to extend the list with an additional variable, and I want to do it from the command line. How can I do that? It seems not to be possible, but I think it…
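For context: Hydra's override grammar has no append-to-list operation, so the usual workaround (a sketch, with an assumed script name) is to override the whole list from the command line:

```shell
python train.py 'training_variables=[var1,var2,var3]'
```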
I have two sub-configs and one top-level config that contains those sub-configs. I designed the configs like this:
from dataclasses import dataclass, field
import hydra
from hydra.core.config_store import ConfigStore
from omegaconf import MISSING,…
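A hedged sketch of the nesting pattern (all names below are placeholders, not the asker's): because dataclasses are mutable defaults, the sub-configs must be wired into the top-level config with `field(default_factory=...)`.

```python
from dataclasses import dataclass, field

@dataclass
class DataConfig:
    batch_size: int = 32

@dataclass
class ModelConfig:
    hidden: int = 128

@dataclass
class MainConfig:
    # Mutable defaults require default_factory, not a bare instance
    data: DataConfig = field(default_factory=DataConfig)
    model: ModelConfig = field(default_factory=ModelConfig)

cfg = MainConfig()
assert cfg.data.batch_size == 32
```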
I have the following code, using the hydra framework
# dummy_hydra.py
from dataclasses import dataclass
import hydra
from hydra.core.config_store import ConfigStore
from omegaconf import DictConfig, OmegaConf
@dataclass
class Foo:
    x: int =…
I am trying to use Hydra 1.3 to put together a simple, but apparently not trivial, configuration that maps endpoints of a given API to their corresponding processing functions.
So far, I came up with a config folder structure that looks like:
$ tree…
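One hedged shape for such a mapping (all names below are hypothetical): give each endpoint its own config carrying a `_target_` that points at its handler, so `hydra.utils.instantiate` (or `get_method`) can resolve it at runtime.

```yaml
# conf/endpoint/users.yaml (hypothetical)
path: /users
handler:
  _target_: myapi.handlers.process_users
```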
I want to do hyperparameter tuning for a neural net, created with keras. For this project I handle my config.yaml files with hydra, use mlflow to store the metrics and parameters from the optimization and use ray to parallelize the computation of…
I'm pretty new to Hydra and was wondering if the following is possible: I have the parameter num_atom_feats in the model section, which I would like to make dependent on the feat_type parameter in the data section. In particular, if I have…