I would like to do hyperparameter tuning using the KerasTuner framework.
How can I choose an optimizer and also tune the learning rate
that gets passed to whichever optimizer is chosen?
This is my model.compile()
call:
from tensorflow.keras.losses import BinaryCrossentropy

model.compile(
    loss=BinaryCrossentropy(from_logits=True),
    # hp.Choice returns one string identifier per trial, so Keras
    # builds that optimizer with its default learning rate
    optimizer=hp.Choice('optimizer', values=['adam', 'adagrad', 'sgd']),
    metrics=['accuracy'],
)
That code only picks one of the optimizers
per trial and uses its default learning rate.
I want to pass a learning rate sampled with hp.Float('lrate', min_value=1e-4, max_value=1e-2, sampling='LOG')
to each optimizer. How can I nest them?
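For reference, this is roughly what I am aiming for: an untested sketch of a build_model(hp) function where the optimizer classes are instantiated manually so the tuned learning rate can actually be passed in (the single Dense layer is just a placeholder for my real architecture):

import keras_tuner as kt
from tensorflow import keras
from tensorflow.keras.losses import BinaryCrossentropy

def build_model(hp):
    # Placeholder model; the real layers would go here.
    model = keras.Sequential([keras.layers.Dense(1)])

    # Sample the optimizer name and the learning rate as separate
    # hyperparameters, then construct the optimizer object by hand
    # so the sampled learning rate is actually used.
    optimizer_name = hp.Choice('optimizer', values=['adam', 'adagrad', 'sgd'])
    lrate = hp.Float('lrate', min_value=1e-4, max_value=1e-2, sampling='log')

    optimizers = {
        'adam': keras.optimizers.Adam,
        'adagrad': keras.optimizers.Adagrad,
        'sgd': keras.optimizers.SGD,
    }
    optimizer = optimizers[optimizer_name](learning_rate=lrate)

    model.compile(
        loss=BinaryCrossentropy(from_logits=True),
        optimizer=optimizer,
        metrics=['accuracy'],
    )
    return model

Is this the right pattern, or is there a more idiomatic way to make the learning rate conditional on the chosen optimizer?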