
I would like to do hyperparameter tuning using the KerasTuner framework.

How can I choose an optimizer and different learning rates to pass to the optimizers? This is my model.compile() method:

    model.compile(
        loss=BinaryCrossentropy(from_logits=True),
        optimizer=hp.Choice('optimizer', values=['adam', 'adagrad', 'SGD']),
        metrics=['accuracy']
    )

That code only picks one of the optimizers at a time and uses its default learning rate. I want to pass a learning rate from hp.Float('lrate', min_value=1e-4, max_value=1e-2, sampling='LOG') to each optimizer. How can I nest them?


1 Answer


Try this: pick the optimizer name with hp.Choice, build the optimizer object with the tuned learning rate in a conditional, and then compile:

    from tensorflow.keras.losses import BinaryCrossentropy
    from tensorflow.keras.optimizers import Adam, Adagrad, SGD

    # Select the optimizer name and a log-sampled learning rate
    optimizer_name = hp.Choice('optimizer', values=['adam', 'adagrad', 'SGD'])
    learning_rate = hp.Float('lrate', min_value=1e-4, max_value=1e-2, sampling='log')

    # Conditional for each optimizer: build it with the tuned learning rate
    if optimizer_name == 'adam':
        optimizer = Adam(learning_rate=learning_rate)
    elif optimizer_name == 'adagrad':
        optimizer = Adagrad(learning_rate=learning_rate)
    elif optimizer_name == 'SGD':
        optimizer = SGD(learning_rate=learning_rate)

    # Now compile your model with the built optimizer
    model.compile(
        loss=BinaryCrossentropy(from_logits=True),
        optimizer=optimizer,
        metrics=['accuracy']
    )
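
If it helps, here is a minimal sketch of how that conditional block fits into a complete KerasTuner build function and search. The two Dense layers and the tuner settings are placeholder assumptions, not from your question, and it assumes the current keras_tuner package name:

    import keras_tuner as kt
    import tensorflow as tf
    from tensorflow.keras.losses import BinaryCrossentropy
    from tensorflow.keras.optimizers import Adam, Adagrad, SGD

    def build_model(hp):
        # Placeholder architecture; swap in your own layers
        model = tf.keras.Sequential([
            tf.keras.layers.Dense(64, activation='relu'),
            tf.keras.layers.Dense(1)  # raw logits, matching from_logits=True
        ])

        optimizer_name = hp.Choice('optimizer', values=['adam', 'adagrad', 'SGD'])
        learning_rate = hp.Float('lrate', min_value=1e-4, max_value=1e-2,
                                 sampling='log')

        # Build the optimizer object for the chosen name
        if optimizer_name == 'adam':
            optimizer = Adam(learning_rate=learning_rate)
        elif optimizer_name == 'adagrad':
            optimizer = Adagrad(learning_rate=learning_rate)
        else:  # 'SGD'
            optimizer = SGD(learning_rate=learning_rate)

        model.compile(
            loss=BinaryCrossentropy(from_logits=True),
            optimizer=optimizer,
            metrics=['accuracy']
        )
        return model

    tuner = kt.RandomSearch(build_model, objective='val_accuracy', max_trials=10)
    # tuner.search(x_train, y_train, validation_data=(x_val, y_val), epochs=5)

Each trial then samples 'optimizer' and 'lrate' together. If you instead want a separately named learning rate per optimizer, KerasTuner's hp.conditional_scope can make a hyperparameter active only when a parent choice takes certain values.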