I have the following Keras (Python) code that uses Keras Tuner to tune a single-hidden-layer neural network.

from tensorflow import keras

def build_model(hp):

  # Initialize a sequential model
  model = keras.Sequential()

  # Tune the number of units in the single hidden Dense layer
  hp_units = hp.Int('units', min_value=25, max_value=250, step=25)

  # Hidden layer with a tuned L1 regularization strength
  model.add(keras.layers.Dense(units=hp_units,
                               input_shape=(train_pca_scaled.shape[1],),
                               kernel_regularizer=keras.regularizers.L1(l1=hp.Choice('L1_value', [1e-2, 1e-3, 1e-4]))))

  # Batch normalization (before the activation)
  model.add(keras.layers.BatchNormalization())

  # Tune the activation function
  activation = hp.Choice("activation", ["relu", "elu"])
  model.add(keras.layers.Activation(activation))

  # Tune the dropout rate
  hp_dropout = hp.Float('dropout_rate', min_value=0.0, max_value=0.5, step=0.1)
  model.add(keras.layers.Dropout(rate=hp_dropout))

  # Final output layer (single regression output)
  model.add(keras.layers.Dense(units=1))

  # Compile with a tuned learning rate
  lr = hp.Choice("learning_rate", values=[1e-2, 1e-3, 1e-4])
  model.compile(optimizer=keras.optimizers.Adam(learning_rate=lr),
                loss=keras.losses.MeanSquaredError(),
                metrics=[keras.metrics.RootMeanSquaredError()])

  return model
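
For reference, here is roughly how I run the search (Hyperband here; the train_labels target vector and the directory/project names are just from my setup):

import keras_tuner as kt

# Hyperband search over the hyperparameters defined in build_model;
# the objective is the validation RMSE reported by the compiled metric
tuner = kt.Hyperband(build_model,
                     objective=kt.Objective("val_root_mean_squared_error", direction="min"),
                     max_epochs=50,
                     directory="tuner_dir",
                     project_name="single_hidden_layer")

tuner.search(train_pca_scaled, train_labels, validation_split=0.2)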

My issue is that I would also like to offer the advanced activation layers LeakyReLU and PReLU as options (alongside relu and elu), but they are constructed differently: an advanced activation is a layer in its own right, e.g. keras.layers.LeakyReLU(), rather than a string passed to keras.layers.Activation(activation) as in the code above.

How can I change the code above so that all four of the mentioned activation layers are included in the hyperparameter search? I've sketched one idea below, but I'm not sure it's the right approach with Keras Tuner.
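
Since hp.Choice returns a plain Python value when the model is built, my tentative idea is to branch on that value and add the matching layer, because LeakyReLU and PReLU are layers rather than activation strings. A minimal sketch of what I mean, replacing the activation part of build_model (the choice labels "leaky_relu" and "prelu" are just strings I made up):

activation = hp.Choice("activation", ["relu", "elu", "leaky_relu", "prelu"])
if activation == "leaky_relu":
  model.add(keras.layers.LeakyReLU())  # advanced activation: a layer of its own
elif activation == "prelu":
  model.add(keras.layers.PReLU())  # learns the slope for negative inputs
else:
  model.add(keras.layers.Activation(activation))  # plain string activations

Is this a reasonable way to do it, or does Keras Tuner have a cleaner mechanism?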

Thanks.
