I'm trying to use Leaky ReLU. I tried the method given in Keras Functional API and activations, but it doesn't work. I get this error:
TypeError: activation() missing 1 required positional argument: 'activation_type'
Also, should Activation be capitalized throughout or not?
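To make the capitalization question concrete, these are the two forms I have seen; this snippet is only for illustration:

from tensorflow.keras import layers

inp = layers.Input(shape=(3,))
# capitalized: the layers.Activation class, applied as its own layer
x = layers.Activation('relu')(inp)
# lowercase: the activation keyword argument on another layer
y = layers.Dense(2, activation='relu')(inp)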
Here is how I use my activation helper:
def activation(x, activation_type):
    if activation_type == 'leaky_relu':
        return activations.relu(x, alpha=0.3)
    else:
        return activations.get(activation_type)(x)
...
input_data = layers.Input(shape=(3,))
...
hiddenOut = Dense(units=2)(input_data)
hiddenOut = activation(lambda hiddenOut: activation(hiddenOut, 'LeakyReLU'))(hiddenOut)  # <- this is the line that raises the TypeError
u_out = Dense(1, activation='linear', name='u')(hiddenOut)
...
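For reference, my guess is that the lambda wrapper (which passes only one argument to activation, hence the TypeError) and the 'LeakyReLU' vs. 'leaky_relu' string mismatch are the problem. Below is a minimal self-contained version of what I think should work, assuming the tensorflow.keras imports shown, though I'm not sure it is the idiomatic way:

from tensorflow.keras import Model, activations, layers

def activation(x, activation_type):
    if activation_type == 'leaky_relu':
        # relu with a nonzero alpha behaves as a leaky ReLU
        return activations.relu(x, alpha=0.3)
    return activations.get(activation_type)(x)

input_data = layers.Input(shape=(3,))
hiddenOut = layers.Dense(units=2)(input_data)
# call the helper directly with both arguments instead of wrapping it in a lambda,
# and pass the same 'leaky_relu' string the helper checks for
hiddenOut = activation(hiddenOut, 'leaky_relu')
u_out = layers.Dense(1, activation='linear', name='u')(hiddenOut)
model = Model(inputs=input_data, outputs=u_out)

(My understanding is that the capitalized Activation refers to the layers.Activation class, while the lowercase activation is a keyword argument or function name, but I'd appreciate confirmation.)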