
I'm trying to use Leaky ReLU. I tried using the method given in

Keras Functional API and activations

It doesn't work. I got the error:

TypeError: activation() missing 1 required positional argument: 'activation_type'

Also, should Activation be capitalized throughout, or not?

I use it as:

def activation(x, activation_type):
    if activation_type == 'leaky_relu':
        return activations.relu(x, alpha=0.3)
    else:
        return activations.get(activation_type)(x)
...

input_data = layers.Input(shape=(3,))
...
hiddenOut = Dense(units=2)(input_data)
hiddenOut = activation(lambda hiddenOut: activation(hiddenOut, 'LeakyReLU'))(hiddenOut)
u_out = Dense(1, activation='linear', name='u')(hiddenOut)   
...
quarkz

2 Answers


You're doing something unnecessarily complicated; you can just write:

hiddenOut = keras.layers.LeakyReLU(alpha=0.3)(hiddenOut)
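
For context, a minimal sketch of how that layer slots into the model from the question (variable and layer names copied from the question; a standalone keras import is assumed):

import keras

input_data = keras.layers.Input(shape=(3,))
hiddenOut = keras.layers.Dense(units=2)(input_data)
# LeakyReLU is applied as its own layer rather than through an activation string
hiddenOut = keras.layers.LeakyReLU(alpha=0.3)(hiddenOut)
u_out = keras.layers.Dense(1, activation='linear', name='u')(hiddenOut)
model = keras.Model(inputs=input_data, outputs=u_out)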
Daniel Möller
import keras

def my_activation(x, activation_type):
    # Leaky ReLU is ReLU with a non-zero slope (alpha) on the negative side
    if activation_type == 'LeakyReLU':
        return keras.activations.relu(x, alpha=0.3)
    else:
        # Look up any built-in activation by name
        return keras.activations.get(activation_type)(x)

input_data = keras.layers.Input(shape=(3,))
hiddenOut = keras.layers.Dense(units=2)(input_data)
# Activation (the layer) applies the wrapper function to the tensor
hiddenOut = keras.layers.Activation(lambda x: my_activation(x, 'LeakyReLU'))(hiddenOut)

Why

  • Activation is a layer, while activations is a module of available activation functions.
  • To emulate Leaky ReLU we change the slope of the negative part. For ReLU that slope is 0, and it can be changed via the alpha parameter of keras.activations.relu.
  • The wrapper function my_activation returns a Leaky ReLU with a negative slope of 0.3 when the parameter is 'LeakyReLU'; otherwise it returns the requested built-in activation.

Example:

input_data = keras.layers.Input(shape=(3,))
a = keras.layers.Dense(units=2)(input_data)
# The same wrapper serves both the custom Leaky ReLU and built-in activations
a = keras.layers.Activation(lambda x: my_activation(x, 'LeakyReLU'))(a)
a = keras.layers.Activation(lambda x: my_activation(x, 'sigmoid'))(a)
a = keras.layers.Activation(lambda x: my_activation(x, 'tanh'))(a)
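
As a quick sanity check, one can wrap the stack in a Model and push a dummy batch through (a sketch; the Model wrapping and the zero input are illustrative, not part of the answer):

import numpy as np

model = keras.Model(inputs=input_data, outputs=a)
model.summary()
# A dummy batch confirms the lambda-wrapped activations build and run
print(model.predict(np.zeros((1, 3))).shape)  # -> (1, 2)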
mujjiga
  • Yes, it worked! Thanks a lot. I guess I had missed importing the libraries and mixed up the capitalization. – quarkz Apr 07 '20 at 14:13