I would like to add a node-specific variable to each Keras activation function, so that each node computes its activation value (output) with a different value of alpha.
This can be done globally, for example with the alpha parameter of the relu activation function (link):
# Build Model
...
model.add(Dense(units=128))
model.add(Activation(lambda x: keras.activations.relu(x, alpha=0.1)))
...
I can also write a custom activation function, but the alpha parameter is also global (link):
# Custom activation function
from tensorflow.keras import backend as K

def custom_activation(x, alpha=0.0):
    return K.sigmoid(x + alpha)
# Build Model
...
model.add(Dense(units=128))
model.add(Activation(lambda x: custom_activation(x, alpha=0.1)))
...
Inside the custom function, I currently only have access to the following variables:
(Pdb) locals()
{'x': <tf.Tensor 'dense/Identity:0' shape=(None, 128) dtype=float32>, 'alpha': 0.1}
I would like to use a custom activation function, but with alpha unique to each node in the network. For example, if there are 128 units in the layer, then I would also like there to be 128 values of alpha, one for each unit / node. I would then like the activation function to apply each unit's own alpha when computing that unit's output.
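To illustrate what I mean, I imagine the behaviour would look something like this (a rough sketch with made-up alpha values, not what I actually want to hard-code):

import tensorflow as tf
from tensorflow.keras import backend as K

# x has shape (batch_size, 128); alphas has shape (128,),
# so adding them broadcasts one alpha per unit.
x = tf.zeros((32, 128))                                # dummy batch of activations
alphas = tf.constant([0.01 * i for i in range(128)])   # one alpha per unit (made-up values)
out = K.sigmoid(x + alphas)                            # each column uses its own alpha
print(out.shape)                                       # (32, 128)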
How do I create an alpha value that is unique to each unit / node in a layer?
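One direction I have considered (sketched below, untested; PerNodeSigmoid is just a placeholder name) is a custom layer that creates one trainable alpha per unit, similar to how PReLU stores one slope per channel:

import tensorflow as tf
from tensorflow.keras import backend as K
from tensorflow.keras.layers import Layer

class PerNodeSigmoid(Layer):
    """Hypothetical activation layer with one trainable alpha per unit."""

    def build(self, input_shape):
        # One alpha per node in the incoming layer, e.g. shape (128,).
        self.alpha = self.add_weight(
            name="alpha",
            shape=(input_shape[-1],),
            initializer="zeros",
            trainable=True,
        )
        super().build(input_shape)

    def call(self, inputs):
        # Broadcasts so each unit is shifted by its own alpha.
        return K.sigmoid(inputs + self.alpha)

# Usage would then replace the Activation layer:
# model.add(Dense(units=128))
# model.add(PerNodeSigmoid())

I am not sure whether this is the idiomatic way to do it in Keras, or whether the per-node alpha can instead live inside the activation function itself, which is why I am asking.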