
I would like to know if it's possible to define a custom activation function, like tanh, that constrains the 12 outputs of a dense (fully connected) layer like this:

If X[0:6] > 1 then X[0:6] else 1
If X[6:12] < 0 then X[6:12] else 0

X is my denseLayer.outputs
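To make the goal concrete, here is a minimal sketch of the behavior I am after, assuming TensorFlow 2.x / Keras and a dense layer with 12 units (the name split_activation and the slice boundaries are just my example):

    import tensorflow as tf

    def split_activation(x):
        # x: dense layer output with shape (batch, 12)
        # First 6 units: keep the value if it is greater than 1, otherwise 1
        first = tf.maximum(x[:, 0:6], 1.0)
        # Last 6 units: keep the value if it is smaller than 0, otherwise 0
        second = tf.minimum(x[:, 6:12], 0.0)
        return tf.concat([first, second], axis=-1)

    # e.g. passed as the activation of the dense layer itself
    dense_layer = tf.keras.layers.Dense(12, activation=split_activation)

Since tf.maximum, tf.minimum and tf.concat all have registered gradients, I assume autodiff would handle backpropagation without a hand-written custom gradient, but I am not sure this is the right approach.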

I found some tutorials on creating a custom activation function with a gradient, such as LeakyRelu6 on Medium for example, but the constraint is always applied to all of the outputs.

In the tutorials, what they do is:

If X >= 0 and X <= 6 then X
Elif X > 6 then 6
Else 0.2*X
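For reference, my reading of that rule as plain TensorFlow ops (my own paraphrase, not the tutorial's exact code; the tutorial derives the gradient by hand, whereas these built-in ops already have gradients):

    import tensorflow as tf

    def leaky_relu6(x):
        # 6 where x > 6; x where 0 <= x <= 6; 0.2 * x where x < 0
        return tf.where(x > 6.0,
                        tf.ones_like(x) * 6.0,
                        tf.where(x >= 0.0, x, 0.2 * x))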

To sum up, I would like to understand how I can write an activation function that constrains X as in my example.

The link: https://medium.com/@chinesh4/custom-activation-function-in-tensorflow-for-deep-neural-networks-from-scratch-tutorial-b12e00652e24

