
I see in some places that you need to define the derivative function for your custom activation. Is this true? or is all you need to do just pass a tensorflow-compatible function to the wrapper and tensorflow.keras takes care of the rest?

I.e.

def my_actv(x):
    return x * x

model.add(Activation(my_actv))
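For context, here is a minimal end-to-end sketch of what the question describes: passing a plain Python function (built from TensorFlow ops) to the `Activation` wrapper. The model shape and data are made up purely for illustration.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras.layers import Activation, Dense

def my_actv(x):
    # Built entirely from TF ops, so autodiff can trace it
    return x * x

# Hypothetical toy model, just to show the wiring
model = tf.keras.Sequential([
    tf.keras.Input(shape=(3,)),
    Dense(4),
    Activation(my_actv),
    Dense(1),
])
model.compile(optimizer="sgd", loss="mse")

# One training step succeeds: gradients through my_actv are
# computed automatically; no derivative was defined anywhere
x = np.random.rand(8, 3).astype("float32")
y = np.random.rand(8, 1).astype("float32")
model.train_on_batch(x, y)
```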
Ben Arnao
  • You don't have to implement a derivative of the activation function. However, check this [question](https://stackoverflow.com/questions/43915482/how-do-you-create-a-custom-activation-function-with-keras) – Zaher88abd Jan 23 '20 at 19:25
  • I have seen that answer. What are you telling me to look at, and why do some sources mention defining the derivative function? – Ben Arnao Jan 24 '20 at 19:18

1 Answer


A derivative only needs to be defined when your function is not differentiable at every point and you want to control the gradient there; otherwise TensorFlow's automatic differentiation takes care of it. For example, relu is not differentiable at zero, so a gradient value has to be chosen at that point.
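When you do want to control the gradient yourself (for instance, to pick a specific subgradient at a kink), `tf.custom_gradient` is one way to do it. The relu-like function below is a sketch: the name `my_relu` and the choice of gradient 0 at x == 0 are illustrative, not from the original post.

```python
import tensorflow as tf

@tf.custom_gradient
def my_relu(x):
    y = tf.maximum(x, 0.0)
    def grad(upstream):
        # Explicitly choose the subgradient at x == 0 (here: 0)
        return upstream * tf.cast(x > 0.0, x.dtype)
    return y, grad

x = tf.constant([-1.0, 0.0, 2.0])
with tf.GradientTape() as tape:
    tape.watch(x)
    y = my_relu(x)
g = tape.gradient(y, x)  # gradient at the kink is 0 by our choice
```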

Susmit Agrawal