I have seen in some places that you need to define the derivative of your custom activation function yourself. Is this true? Or is all you need to do to pass a TensorFlow-compatible function to the `Activation` wrapper, and tensorflow.keras takes care of the rest (presumably via automatic differentiation)?
E.g.
from tensorflow.keras.layers import Activation

def my_actv(x):
    return x * x

model.add(Activation(my_actv))
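Here is a fuller, self-contained sketch of what I mean (the model shape and layer sizes are just placeholders): I define only the forward pass of `my_actv`, wrap it in `Activation`, and then check with `tf.GradientTape` whether TensorFlow differentiates it for me without any hand-written derivative.

```python
import tensorflow as tf
from tensorflow.keras.layers import Activation, Dense
from tensorflow.keras.models import Sequential

def my_actv(x):
    # Only the forward pass is defined here; no derivative is supplied.
    return x * x

# Pass the plain function to the Activation wrapper, as in my question.
model = Sequential([
    tf.keras.Input(shape=(3,)),
    Dense(4),
    Activation(my_actv),
    Dense(1),
])
model.compile(optimizer="sgd", loss="mse")

# Gradient check: does autodiff handle d(x*x)/dx = 2x on its own?
x = tf.constant(3.0)
with tf.GradientTape() as tape:
    tape.watch(x)
    y = my_actv(x)
print(tape.gradient(y, x))
```

If the gradient comes out as `2x` here, does that confirm I never need to register a derivative manually for ops TensorFlow already knows how to differentiate?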