
I want to write a custom activation function with keras.backend for the last Dense layer of an LSTM model, like this:

def customactivation(x):
    if x <= 0.5:
        return 0
    else:
        return 1



model.add(Dense(1, activation=customactivation))

What should I do?

Sheida
  • Does this answer your question? [How do you create a custom activation function with Keras?](https://stackoverflow.com/questions/43915482/how-do-you-create-a-custom-activation-function-with-keras) – jyr Feb 13 '20 at 15:58

1 Answer


This function is not differentiable, so it will be useless for training unless you know what you're doing. You will get the error "An operation has `None` for gradient".

That said:

from keras import backend as K

def customactivation(x):
    # Hard threshold: 1.0 where x > 0.5, else 0.0 (no usable gradient)
    return K.cast(K.greater(x, 0.5), K.floatx())
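As a quick sanity check of what this threshold does element-wise (a minimal sketch, assuming `tensorflow.keras`; the standalone `keras` backend behaves the same):

```python
from tensorflow.keras import backend as K

def customactivation(x):
    # 1.0 where x > 0.5, else 0.0 -- note 0.5 itself maps to 0.0
    return K.cast(K.greater(x, 0.5), K.floatx())

out = K.eval(customactivation(K.constant([0.2, 0.5, 0.7])))
print(out)  # [0. 0. 1.]
```

Because the output is piecewise constant, its gradient is zero (or undefined) everywhere, which is why training through it fails.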
Daniel Möller
  • yeah I used this but it gave me this error: ValueError: An operation has `None` for gradient. Please make sure that all of your ops have a gradient defined (i.e. are differentiable). Common ops without gradient: K.argmax, K.round, K.eval. – Sheida Feb 13 '20 at 15:45
  • Yes, exactly what was expected. – Daniel Möller Feb 13 '20 at 16:53
  • after add this: `get_custom_objects().update({'custom_activation': Activation(custom_activation)})` , it works! (it was mentioned in [link](https://stackoverflow.com/questions/43915482/how-do-you-create-a-custom-activation-function-with-keras) ) – Sheida Feb 13 '20 at 22:36
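The registration step mentioned in the last comment can be sketched like this (assuming `tensorflow.keras`; the name `custom_activation` follows the linked answer and lets the layer be referenced by string, e.g. `activation='custom_activation'`):

```python
from tensorflow.keras import backend as K
from tensorflow.keras.layers import Activation
from tensorflow.keras.utils import get_custom_objects

def custom_activation(x):
    # Same hard threshold as above: 1.0 where x > 0.5, else 0.0
    return K.cast(K.greater(x, 0.5), K.floatx())

# Register under a string name so Keras can resolve it during
# model construction and when loading a saved model
get_custom_objects().update({'custom_activation': Activation(custom_activation)})
```

Registration only resolves the name lookup; it does not make the function differentiable, so the gradient caveat from the answer still applies.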