
The activation function in my CNN has the form:

f(x) = 1.716 tanh(0.667 x)                                 if |x| < tau
f(x) = 1.716 [tanh(2 tau/3) + tanh'(2 tau/3) (x - tau)]    if x >= tau
f(x) = 1.716 [tanh(-2 tau/3) + tanh'(-2 tau/3) (x + tau)]  if x <= -tau

where tau is a constant and tanh' denotes the derivative of tanh.
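To make the definition concrete, here is a plain NumPy sketch of the same piecewise function (assuming tau > 0; the names f_ref and tau are mine, not part of the original definition):

import numpy as np

def f_ref(x, tau):
    # tanh'(u) = 1 - tanh(u)**2, and tanh' is even,
    # so both linear branches share the same slope
    slope = 1.0 - np.tanh(2.0 * tau / 3.0) ** 2
    return np.where(x >= tau,
                    1.716 * (np.tanh(2.0 * tau / 3.0) + slope * (x - tau)),
                    np.where(x <= -tau,
                             1.716 * (np.tanh(-2.0 * tau / 3.0) + slope * (x + tau)),
                             1.716 * np.tanh(0.667 * x)))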

I know that in TensorFlow it is possible to write your own activation function, but I don't want to write it in C++ and recompile the whole of TensorFlow.

How can I implement it using the functions already available in TensorFlow?

DL-Lily

1 Answer


In TensorFlow it is easy to write your own activation function as long as it can be composed from existing ops. For your case you can use tf.case:

# tanh'(u) = 1 - tanh(u)**2 gives the slope of the linear branches
f = tf.case({tf.less(tf.abs(x), tau): lambda: 1.716 * tf.tanh(0.667 * x),
             tf.greater_equal(x, tau): lambda: 1.716 * (tf.tanh(2 * tau / 3) + (1 - tf.tanh(2 * tau / 3) ** 2) * (x - tau))},
            default=lambda: 1.716 * (tf.tanh(-2 * tau / 3) + (1 - tf.tanh(-2 * tau / 3) ** 2) * (x + tau)),
            exclusive=True)
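One caveat: tf.case expects scalar boolean predicates, so if x is a whole tensor of pre-activations you may want an element-wise formulation with tf.where instead. A minimal sketch under that assumption (the function name my_activation and the default tau value are my own choices):

import tensorflow as tf

def my_activation(x, tau=1.0):
    # slope of the linear extensions: tanh'(u) = 1 - tanh(u)**2;
    # tanh' is even, so both outer branches share the same slope
    slope = 1.0 - tf.tanh(2.0 * tau / 3.0) ** 2
    middle = 1.716 * tf.tanh(0.667 * x)
    upper = 1.716 * (tf.tanh(2.0 * tau / 3.0) + slope * (x - tau))
    lower = 1.716 * (tf.tanh(-2.0 * tau / 3.0) + slope * (x + tau))
    # pick element-wise between the three branches
    return tf.where(x >= tau, upper, tf.where(x <= -tau, lower, middle))

Because all three branch tensors have the same shape as x, tf.where selects per element and gradients flow through whichever branch is active at each position.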
Ishant Mrinal