
My TensorFlow version is 2.4.0, but I am running code written in the TensorFlow 1 style (sess.run). I want to change my network's activation function to the custom one below, which is defined in the same class:

 def new_relu10000(self, x, k=1):
    # 1.0 where 0 <= x < 10000, else 0.0
    part_1 = tf.cast(tf.math.logical_and(tf.math.less_equal(0.0, x), tf.math.less(x, 10000.0)), dtype=tf.float32)
    # 1.0 where x >= 10000, else 0.0
    part_3 = tf.cast(tf.math.less_equal(10000.0, x), dtype=tf.float32)
    # slope k on [0, 10000), capped at 10000 above
    return part_1 * x * k + part_3 * 10000.0
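A minimal standalone check (a sketch assuming TF 2.x in eager mode, with new_relu10000 copied here as a free function) shows that autodiff already produces a sensible gradient for this function, because it is built entirely from TensorFlow ops; the comparison/cast path contributes no gradient, so only the explicit x term does:

```python
import tensorflow as tf

def new_relu10000(x, k=1):
    # 1.0 where 0 <= x < 10000, else 0.0
    part_1 = tf.cast(tf.math.logical_and(tf.math.less_equal(0.0, x),
                                         tf.math.less(x, 10000.0)),
                     dtype=tf.float32)
    # 1.0 where x >= 10000, else 0.0
    part_3 = tf.cast(tf.math.less_equal(10000.0, x), dtype=tf.float32)
    return part_1 * x * k + part_3 * 10000.0

x = tf.constant([-5.0, 50.0, 20000.0])
with tf.GradientTape() as tape:
    tape.watch(x)
    y = new_relu10000(x)
grad = tape.gradient(y, x)
print(y.numpy())     # [    0.    50. 10000.]
print(grad.numpy())  # [0. 1. 0.] -- slope k inside [0, 10000), 0 outside
```

The gradient is k in the linear region and 0 outside it, which is exactly what you would write by hand for this piecewise function.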

and my network is defined as:

 def _build_a(self, s, scope, trainable):
        with tf.compat.v1.variable_scope(scope):
            net = tf.compat.v1.layers.dense(s, 400, activation=tf.nn.relu, name='l1', trainable=trainable)
            net2 = tf.compat.v1.layers.dense(net, 300, activation=tf.nn.relu, name='l2', trainable=trainable)
            a = tf.compat.v1.layers.dense(net2, self.a_dim, activation=self.new_relu10000, name='a', trainable=trainable)
            return a

This code compiles and runs successfully (the output of a never exceeds 10000). However, I am concerned because I have seen other questions saying that you need to define a gradient for a custom activation function, such as How to make a custom activation function with only Python in Tensorflow?. On the other hand, an answer to Tensorflow custom activation function says that automatic differentiation handles the gradient as long as you stay in tensor ops, while another answer on that same question says you need to define weights for the custom activation function.

So I want to ask: is my activation function above defined correctly, so I can just use it directly? Or do I need to define the gradient part, as in this question's answer: How to make a custom activation function with only Python in Tensorflow? Or do I need to create weights for the activation function, as in the solution here: Tensorflow custom activation function?
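For reference, if you ever did want to override autodiff (for example, to change how the flat regions behave), this is roughly what an explicit gradient looks like with tf.custom_gradient. This is a hedged sketch, not something the function in the question requires; here the hand-written gradient simply reproduces what autodiff already computes, with k fixed at 1.0 for simplicity:

```python
import tensorflow as tf

@tf.custom_gradient
def new_relu10000_custom(x):
    k = 1.0
    # 1.0 where 0 <= x < 10000, else 0.0
    in_range = tf.cast((x >= 0.0) & (x < 10000.0), tf.float32)
    # 1.0 where x >= 10000, else 0.0
    above = tf.cast(x >= 10000.0, tf.float32)
    y = in_range * x * k + above * 10000.0

    def grad(dy):
        # same gradient autodiff would give: k inside [0, 10000), 0 outside
        return dy * in_range * k

    return y, grad

x = tf.constant([-5.0, 50.0, 20000.0])
with tf.GradientTape() as tape:
    tape.watch(x)
    y = new_relu10000_custom(x)
grad = tape.gradient(y, x)
print(grad.numpy())  # [0. 1. 0.]
```

Weights (as in the parametric-activation answers linked above) are only needed when the activation has trainable parameters of its own; a fixed piecewise function like this one has none.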
