
I am experimenting with replacing the Keras sigmoid function with a piecewise linear function defined as:

def custom_activation_4(x):
    if x < -6:
        return 0
    elif x >= -6 and x < -4:
        return (0.0078*x + 0.049)
    elif x >= -4 and x < 0:
        return (0.1205*x + 0.5)
    elif x >= 0 and x < 4:
        return (0.1205*x + 0.5)
    elif x >= 4 and x < 6:
        return (0.0078*x + 0.951)
    else:
        return 1

When I try to run this as:

classifier_4.add(Dense(output_dim = 18, init = 'uniform', activation = custom_activation_4, input_dim = 9))

TensorFlow throws an error saying:

Using a `tf.Tensor` as a Python `bool` is not allowed.

I researched this and learned that I am treating the variable x as a plain Python variable, whereas it is actually a tensor, which is why it cannot be evaluated like a simple boolean. I also tried using TensorFlow's cond method. How do I treat and use x as a tensor here? Thanks a ton in advance for all the help.

  • You need to use something like `tf.cond` instead of `if` statements. Something like `v = tf.cond(x<-6, 0, 0.0078*x + 0.049)`. You may need to use `from keras import backend as K` and then use `K.cond` instead of `tf.cond`. – user12075 Sep 21 '18 at 05:38
  • but `tf.cond` will not give the if else kind of check right? `tf.cond(x<-6, 0, 0.0078*x + 0.049)` here if x is greater than -6 and also greater than -4 it will still give `0.0078*x + 0.049` instead of `0.1205*x + 0.5`. Any ideas? – Swapnil B. Sep 21 '18 at 05:46
  • That's only an example, not the complete code of course. You have multiple conditions so you need to nest them together, like `tf.cond(x<-6, 0, tf.cond(x<-4, 0.0078*x + 0.049, tf.cond(x<0, 0.1205*x + 0.5, tf.cond(blablabla))))` – user12075 Sep 21 '18 at 05:53
  • @user12075 yeah thanks I wrote the same thing. Thanks for a quick reply. – Swapnil B. Sep 21 '18 at 05:55
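
For reference, the nested `tf.cond` idea from these comments would look roughly like the sketch below (a hypothetical `piecewise_scalar` helper, not the asker's actual code). Note that `tf.cond` takes callables for its branches and evaluates a single scalar predicate, so it only suits a scalar tensor; for applying the function elementwise to a whole layer output, `tf.where` (used in the answers below) is the better fit.

import tensorflow as tf

# Sketch of the nested tf.cond approach discussed in the comments above.
# tf.cond takes callables for its branches and one scalar boolean predicate,
# so this only works on a scalar tensor x, not a whole activation tensor.
def piecewise_scalar(x):
    return tf.cond(x < -6, lambda: tf.constant(0.0),
        lambda: tf.cond(x < -4, lambda: 0.0078*x + 0.049,
            lambda: tf.cond(x < 4, lambda: 0.1205*x + 0.5,
                lambda: tf.cond(x < 6, lambda: 0.0078*x + 0.951,
                    lambda: tf.constant(1.0)))))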

2 Answers


Your custom activation is written as a function of a single floating-point number, but you want to apply it to a whole tensor. The best way to do that is to use `tf.where`. Something like:

def custom_activation_4(x):
  orig = x
  x = tf.where(orig < -6, tf.zeros_like(x), x)
  x = tf.where(orig >= -6 and orig < -4, (0.0078*x + 0.049), x)
  x = tf.where(orig >= -4 and orig < 0, (0.1205*x + 0.5), x)
  x = tf.where(orig >= 0 and orig < 4, (0.1205*x + 0.5), x)
  x = tf.where(orig >= 4 and orig < 6, (0.0078*x + 0.951), x)
  return tf.where(orig >= 6, tf.ones_like(x), x)
Alexandre Passos
  • thank you so much for this brilliant answer. I so wish I had gotten this two days ago. I managed to do the function using Keras backend and tensor arithmetic. – Swapnil B. Sep 24 '18 at 23:18
  • In this solution, x gets updated at every step and a new x value is computed, which is not right. Once the x value is updated, it should go to the return line and exit the function. – Ganesh M S Sep 27 '19 at 02:53
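
Regarding the comment above about doing it "using Keras backend and tensor arithmetic": the asker's actual code isn't shown, but one sketch of that style is to cast each interval condition to a 0/1 mask and sum the masked linear pieces:

from tensorflow.keras import backend as K

# Hypothetical sketch (not the asker's actual code): `&` on boolean tensors
# maps to tf.logical_and (Python's `and` would fail), and K.cast turns each
# interval condition into a 0/1 mask that selects the matching linear piece.
def custom_activation_4(x):
    m2 = K.cast((x >= -6) & (x < -4), K.floatx())
    m3 = K.cast((x >= -4) & (x < 4), K.floatx())
    m4 = K.cast((x >= 4) & (x < 6), K.floatx())
    m5 = K.cast(x >= 6, K.floatx())
    return (m2 * (0.0078*x + 0.049)
            + m3 * (0.1205*x + 0.5)
            + m4 * (0.0078*x + 0.951)
            + m5)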

I tested the code in the other answer because I intend to write a similar activation function, but the following error happened:

TypeError: Using a `tf.Tensor` as a Python `bool` is not allowed. Use `if t is not None:` instead of `if t:` to test if a tensor is defined, and use TensorFlow ops such as `tf.cond` to execute subgraphs conditioned on the value of a tensor

The reason is that we cannot use Python logical operators such as `and` on a `tf.Tensor`. So I did some searching in the TensorFlow docs, and it turns out that we have to use TensorFlow's own ops instead, such as `tf.math.logical_and`. Here is my code, which is very similar to yours:

import tensorflow as tf
from tensorflow.keras import backend as K
from tensorflow.keras.layers import Layer


class QPWC(Layer):

    def __init__(self, sharp=100, **kwargs):
        super(QPWC, self).__init__(**kwargs)
        self.supports_masking = True
        self.sharp = K.cast_to_floatx(sharp)

    def call(self, inputs):
        # Keep the original values for the conditions; tf.math.logical_and
        # replaces Python's `and`, which is not allowed on tensors.
        orig = inputs
        inputs = tf.where(orig <= 0.0, tf.zeros_like(inputs), inputs)
        inputs = tf.where(tf.math.logical_and(tf.greater(orig, 0), tf.less(orig, 0.25)), 0.25 / (1+tf.exp(-self.sharp*((inputs-0.125)/0.5))), inputs)
        inputs = tf.where(tf.math.logical_and(tf.greater(orig, 0.25), tf.less(orig, 0.5)), 0.25 / (1+tf.exp(-self.sharp*((inputs-0.5)/0.5))) + 0.25, inputs)
        inputs = tf.where(tf.math.logical_and(tf.greater(orig, 0.5), tf.less(orig, 0.75)), 0.25 / (1+tf.exp(-self.sharp*((inputs-0.75)/0.5))) + 0.5, inputs)
        return tf.where(tf.greater(orig, 0.75), tf.ones_like(inputs), inputs)

    def get_config(self):
        config = {'sharp': float(self.sharp)}
        base_config = super(QPWC, self).get_config()
        return dict(list(base_config.items()) + list(config.items()))

    def compute_output_shape(self, input_shape):
        return input_shape
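
As a usage note, here is a sketch of how such a custom layer could be dropped into a model (assuming the `tf.keras` Sequential API; the layer sizes are only illustrative):

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

# Example model using the QPWC layer as the activation after a Dense layer.
model = Sequential([
    Dense(18, input_dim=9),
    QPWC(sharp=100),
    Dense(1, activation='sigmoid'),
])
model.compile(optimizer='adam', loss='binary_crossentropy')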

Theron