
I am building a deconvolution network and would like to add a layer to it that is the reverse of a softmax. I wrote a basic Python function that returns the inverse of a softmax for a given matrix, wrapped it in a TensorFlow Lambda layer, and added it to my model. I get no error, but when I run a predict I only get zeros as output. When I don't add this layer to my network, the output is something other than zeros, so the zeros must come from my inv_softmax function, which is wrong somewhere. Can you enlighten me on how to proceed?
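For reference, the relation I am trying to invert (the same derivation as in my comment under the answer below): softmax computes

    S_i = exp(x_i) / SUM_j exp(x_j)

so the logits can be recovered, up to the additive constant log(SUM_j exp(x_j)), as

    x_i = log(S_i) + log(SUM_j exp(x_j))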

I define my function like this:

import numpy as np
import tensorflow as tf

def inv_softmax(x):
    C = 0
    S = np.zeros((1, 1, 10))  # (1,1,10) is the shape of the data this layer receives
    try:
        # C accumulates SUM_j exp(x_j) over the last axis
        for j in range(np.max(np.shape(x))):
            C += np.exp(x[0, 0, j])
        # intended inverse: x_i = log(S_i) + log(C); note C here is the raw sum, not its log
        for i in range(np.max(np.shape(x))):
            S[0, 0, i] = np.log(x[0, 0, i]) + C
    except ValueError:
        print("ValueError in inv_softmax")
    S = tf.convert_to_tensor(S, dtype=tf.float32)
    return S

I add it to my network like this:

x = ...
x = layers.Lambda(lambda x: inv_softmax(x), name='inv_softmax', output_shape=[1,1,10])(x)
x = ...

If you need more of my code or other information, please ask.

Benjamin
  • Another piece of information (maybe useful): when I have an array of shape (1,1,10), for example with values in it, and I do tf.convert_to_tensor(my_array, dtype=tf.float32) and print it, I only get "tf.Tensor 'Const' shape=(1,1,10) dtype=float32" and no values. When I check on the internet, people get the values when they print it. – Benjamin Nov 04 '20 at 09:08
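(Side note on that comment: this is the expected behavior in graph mode. Without eager execution, as in TF 0.x/1.x, printing a tensor shows only its metadata; the values only exist once the graph is run in a session. A minimal check, assuming the TF 1.x-style session API and a hypothetical my_array:)

import numpy as np
import tensorflow as tf

my_array = np.random.rand(1, 1, 10)  # hypothetical example array
t = tf.convert_to_tensor(my_array, dtype=tf.float32)
print(t)  # graph mode: prints only shape/dtype metadata

with tf.Session() as sess:  # TF 0.x/1.x API
    print(sess.run(t))      # prints the actual values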

2 Answers


Try this:

import math
import tensorflow as tf

def inv_softmax(x, C):
    # x holds softmax outputs; C = log(sum of exp(logits)) saved from the forward pass
    return tf.math.log(x) + C

input = tf.keras.layers.Input(shape=(1, 10))
x = tf.keras.layers.Lambda(lambda x: inv_softmax(x, math.log(10.)), name='inv_softmax')(input)
model = tf.keras.Model(inputs=input, outputs=x)

# sanity check: softmax of ten zero logits is uniform, and sum(exp(0)) = 10,
# so the layer should recover the original zeros
a = tf.zeros([1, 1, 10])
a = tf.nn.softmax(a)
a = model(a)
print(a.numpy())
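To see why C = math.log(10.) is the right constant here: for zero logits, SUM_j exp(0) = 10. More generally, a round-trip check (my own sketch, not part of the original answer) computes C as the log-sum-exp of the logits:

import tensorflow as tf

# hypothetical logits for a round-trip test
logits = tf.constant([[[1.0, 2.0, 3.0, 0.0, -1.0, 0.5, 2.2, -0.3, 0.7, 1.1]]])
probs = tf.nn.softmax(logits)

# invert: log(probs) + log(sum of exp(logits)) recovers the logits
C = tf.math.reduce_logsumexp(logits)
recovered = tf.math.log(probs) + C
print(tf.reduce_max(tf.abs(recovered - logits)).numpy())  # ~0 up to float error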
Andrey
  • I can invert the softmax if I store the x_i when they go into the softmax in my convolution network (that's not what I did; thanks for the remark, I'll change it). My x_i are always different from 0 and 1. But the problem of implementing my function in the model stays the same; I'm doing something wrong, I think. S_i = exp(x_i)/SUM(exp(x_j)) <=> x_i = ln(S_i) + ln(SUM(exp(x_j))), and ln(SUM(exp(x_j))) is a constant that I compute when applying the softmax in my conv model. – Benjamin Nov 04 '20 at 08:49
  • My tensorflow has no attribute math :( – Benjamin Nov 04 '20 at 09:38
  • What is your tf version? – Andrey Nov 04 '20 at 09:44
  • v0.12.0rc1, and I can't change it – Benjamin Nov 04 '20 at 09:45
  • Try np.log(). In my tf (ver. 2.3.1) it is impossible – Andrey Nov 04 '20 at 09:56
  • Can I use K.log from keras.backend? The model compiles with K.log; np.log didn't work – Benjamin Nov 04 '20 at 09:59

Thanks, it works! I put:

import keras.backend as K

def inv_softmax(x, C):
    # here C is the raw sum of exp(logits); its log is taken inside
    return K.log(x) + K.log(C)
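For completeness, a sketch of how this plugs into the Lambda layer from the question (my own wiring, not from the answer; C = 10.0 is a hypothetical value, the sum of exponentials stored when the forward softmax was applied):

from keras.layers import Lambda

C = 10.0  # hypothetical: sum of exp(logits) saved from the forward pass (10 for ten zero logits)
x = Lambda(lambda t: inv_softmax(t, C), name='inv_softmax', output_shape=[1, 1, 10])(x)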
Benjamin