So I have a Keras model. I want to take the gradient of the model with respect to its inputs. Here's what I do:
import numpy as np
import tensorflow as tf
from keras.models import Sequential
from keras.layers import Dense
from keras import backend as K
num_features = 5
model = Sequential()
model.add(Dense(60, input_shape=(num_features,), activation='relu'))
model.add(Dense(50, activation='relu'))
model.add(Dense(1, activation='softmax'))
model.compile(optimizer='adam', loss='binary_crossentropy')
# Run predict to initialize weights
model.predict(np.random.rand(1, num_features))
x = tf.random_uniform(shape=(1, num_features))
model_grad = tf.gradients(model(x), x)[0]
However, when I print out the value of model_grad, I get all 0's:
sess = K.get_session()
print( model_grad.eval(session=sess) )
>>>array([[ 0., 0., 0., 0., 0.]], dtype=float32)
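To rule out a problem with my gradient code itself, I also wrote a quick finite-difference cross-check in plain NumPy (no Keras involved; `finite_diff_grad` and the test function are just my own helpers here). The idea is that any gradient TensorFlow reports should roughly match a central-difference estimate:

```python
import numpy as np

def finite_diff_grad(f, x, eps=1e-4):
    """Central-difference approximation of df/dx for a scalar-valued f."""
    grad = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e.flat[i] = eps
        # Perturb one coordinate at a time and take the symmetric difference
        grad.flat[i] = (f(x + e) - f(x - e)) / (2 * eps)
    return grad

# Sanity check on a known function: f(x) = sum(x**2), whose gradient is 2*x
x0 = np.array([1.0, 2.0, 3.0])
g = finite_diff_grad(lambda x: np.sum(x ** 2), x0)
```

On my model the finite-difference estimate is also (approximately) zero everywhere, so the output really does seem flat in the input.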
Anyone know what I'm doing wrong?