
I have constructed a regression-type neural net (NN) with dropout in TensorFlow. I would like to know whether it is possible to find out which hidden units of the previous layer were dropped and save that information to the output file, so that we could reproduce the NN's results in C++ or Matlab.

The following is an example of a TensorFlow model. There are three hidden layers and one output layer. After the 3rd sigmoid layer, there is a dropout with keep probability 0.9. I would like to know whether it is possible to find out which hidden units in the 3rd sigmoid layer are dropped.

def multilayer_perceptron(_x, _weights, _biases):
    layer_1 = tf.nn.sigmoid(tf.add(tf.matmul(_x, _weights['h1']), _biases['b1']))
    layer_2 = tf.nn.sigmoid(tf.add(tf.matmul(layer_1, _weights['h2']), _biases['b2']))
    layer_3 = tf.nn.sigmoid(tf.add(tf.matmul(layer_2, _weights['h3']), _biases['b3']))
    # keep_prob = 0.9: each unit of layer_3 is kept with probability 0.9
    layer_d = tf.nn.dropout(layer_3, 0.9)
    return tf.matmul(layer_d, _weights['out']) + _biases['out']

Thank you very much!

  • The answer is "yes, it is possible, and yes, it is definitely possible in Matlab, but it is also possible in Python". Before TensorFlow, most DNNs were simulated in Matlab, so you could probably get some code that will run, but the problem is that you will probably not be able to replicate TensorFlow exactly - therefore your simulations might give you different results than running with TensorFlow. – GameOfThrows May 26 '16 at 14:56

2 Answers


There is a way to get the mask of 0s and 1s, of shape layer_3.get_shape(), that is produced by tf.nn.dropout().

The trick is to give a name to your dropout operation:

layer_d = tf.nn.dropout(layer_3, 0.9, name='my_dropout')

Then you can retrieve the mask through the TensorFlow graph:

graph = tf.get_default_graph()
mask = graph.get_tensor_by_name('my_dropout/Floor:0')

The tensor mask will have the same shape and type as layer_d, and will only contain the values 0 and 1, where 0 corresponds to the dropped neurons.
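To actually inspect the mask you still have to evaluate it in a session. A minimal sketch (assuming your network input is fed through a placeholder x and batch_x is a NumPy array of inputs; both names are only for illustration, substitute whatever your code uses):

sess = tf.Session()
sess.run(tf.initialize_all_variables())  # initialize the weight and bias variables

# The mask is resampled on every run, because dropout draws new random values each time.
mask_value = sess.run(mask, feed_dict={x: batch_x})
print(mask_value)  # 0.0 where a unit was dropped, 1.0 where it was kept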

  • Thank you for the quick reply. I tried your code and I would like to know how to print 'mask'. I tried `print mask` but it gives me `Tensor("my_dropout/Floor:0", shape=(?, ?), dtype=float32)`. – world2005 May 26 '16 at 15:30
  • You have to use `sess = tf.Session()` and `sess.run(mask)` to get its value. **Warning**, the value will be different at each call because of randomization. – Olivier Moindrot May 26 '16 at 15:41
  • When I use `sess.run(mask)`, it requires me to input a value for `feed_dict`. May I know what I should input here? – world2005 May 26 '16 at 15:47
  • You should input the values corresponding to the `tf.placeholder()` in your code. I suggest you read some tutorials like [this one](https://www.tensorflow.org/versions/r0.8/tutorials/mnist/pros/index.html) to better understand how TensorFlow works. – Olivier Moindrot May 26 '16 at 15:56
  • Thanks a lot! I misunderstood the definition of dropout in NN. Now the model works perfectly! – world2005 May 26 '16 at 16:16

Simple and idiomatic solution (although possibly slightly slower than Olivier's):

# generate a dropout mask of the same shape as the layer
# (the second argument is the keep probability, matching the TF 1.x API used above)
mask = tf.nn.dropout(tf.ones_like(layer), keep_prob)

# apply the mask
dropped_layer = layer * mask
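Note that the kept entries of mask are not exactly 1 but 1/keep_prob, because tf.nn.dropout rescales the surviving units (inverted dropout). Multiplying the layer by this mask therefore reproduces exactly what tf.nn.dropout(layer, keep_prob) computes. Applied to the question's code, a sketch could look like this (reusing layer_3 and the keep probability 0.9 from the question; in TF 2.x the second argument is the drop rate, so it would be 0.1 instead of 0.9):

mask = tf.nn.dropout(tf.ones_like(layer_3), 0.9)  # entries are 0.0 (dropped) or 1/0.9 (kept)
layer_d = layer_3 * mask                          # equivalent to tf.nn.dropout(layer_3, 0.9)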