
I've created a simple convolutional neural network with TensorFlow. When I use input images with edge = 32px the network works fine, but if I double the edge to 64px the cross-entropy returns NaN. How can I fix that?

The CNN structure is pretty simple: input->conv->pool2->conv->pool2->conv->pool2->fc->softmax
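
For context, a minimal TF1-style sketch of that layout might look like the following; the filter sizes, channel counts, and number of classes are placeholder assumptions, not my exact code:

import tensorflow as tf

def weight(shape):
    return tf.Variable(tf.truncated_normal(shape, stddev=0.1))

def bias(shape):
    return tf.Variable(tf.constant(0.1, shape=shape))

def conv_pool(x, w_shape):
    # conv (stride 1, SAME padding) -> ReLU -> 2x2 max pool, halving the spatial size
    W, b = weight(w_shape), bias([w_shape[-1]])
    h = tf.nn.relu(tf.nn.conv2d(x, W, strides=[1, 1, 1, 1], padding='SAME') + b)
    return tf.nn.max_pool(h, ksize=[1, 2, 2, 1], strides=[1, 2, 2, 1], padding='SAME')

edge = 64                                          # 32 works, 64 produces NaN
xs = tf.placeholder(tf.float32, [None, edge, edge, 1])
ys = tf.placeholder(tf.float32, [None, 10])        # 10 classes assumed
keep_prob = tf.placeholder(tf.float32)

h = conv_pool(xs, [5, 5, 1, 32])                   # edge/2
h = conv_pool(h, [5, 5, 32, 64])                   # edge/4
h = conv_pool(h, [5, 5, 64, 128])                  # edge/8
flat_dim = (edge // 8) * (edge // 8) * 128         # note: 4x larger when edge doubles
h_flat = tf.reshape(h, [-1, flat_dim])
W_fc1, b_fc1 = weight([flat_dim, 1024]), bias([1024])
h_fc1_drop = tf.nn.dropout(tf.nn.relu(tf.matmul(h_flat, W_fc1) + b_fc1), keep_prob)
W_fc2, b_fc2 = weight([1024, 10]), bias([10])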

The cross-entropy is calculated like this:

prediction = tf.nn.softmax(tf.matmul(h_fc1_drop, W_fc2) + b_fc2)
cross_entropy = tf.reduce_mean(-tf.reduce_sum(ys * tf.log(prediction), reduction_indices=[1]))       # loss
train_step = tf.train.AdamOptimizer(1e-4).minimize(cross_entropy)
train_pred = tf.equal(tf.argmax(prediction, 1), tf.argmax(ys, 1))
train_accuracy = tf.reduce_mean(tf.cast(train_pred, tf.float32))

For 64px I get:

train_accuracy=0.09000000357627869, cross_entropy=nan, test_accuracy=0.1428571492433548
train_accuracy=0.2800000011920929, cross_entropy=nan, test_accuracy=0.1428571492433548
train_accuracy=0.27000001072883606, cross_entropy=nan, test_accuracy=0.1428571492433548

For 32px it looks fine and training makes progress:

train_accuracy=0.07999999821186066, cross_entropy=20.63970184326172, test_accuracy=0.15000000596046448
train_accuracy=0.18000000715255737, cross_entropy=15.00744342803955, test_accuracy=0.1428571492433548
train_accuracy=0.18000000715255737, cross_entropy=12.469900131225586, test_accuracy=0.13571429252624512
train_accuracy=0.23000000417232513, cross_entropy=10.289153099060059, test_accuracy=0.11428571492433548

1 Answer


As far as I know, NaN happens when you calculate log(0). I had the same problem.

tf.log(prediction) #This is a problem when the predicted value is 0.

You can avoid this by adding a small constant to the prediction (related 1, related 2).

tf.log(prediction + 1e-10)

Or use TensorFlow's clip_by_value function, which clamps the passed tensor between a minimum and a maximum value. Something like this (Documentation):

tf.log(tf.clip_by_value(prediction, 1e-10,1.0))
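
Dropped into the loss from your question (just a sketch reusing your variable names), that would look like:

# Same loss as before, but the prediction is clipped away from 0 so tf.log
# never sees an exact zero; 1e-10 is an arbitrary small floor.
cross_entropy = tf.reduce_mean(
    -tf.reduce_sum(ys * tf.log(tf.clip_by_value(prediction, 1e-10, 1.0)),
                   reduction_indices=[1]))
train_step = tf.train.AdamOptimizer(1e-4).minimize(cross_entropy)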

Hope it helps.
