I want to use TensorFlow's built-in cross-entropy function. However, its documentation says:
"Do not call this op with the output of softmax, as it will produce incorrect results."
https://www.tensorflow.org/api_docs/python/tf/nn/softmax_cross_entropy_with_logits
As is often done, I am using a softmax activation in my last output layer:
result = tf.layers.dense(inputs=dropout, units=classes_num, activation=tf.nn.softmax)
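For context, this is how I then compute the loss (simplified; labels is the one-hot ground-truth tensor from my input pipeline):

# the loss receives the softmax output here, which seems to be exactly what the docs warn against
loss = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(labels=labels, logits=result))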
Is it therefore incorrect to use this function with this layer, or is the documentation wrong? I don't understand this and would be grateful for a short explanation. (And if it is incorrect, which TensorFlow cost function would be the right one for a softmax output layer?)
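For reference, my current guess is that the intended usage is to leave the last layer linear and let the loss apply the softmax internally, roughly like this:

# last layer stays linear and produces raw logits
logits = tf.layers.dense(inputs=dropout, units=classes_num)
# softmax_cross_entropy_with_logits applies softmax itself, so it receives the raw logits
loss = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(labels=labels, logits=logits))
# apply softmax only where I actually need probabilities, e.g. for predictions
predictions = tf.nn.softmax(logits)

Is that the right reading of the documentation?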