
I've implemented a CNN for image classification using some tutorials I found online. I came across this softmax function, and I didn't understand it:

score = tf.nn.softmax(predictions[0])

When I use it, I get values whose meaning I don't understand.

Can anyone explain what this function does and what it's used for, please?

  • Does this answer your question? [What are logits? What is the difference between softmax and softmax\_cross\_entropy\_with\_logits?](https://stackoverflow.com/questions/34240703/what-are-logits-what-is-the-difference-between-softmax-and-softmax-cross-entrop) – yudhiesh Jul 08 '21 at 14:32

1 Answer


A softmax function maps your logits to probabilities and is typically used in multi-class classification problems; the probabilities sum to 1.

The function is computed like this (taken from the official TensorFlow docs):

softmax = tf.exp(logits) / tf.reduce_sum(tf.exp(logits), axis)

e.g. logits [1., 0., 1.] -> probabilities [0.42, 0.16, 0.42]
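As a quick sanity check, here is a small runnable sketch (the logits are made up purely for illustration) showing that this formula matches tf.nn.softmax and that the outputs sum to 1:

import tensorflow as tf

logits = tf.constant([1., 0., 1.])                        # example logits, chosen only for illustration
manual = tf.exp(logits) / tf.reduce_sum(tf.exp(logits))   # the formula above
builtin = tf.nn.softmax(logits)                           # TensorFlow's built-in version
print(manual.numpy())                    # ~[0.4223, 0.1554, 0.4223]
print(builtin.numpy())                   # same values
print(tf.reduce_sum(builtin).numpy())    # 1.0

For a batch of logits you would normalize along the class axis (e.g. tf.reduce_sum(..., axis=-1, keepdims=True)), which is exactly what tf.nn.softmax does for you by default.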

Typical usage:

import tensorflow as tf

logits = model(data)                  # forward pass (prediction); `model` and `data` come from your own code
predictions = tf.nn.softmax(logits)   # convert the logits into per-class probabilities
print(tf.reduce_max(predictions))     # highest probability, i.e. the confidence of the top prediction
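If you also want to read off which class those probabilities point to (like the `score` in your question), here is a minimal, self-contained sketch; the dummy values are just for illustration, and in practice `predictions` would be the softmax output from the block above:

import tensorflow as tf

# dummy softmax output for one image over 3 classes, just for illustration;
# in practice this would be tf.nn.softmax(model(data)) as shown above
predictions = tf.constant([[0.1, 0.7, 0.2]])

score = predictions[0]                     # probabilities for the first image in the batch
class_id = int(tf.argmax(score))           # index of the most likely class (1 here)
confidence = float(tf.reduce_max(score))   # probability assigned to that class (0.7 here)
print(f"predicted class {class_id} with probability {confidence:.2%}")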
Edwin Cheong