
I hardly know a thing about floating point issues in TensorFlow, but I need to calculate the entropy of my network's output (which is a logit). The best thing I could come up with is

import tensorflow as tf

# Cross entropy of softmax(out_net) with itself is exactly the entropy:
# H(p) = -sum_i p_i * log(p_i), with p = softmax(out_net)
entropy = tf.nn.softmax_cross_entropy_with_logits_v2(
    labels=tf.nn.softmax(out_net),
    logits=out_net,
)

I wonder if there is any way to completely avoid using tf.nn.softmax, because I believe it may cause floating point issues.
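One alternative I have seen is to stay in log-space with tf.nn.log_softmax, which applies the log-sum-exp trick internally so the large logits are never exponentiated directly. A minimal sketch of what I mean (assuming out_net is a [batch, num_classes] logit tensor):

import tensorflow as tf

# Entropy computed directly from logits in log-space:
# log_p = out_net - logsumexp(out_net), evaluated stably by log_softmax
log_p = tf.nn.log_softmax(out_net, axis=-1)
entropy = -tf.reduce_sum(tf.exp(log_p) * log_p, axis=-1)  # H = -sum p * log p

Is this actually any better numerically, or am I just moving the problem around?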

What would be the best way? What formulas do you guys use?

le4m
  • See https://stackoverflow.com/questions/44674847/cross-entropy-jungle/44684178#44684178 – Sharky Apr 11 '19 at 13:17
  • You can use a custom method which takes in logits and labels and returns the entropy. In this method you can use `tf` methods to calculate the softmax (by its mathematical formula) and then calculate the entropy; see the sketch after this list. – Shubham Panchal Apr 12 '19 at 03:02
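
A sketch of the kind of custom method that comment describes, assuming out_net is a [batch, num_classes] logit tensor (entropy_from_logits is a made-up name for illustration, not a TensorFlow API):

import tensorflow as tf

def entropy_from_logits(logits):
    # Softmax written out by its mathematical formula, with the row max
    # subtracted first so tf.exp never overflows; entropy taken in log-space.
    z = logits - tf.reduce_max(logits, axis=-1, keepdims=True)
    log_p = z - tf.math.log(tf.reduce_sum(tf.exp(z), axis=-1, keepdims=True))
    return -tf.reduce_sum(tf.exp(log_p) * log_p, axis=-1)

entropy = entropy_from_logits(out_net)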

0 Answers