I hardly know anything about floating-point issues in TensorFlow, but I need to calculate the entropy of my network output (which is a logit tensor). The best thing I could come up with is
entropy = tf.nn.softmax_cross_entropy_with_logits_v2(
    labels=tf.nn.softmax(out_net),
    logits=out_net
)
As far as I can tell this works, because the cross-entropy of softmax(out_net) with itself is exactly the entropy -sum_i p_i * log(p_i). Still, I wonder if there is any way to avoid the explicit tf.nn.softmax entirely, because I believe it may cause floating-point issues (e.g. overflow of exp for large logits).
What would be the best way? What formulas do you guys use?
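
For context, here is a minimal sketch of the alternative I've been considering (entropy_from_logits is just a name I made up, and I'm assuming out_net is a [batch, num_classes] float tensor). It goes through tf.nn.log_softmax, which applies the log-sum-exp trick internally, so there is no separate softmax-then-log round trip:

import tensorflow as tf

def entropy_from_logits(logits):
    # log_softmax is computed with the log-sum-exp trick,
    # so it stays stable even for very large/small logits
    log_p = tf.nn.log_softmax(logits, axis=-1)
    # recover the probabilities from their logs
    p = tf.exp(log_p)
    # H(p) = -sum_i p_i * log(p_i), one entropy value per row
    return -tf.reduce_sum(p * log_p, axis=-1)

Is something like this actually more stable than the cross-entropy trick above, or does softmax_cross_entropy_with_logits_v2 already handle this internally?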