I'm trying to create a Bayesian neural network using the DenseVariational layer in TensorFlow Probability. My question: when we set the parameter kl_use_exact to False, does that mean the prior function is not taken into consideration? I looked at the source code of the DenseVariational class (the _make_kl_divergence_penalty function) and I'm more confused than before; I didn't understand the use of kl_use_exact.
- `kl_use_exact` specifies the calculation method of the KL divergence between the variational posterior and the prior. `False`: approximate // `True`: exact value. – Frightera Jun 22 '22 at 22:46
- Thanks. With False, KL will be approximated by Monte Carlo sampling. Your comment should be an answer. – Lah Cen Jun 23 '22 at 00:17
- Added it with some details. – Frightera Jun 23 '22 at 07:42
1 Answer
`kl_use_exact` specifies the calculation method of the KL divergence between the variational posterior and the prior.

`False`: the KL divergence is approximated (by Monte Carlo sampling from the posterior).

`True`: uses the exact KL value. However, it can be set to `True` if and only if the KL divergence between the two distributions is registered in TensorFlow Probability (i.e. a closed-form expression is available via `tfp.distributions.kl_divergence`).
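To make the difference concrete, here is a small sketch of what the `False` path does conceptually, in plain NumPy rather than TFP internals: KL(q‖p) is estimated by drawing samples from the posterior q and averaging log q(x) − log p(x), then compared with the closed-form Gaussian KL that the exact path would use. The function names here are illustrative, not part of TFP's API.

```python
import numpy as np

def gaussian_logpdf(x, mu, sigma):
    """Log density of a univariate Gaussian N(mu, sigma^2)."""
    return -0.5 * np.log(2 * np.pi * sigma**2) - (x - mu)**2 / (2 * sigma**2)

def kl_exact(mu_q, s_q, mu_p, s_p):
    """Closed-form KL(q || p) for two univariate Gaussians (the 'exact' path)."""
    return np.log(s_p / s_q) + (s_q**2 + (mu_q - mu_p)**2) / (2 * s_p**2) - 0.5

def kl_monte_carlo(mu_q, s_q, mu_p, s_p, n=100_000, seed=0):
    """Monte Carlo estimate of KL(q || p): sample from q, average log q - log p."""
    rng = np.random.default_rng(seed)
    x = rng.normal(mu_q, s_q, size=n)  # samples from the posterior q
    return np.mean(gaussian_logpdf(x, mu_q, s_q) - gaussian_logpdf(x, mu_p, s_p))

# Posterior q = N(0.5, 1.2^2), prior p = N(0, 1)
exact = kl_exact(0.5, 1.2, 0.0, 1.0)
approx = kl_monte_carlo(0.5, 1.2, 0.0, 1.0)
# With enough samples the two values agree closely; the MC estimate is noisy
# but unbiased, which is why kl_use_exact=False still "uses" the prior.
```

Note that in both cases the prior very much enters the computation; `kl_use_exact` only decides whether the KL term is computed analytically or estimated by sampling.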

– Frightera