
I am trying to create a Bayesian neural network using the DenseVariational layer in TensorFlow Probability. My question: when we set the parameter kl_use_exact to False, does that mean the prior function is not taken into consideration? I looked at the source code of the DenseVariational class (the _make_kl_divergence_penalty function) and I am more confused than before; I did not understand the purpose of kl_use_exact.

Lah Cen

1 Answer


kl_use_exact specifies how the KL divergence between the variational posterior and the prior is computed. The prior is used in both cases; only the computation method changes.

False: the KL divergence is approximated by Monte Carlo sampling from the posterior.

True: the exact (analytic) KL value is used. However, it can be set to True only if the KL divergence between the two distribution types is registered in TensorFlow Probability; otherwise an error is raised.
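The exact-versus-approximate distinction can be illustrated without TFP. Below is a minimal sketch (pure Python, my own helper names) comparing the closed-form KL divergence between two univariate normals with a Monte Carlo estimate of the same quantity, which is conceptually what happens when kl_use_exact is True versus False:

```python
import math
import random

def normal_logpdf(x, mu, sigma):
    # Log-density of a univariate normal distribution
    return -math.log(sigma * math.sqrt(2 * math.pi)) - (x - mu) ** 2 / (2 * sigma ** 2)

def kl_exact(mu_q, s_q, mu_p, s_p):
    # Closed-form KL(q || p) for two univariate normals
    return math.log(s_p / s_q) + (s_q ** 2 + (mu_q - mu_p) ** 2) / (2 * s_p ** 2) - 0.5

def kl_monte_carlo(mu_q, s_q, mu_p, s_p, n=100_000, seed=0):
    # Approximate KL(q || p) as the average of log q(x) - log p(x)
    # over samples x drawn from q (what the approximation does in spirit)
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(mu_q, s_q)
        total += normal_logpdf(x, mu_q, s_q) - normal_logpdf(x, mu_p, s_p)
    return total / n

exact = kl_exact(0.5, 1.0, 0.0, 1.0)        # analytic value: 0.125
approx = kl_monte_carlo(0.5, 1.0, 0.0, 1.0)  # close to 0.125, but noisy
```

Note that in both cases the prior enters the calculation; setting kl_use_exact=False only trades the analytic formula for a sampled estimate, which works even when no closed-form KL is registered for the posterior/prior pair.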

Frightera