My question is how to use the PyMC3 package to build conditional probability models.
I have observed data a_observed, b_observed, c_observed, and I want to find the relations between them. I suspect that a, b, and c are all normally distributed, that b depends on a, and that c depends on a and b. I need to find the parameters of these distributions.
So far I have:
import pymc3 as pm

with pm.Model() as model:
    # priors on the means and standard deviations of a, b, c
    muA = pm.Uniform('muA', lower=0, upper=24)
    muB = pm.Uniform('muB', lower=0, upper=24)
    muC = pm.Uniform('muC', lower=0, upper=24)
    sigmaA = pm.Uniform('sigmaA', lower=0, upper=1000)
    sigmaB = pm.Uniform('sigmaB', lower=0, upper=1000)
    sigmaC = pm.Uniform('sigmaC', lower=0, upper=1000)

    # likelihoods -- at the moment a, b, c are modeled independently
    distributionA = pm.Normal('a', mu=muA, sd=sigmaA, observed=a_observed)
    distributionB = pm.Normal('b', mu=muB, sd=sigmaB, observed=b_observed)
    distributionC = pm.Normal('c', mu=muC, sd=sigmaC, observed=c_observed)

    start = pm.find_MAP()
    step = pm.Slice()
Now I want a to stand on its own, b to be conditioned on a (b | a), and c to be conditioned on a and b (c | a, b). What is the best way to express this in PyMC3? I've seen the lambda-function approach here, http://healthyalgorithms.com/2011/11/23/causal-modeling-in-python-bayesian-networks-in-pymc/, but that approach specifies the conditional probabilities directly. Is something along the lines of the sketch below what I should be doing?
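To make the question concrete, here is a minimal sketch of what I think I am after, assuming (purely for illustration) that the dependence enters linearly through the means; the coefficient names alphaB, betaBA, alphaC, betaCA, betaCB are placeholders I made up:

import pymc3 as pm

with pm.Model() as model:
    # a is modeled on its own, as before
    muA = pm.Uniform('muA', lower=0, upper=24)
    sigmaA = pm.Uniform('sigmaA', lower=0, upper=1000)
    a = pm.Normal('a', mu=muA, sd=sigmaA, observed=a_observed)

    # b | a: the mean of b is a (hypothetical) linear function of a
    alphaB = pm.Normal('alphaB', mu=0, sd=10)
    betaBA = pm.Normal('betaBA', mu=0, sd=10)
    sigmaB = pm.Uniform('sigmaB', lower=0, upper=1000)
    b = pm.Normal('b', mu=alphaB + betaBA * a_observed,
                  sd=sigmaB, observed=b_observed)

    # c | a, b: the mean of c depends linearly on both a and b
    alphaC = pm.Normal('alphaC', mu=0, sd=10)
    betaCA = pm.Normal('betaCA', mu=0, sd=10)
    betaCB = pm.Normal('betaCB', mu=0, sd=10)
    sigmaC = pm.Uniform('sigmaC', lower=0, upper=1000)
    c = pm.Normal('c', mu=alphaC + betaCA * a_observed + betaCB * b_observed,
                  sd=sigmaC, observed=c_observed)

    trace = pm.sample(2000)

Is expressing b | a and c | a, b through their means like this the idiomatic way, or does PyMC3 offer something closer to a Bayesian-network-style specification?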
Also, I would like to know how easy it is to extend the model to more than three variables with more complicated dependencies. Thanks!