In this example from PyMC3 (https://docs.pymc.io/notebooks/GLM-hierarchical.html), the posterior distribution is estimated from one-dimensional samples (each sample in the dataset has only one feature):
# Hyperpriors for group nodes
mu_a = pm.Normal('mu_a', mu=0., sigma=100)
sigma_a = pm.HalfNormal('sigma_a', 5.)
mu_b = pm.Normal('mu_b', mu=0., sigma=100)
sigma_b = pm.HalfNormal('sigma_b', 5.)
# Intercept for each county, distributed around group mean mu_a
# Above we just set mu and sd to a fixed value while here we
# plug in a common group distribution for all a and b (which are
# vectors of length n_counties).
a = pm.Normal('a', mu=mu_a, sigma=sigma_a, shape=n_counties)
# Slope for each county, distributed around group mean mu_b
b = pm.Normal('b', mu=mu_b, sigma=sigma_b, shape=n_counties)
# Model error
eps = pm.HalfCauchy('eps', 5.)
radon_est = a[county_idx] + b[county_idx]*data.floor.values
# Data likelihood
radon_like = pm.Normal('radon_like', mu=radon_est,
sigma=eps, observed=data.log_radon)
How can I generalize this when each sample in my dataset has an N-dimensional feature vector? I've tried to implement the inner product with
radon_est = a[county_idx] + np.dot(b[county_idx], data)
but it doesn't work.
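For reference, here is a sketch of the shape logic in plain NumPy (the names `n_features`, `X`, and the random data are illustrative, not from the original model). With N features, `b` would need shape `(n_counties, n_features)`, so `b[county_idx]` has shape `(n_samples, n_features)`; `np.dot` then tries a matrix product against `data` rather than a per-sample inner product. Multiplying elementwise and summing over the feature axis gives one estimate per sample:

```python
import numpy as np

n_counties, n_features, n_samples = 3, 4, 10
rng = np.random.default_rng(0)

# Per-county parameters: one intercept and one slope vector per county
a = rng.normal(size=n_counties)                     # shape (n_counties,)
b = rng.normal(size=(n_counties, n_features))       # shape (n_counties, n_features)

X = rng.normal(size=(n_samples, n_features))        # feature matrix
county_idx = rng.integers(0, n_counties, n_samples) # county of each sample

# Row-wise inner product: pair each sample with its county's slope vector,
# then sum over the feature axis to get one linear predictor per sample.
radon_est = a[county_idx] + (b[county_idx] * X).sum(axis=1)
print(radon_est.shape)  # (10,)
```

The same `(b[county_idx] * X).sum(axis=1)` expression should carry over to the PyMC3 model if `b` is declared with `shape=(n_counties, n_features)`, since the indexing and broadcasting semantics mirror NumPy's, though I have not verified this against a specific PyMC3 version.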