
I'm quite new to probabilistic programming and pymc3... Currently, I want to implement the Kennedy-O'Hagan framework in pymc3.

The setup is according to the paper of Kennedy and O'Hagan as follows:

We have n observations z_i of the form

z_i = f(x_i, theta) + g(x_i) + e_i,

where the x_i are known inputs, theta is a vector of unknown calibration parameters, and the e_i are i.i.d. error terms. We also have m model evaluations y_j of the form

y_j = f(x'_j, theta_j),

where both the x'_j (different from the x_i above) and the theta_j are known. The data therefore consist of all z_i and y_j. In the paper, Kennedy and O'Hagan model f and g using Gaussian processes:

f ~ GP{ m_1(.,.), Sigma_1[(.,.),(.,.)] }

g ~ GP{ m_2(.), Sigma_2[(.),(.)] }

Among other things, the goal is to get posterior samples for the unknown calibration parameters theta.
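
To make explicit what has to be implemented (this just restates the model above in joint form, taking zero means for brevity): stacking the m code outputs y and the n field observations z gives a single multivariate normal,

$$\begin{pmatrix} y \\ z \end{pmatrix} \sim \mathcal{N}\!\left(0,\; \begin{pmatrix} \Sigma_1(D_1, D_1) & \Sigma_1(D_1, D_2(\theta)) \\ \Sigma_1(D_2(\theta), D_1) & \Sigma_1(D_2(\theta), D_2(\theta)) + \Sigma_2(x, x) + \sigma^2 I \end{pmatrix}\right),$$

where D_1 = {(x'_j, theta_j)} are the code design points and D_2(theta) = {(x_i, theta)} are the field inputs with the unknown theta appended. Every MCMC step for theta has to rebuild the theta-dependent blocks of this covariance.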

What I've done so far:

import pymc3 as pm
import numpy as np
import theano
import theano.tensor as tt
import pyDOE
from multiprocessing import freeze_support
from scipy.stats.distributions import uniform

def physical_system(x):
  # "true" physical process that generates the field data
  return 0.65 * x / (1 + x / 5)

def observation(x):
  # noisy field observations of the physical system
  return physical_system(x) + np.random.normal(0, 0.01, len(x))

def computational_system(inputs):
  # computer model f(x, t): column 0 is x, column 1 is the calibration input t
  return inputs[:, 0] * inputs[:, 1]

if __name__ == "__main__":
  freeze_support()
  # field observations with noise
  x_obs = np.linspace(0, 4, 10)
  y_real = physical_system(x_obs)
  y_obs = observation(x_obs)

  # computer-model runs on a Latin hypercube design over (x, t)
  N = 60
  design = pyDOE.lhs(2, samples=N, criterion='center')

  # rescale the unit-cube design to the input ranges
  left = [-0.2, -0.2]; right = [4.2, 1.2]
  for i in range(2):
    design[:, i] = uniform(loc=left[i], scale=right[i] - left[i]).ppf(design[:, i])

  x_comp = design[:, 0][:, None]; t_comp = design[:, 1][:, None]
  input_comp = np.hstack((x_comp, t_comp))
  y_comp = computational_system(input_comp)

  x_obs_shared = theano.shared(x_obs[:, None])

  with pm.Model() as model:

    noise = pm.HalfCauchy('noise', beta=5)
    ls_1 = pm.Gamma('ls_1', alpha=1, beta=1, shape=2)
    cov = pm.gp.cov.ExpQuad(2, ls=ls_1)
    f = pm.gp.Marginal(cov_func=cov)
    # train the GP f on the data from the computer model
    f_0 = f.marginal_likelihood('f_0', X=input_comp, y=y_comp, noise=noise)
    trace = pm.sample(500, step=pm.Metropolis(), chains=4)
    burned_trace = trace[300:]  # discard burn-in

Up to here, everything is fine. My GP f is trained on the data from the computer model. Now I want to test whether I can fit this trained GP to my observed data:

  # gp f is now trained on the data from the computer model;
  # next, fit this trained gp to the observed data and find the posterior for theta

  with model:
    sd = pm.Gamma('eta', alpha=1, beta=1)
    theta = pm.Normal('theta', mu=0, sd=sd)
    sigma = pm.Gamma('sigma', alpha=1, beta=1)
    # observed x values with the unknown theta appended as the second column
    input_1 = tt.concatenate([x_obs_shared, theta * tt.ones((len(x_obs), 1))], axis=1)
    f_1 = f.conditional('f_1', Xnew=input_1, shape=(10,))
    y_ = pm.Normal('y_', mu=f_1, sd=sigma, observed=y_obs)
    step = pm.Metropolis()
    trace_ = pm.sample(30000, step, start=pm.find_MAP(), chains=4)

Is this formulation correct? I get very unstable results... The full formulation according to KOH should be something like this:

    with pm.Model() as model:
      theta = pm.Normal('theta', mu=0, sd=10)
      noise = pm.HalfCauchy('noise', beta=5)
      ls_1 = pm.Gamma('ls_1', alpha=1, beta=1, shape=2)
      cov = pm.gp.cov.ExpQuad(2, ls=ls_1)
      gp1 = pm.gp.Marginal(cov_func=cov)  # emulator f
      gp2 = pm.gp.Marginal(cov_func=cov)  # discrepancy g
      gp = gp1 + gp2
      input_1 = tt.concatenate([x_obs_shared, theta * tt.ones((len(x_obs), 1))], axis=1)
      # computer-model runs inform f alone
      f_0 = gp1.marginal_likelihood('f_0', X=input_comp, y=y_comp, noise=noise)
      # field observations inform the sum f + g
      f = gp.marginal_likelihood('f', X=input_1, y=y_obs, noise=noise)
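
One detail in this sketch that is easy to miss (my suggestion, not something from the KOH paper itself): gp1 and gp2 share the same cov object, so the emulator and the discrepancy are forced to have identical lengthscales, and the discrepancy also varies along the theta direction. A small variant, reusing the names above, gives gp2 its own covariance that is active only on the x column:

      ls_2 = pm.Gamma('ls_2', alpha=1, beta=1)
      # discrepancy kernel: the input is still 2-d, but only column 0 (x) is used
      cov_2 = pm.gp.cov.ExpQuad(2, ls=ls_2, active_dims=[0])
      gp2 = pm.gp.Marginal(cov_func=cov_2)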

Could somebody give me some advice on how to formulate the KOH framework properly with pymc3? I am desperate... I would appreciate any help. Thank you!

1 Answer


You might have found the solution already, but if not, this is a good one: Guidelines for the Bayesian calibration of building energy models.
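
To make that reference concrete, here is a minimal sketch of the joint KOH likelihood assembled by hand (my own sketch, not code from the linked guidelines; it reuses input_comp, y_comp, x_obs and y_obs from the question, takes zero means for both GPs, and all priors are placeholder choices). Both data sets are stacked and modelled by a single MvNormal whose covariance has the block structure of the paper, so theta enters only through the field-input block:

    import numpy as np
    import pymc3 as pm
    import theano.tensor as tt

    n_c, n_f = len(y_comp), len(y_obs)
    z_all = np.concatenate([y_comp, y_obs])  # code runs stacked on field data

    with pm.Model() as koh:
        theta = pm.Normal('theta', mu=0.5, sd=1.0)            # calibration input
        ls_f = pm.Gamma('ls_f', alpha=2.0, beta=1.0, shape=2)
        eta_f = pm.HalfNormal('eta_f', sd=1.0)
        ls_g = pm.Gamma('ls_g', alpha=2.0, beta=1.0)
        eta_g = pm.HalfNormal('eta_g', sd=1.0)
        sigma = pm.HalfNormal('sigma', sd=0.1)                # observation noise

        cov_f = eta_f**2 * pm.gp.cov.ExpQuad(2, ls=ls_f)      # emulator kernel on (x, t)
        cov_g = eta_g**2 * pm.gp.cov.ExpQuad(1, ls=ls_g)      # discrepancy kernel on x

        # field inputs with the unknown theta appended as the second column
        X_field = tt.concatenate([x_obs[:, None], theta * tt.ones((n_f, 1))], axis=1)

        # KOH block covariance over (code runs, field observations)
        K_cc = cov_f(input_comp) + 1e-6 * tt.eye(n_c)         # jitter: code runs are deterministic
        K_cf = cov_f(input_comp, X_field)
        K_ff = cov_f(X_field) + cov_g(x_obs[:, None]) + sigma**2 * tt.eye(n_f)

        K = tt.concatenate([tt.concatenate([K_cc, K_cf], axis=1),
                            tt.concatenate([K_cf.T, K_ff], axis=1)], axis=0)

        pm.MvNormal('z_like', mu=tt.zeros(n_c + n_f), cov=K, observed=z_all)
        trace = pm.sample(1000, tune=1000, chains=2)

Since everything here is continuous, the default NUTS sampler can be used instead of Metropolis, which tends to be more stable for models like this.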

Lefty