I have the following function
f(x) ∝ |x| exp(-|x|/2) + 1/(1 + (x - 40)^4),   x ∈ R
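For reference, here is that (unnormalized) density written out as an R function; the name f_unnorm is just my own helper, and no normalizing constant is included:
f_unnorm <- function(x) abs(x)*exp(-0.5*abs(x)) + 1/(1 + (x - 40)^4)   # unnormalized density, constant omitted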
I want to find E(X) and E(X^3) using Simpson's rule (numerical integration), the standard Monte Carlo approach, acceptance-rejection sampling, importance sampling, the Metropolis-Hastings algorithm, Gibbs sampling, and then a Bayesian model using MCMC (I have not decided on the last one yet).
How can I validate the results I obtain from these different methods? I have tried to derive E(X) analytically but could not find a closed form. The function can be split into two parts:
|x| times a double-exponential density, plus a second term that is the inverse of a fourth power of (x - 40). Because of the |x| and the range (-Inf, Inf), the integral always has to be split over (-Inf, 0] and [0, Inf). Using integration by parts I could express the first part in terms of |x| and x^2/2 over the infinite range, but the remaining integral cannot be found analytically.
So I used the following code to get a numerical integration result:
library(stats)
integrand <- function(x) {x*(abs(x)* exp(-0.5*abs(x))+(1/(1+(x-40)^4)))}
integrate(integrand, lower = -Inf, upper = Inf)
The result is E(X) = 88.85766 with absolute error < 0.004.
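As a quick consistency check on that number (not a different method, just the same integrand split at 0 as in the decomposition above):
left <- integrate(integrand, lower = -Inf, upper = 0)$value   # negative half-line
right <- integrate(integrand, lower = 0, upper = Inf)$value   # positive half-line, contains the bump near x = 40
left + right                                                  # compare with the result over (-Inf, Inf)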
The results I obtain from these methods are not similar. For instance:
(i) With Simpson's rule I got E(X) = 0.3222642 and E(X^3) = 677.0711.
simpson_v2 <- function(fun, a, b, n = 100) {
  # Numerical integration using the composite Simpson's rule.
  # Assumes a < b and n is an even positive integer.
  # Improper integrals are handled via the substitution x = (1 - t)/t (and its
  # mirror image), which maps the infinite range onto t in (0, 1].
  if (a == -Inf & b == Inf) {
    f <- function(t) (fun((1 - t)/t) + fun((t - 1)/t))/t^2
    s <- simpson_v2(f, 0, 1, n)
  } else if (a == -Inf & b != Inf) {
    f <- function(t) fun(b - (1 - t)/t)/t^2
    s <- simpson_v2(f, 0, 1, n)
  } else if (a != -Inf & b == Inf) {
    f <- function(t) fun(a + (1 - t)/t)/t^2
    s <- simpson_v2(f, 0, 1, n)
  } else {
    h <- (b - a)/n
    x <- seq(a, b, by = h)
    y <- fun(x)
    y[is.nan(y)] <- 0   # guard against NaN (e.g. at t = 0 after the substitution)
    # Simpson weights: 1 at the endpoints, 4 at the odd interior nodes, 2 at the even ones
    s <- y[1] + y[n + 1] + 4*sum(y[seq(2, n, by = 2)]) + 2*sum(y[seq(3, n - 1, by = 2)])
    s <- s*h/3
  }
  return(s)
}
EX <- function(x) x*(abs(x)*exp(-0.5*abs(x)) + 1/(1 + (x - 40)^4))      # integrand for E(X)
simpson_v2(EX, -Inf, Inf, n=100)
EX3 <- function(x) (x^3)*(abs(x)*exp(-0.5*abs(x)) + 1/(1 + (x - 40)^4)) # integrand for E(X^3)
simpson_v2(EX3, -Inf, Inf, n=100)
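To validate simpson_v2 itself (just a sketch of a check, not part of the method comparison), I run it on a finite interval with a known answer and feed the same integrand to integrate() over the same finite range:
simpson_v2(function(x) x^2, 0, 1, n = 100)    # exact answer is 1/3
simpson_v2(EX, 0, 100, n = 1000)              # finite range that still covers the bump near x = 40
integrate(EX, lower = 0, upper = 100)$value   # integrate() on the same range, for comparison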
(ii) Importance sampling. My proposal density is normal with mean 0 and standard deviation 4. A summary of the importance-sampling procedure I am applying is as follows.
Suppose I can't sample from f(x), which is true here since it has no well-known form and there is no built-in R function to sample from it. So I propose another distribution with log-concave tails, the normal with mean 0 and standard deviation 4, and instead of estimating E(X) directly I estimate the mean of x*f(x)/g(x), where g is that proposal density. I use the following code, which draws 100000 samples from the proposal:
X <- rnorm(1e5, sd=4)                                                # draws from the N(0, sd = 4) proposal
Y <- X*(abs(X)*exp(-0.5*abs(X)) + 1/(1 + (X - 40)^4))/dnorm(X, sd=4) # integrand divided by the proposal density
mean(Y)
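To see how much this estimate moves between runs, one simple addition (not in my original code) is to report a Monte Carlo standard error next to the mean:
se <- sd(Y)/sqrt(length(Y))             # Monte Carlo standard error of the importance-sampling estimate
c(estimate = mean(Y), std.error = se)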
Since this code relies on random sampling from the normal distribution, I get a different answer each time, but it is around -0.1710694, which is close to the 0.3222642 I got from Simpson's rule. However, both are very different from the E(X) = 88.85766 returned by integrate(). Note that integrate() uses adaptive quadrature. Is that method fundamentally different from Simpson's rule and importance sampling? How closely should I expect the results from these methods to agree?