We can implement this algorithm with the following Julia code:
function mh_gamma(T = 100, alpha = 1.5)
    a = Int(floor(alpha))   # integer part of alpha: candidate shape
    b = a / alpha           # candidate rate
    x = ones(T + 1)         # initial value: 1
    for t = 1:T
        yt = rgamma_int(a, b)
        rt = (yt / x[t] * exp((x[t] - yt) / alpha))^(alpha - a)
        if rt >= 1
            x[t+1] = yt
        else
            u = rand()
            if u < rt
                x[t+1] = yt
            else
                x[t+1] = x[t]
            end
        end
    end
    return x
end
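Here rgamma_int(a, b), which draws a Ga(a, b) variable for integer shape a, is assumed to be defined earlier in these notes; a minimal sketch would sum a independent exponential draws:

```julia
# Sketch of the assumed helper: a Ga(a, b) draw for integer shape a is the
# sum of a independent Exponential(b) variables, each -log(U)/b with
# U ~ Uniform(0, 1).
function rgamma_int(a::Int, b)
    return sum(-log(rand()) / b for _ in 1:a)
end
```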
We sample Ga(2.43, 1) and estimate E_f(X²) = 2.43 + 2.43² ≈ 8.33.
# comparison with accept-reject
res = mh_gamma(5000, 2.43)[2:end]
est = cumsum(res.^2) ./ collect(1:5000)
res2 = ones(5000)
for i = 1:5000
    res2[i] = rgamma(2.43, 1)
end
est2 = cumsum(res2.^2) ./ collect(1:5000)
using Plots
plot(est, label="Independent MH")
plot!(est2, label="Accept-Reject")
hline!([8.33], label="True value")
Logistic Regression
We observe (x_i, y_i), i = 1, …, n, according to the logistic model P(y_i = 1 | x_i) = exp(x_iᵀβ) / (1 + exp(x_iᵀβ)).
Then we use these cutpoints to estimate the tail probabilities from samples produced by the independent MH algorithm.
Interval probability (exact value computed in R as 1 - pchisq(x*100, 6*100, 9*2*100)) versus the IMH estimate:

Exact    IMH
0.10     0.1045
0.05     0.0516
0.01     0.0093
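The exact column, written above as a call to R's pchisq, can also be computed in Julia; a sketch assuming the Distributions.jl package is available:

```julia
# Tail probability of a noncentral chi-squared with 600 degrees of freedom
# and noncentrality 1800, matching R's 1 - pchisq(x*100, 6*100, 9*2*100).
using Distributions

exact_tail(x) = ccdf(NoncentralChisq(6 * 100, 9 * 2 * 100), x * 100)
```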
Whenever the target and candidate satisfy f(x) ≤ M g(x) for all x, the algorithm produces a uniformly ergodic chain, and the expected acceptance probability associated with the algorithm is at least 1/M when the chain is stationary; in that sense, the IMH is more efficient than the Accept-Reject algorithm.
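The acceptance probability can be checked empirically by counting accepted proposals; a self-contained sketch for the Ga(α, 1) target with candidate Ga(⌊α⌋, ⌊α⌋/α):

```julia
# Empirical acceptance rate of the independent MH sampler for Ga(alpha, 1).
# The candidate Ga(a, b), with a = floor(alpha) and b = a/alpha, is drawn
# here as a sum of a independent Exponential(b) variables.
function imh_accept_rate(T, alpha)
    a = Int(floor(alpha))
    b = a / alpha
    x = 1.0
    accepted = 0
    for _ in 1:T
        y = sum(-log(rand()) / b for _ in 1:a)
        r = (y / x * exp((x - y) / alpha))^(alpha - a)
        if rand() < r       # accept with probability min(1, r)
            x = y
            accepted += 1
        end
    end
    return accepted / T
end
```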
Let's illustrate this algorithm with Ga(α, 1). We have introduced how to sample from the Gamma distribution via the Accept-Reject algorithm earlier, and it is straightforward to build a Gamma Metropolis-Hastings sampler from the ratio f/g. Taking the candidate g as Ga(a, a/α), where a = ⌊α⌋ is the integer part of α, the acceptance ratio simplifies to

ρ(x, y) = min{1, ((y/x) e^{(x−y)/α})^{α−a}}.
We can use MLE to estimate the coefficients of the logistic model, and the following Julia code can help us fit the model quickly.
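One standard way to compute this MLE is Newton-Raphson (iteratively reweighted least squares); the sketch below, with a hypothetical helper name, illustrates the idea and is not necessarily the code referred to above:

```julia
# Sketch: logistic-regression MLE via Newton-Raphson (IRLS).
# X is an n×p design matrix (include a column of ones for the intercept),
# y is a 0/1 response vector.
using LinearAlgebra

function logit_mle(X, y; iters = 25)
    beta = zeros(size(X, 2))
    for _ in 1:iters
        p = 1 ./ (1 .+ exp.(-X * beta))   # fitted probabilities
        W = Diagonal(p .* (1 .- p))       # IRLS weights
        beta += (X' * W * X) \ (X' * (y .- p))  # Newton step
    end
    return beta
end
```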
If K(τ) = log(E exp(τX)) is the cumulant generating function, solving the saddlepoint equation K′(τ) = x yields the saddlepoint. For a noncentral chi-squared random variable X, the moment generating function is