STATS230 – Homework 4 Solved

1.    Let X ∼ beta(4,2). Compute the second moment of X, E(X²), using Monte Carlo (not MCMC), with 1,000 realizations from this distribution. Compute the Monte Carlo error and a 95% confidence interval for your approximation.
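
A minimal Monte Carlo sketch in Python for this problem; numpy, the seed, and the variable names are illustrative choices, not requirements of the assignment:

```python
import numpy as np

rng = np.random.default_rng(seed=1)    # seed chosen arbitrarily for reproducibility

n = 1_000
x = rng.beta(4, 2, size=n)             # 1,000 realizations of X ~ beta(4, 2)
g = x**2                               # evaluate X^2 at each draw

estimate = g.mean()                    # Monte Carlo estimate of E(X^2)
mc_error = g.std(ddof=1) / np.sqrt(n)  # Monte Carlo standard error
ci = (estimate - 1.96 * mc_error,      # approximate 95% confidence interval
      estimate + 1.96 * mc_error)

print(estimate, mc_error, ci)          # exact value is 4*5/(6*7) = 10/21 ≈ 0.476 for comparison
```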

2.    Consider a Markov chain on a countable state space E that, given its current state i, proceeds by randomly drawing another state j with proposal probability q_ij and then accepting/rejecting this proposed state with probability

a_ij = π_j q_ji / (π_i q_ij + π_j q_ji),                                                     (1)

where π = (π_1, π_2, ...) is a probability mass function on E. This MCMC construction is called Barker’s algorithm. Notice that 0 ≤ a_ij ≤ 1 by definition. (A one-step code sketch of this accept/reject rule appears after part (b).)

(a)     What are the transition probabilities of this Markov chain, p_ij, for i ≠ j?

(b)    Show that π is a stationary distribution of Barker’s Markov chain.
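
As a hedged illustration of the mechanism described above, here is one Barker step on a small finite state space; the three-state example, the target pmf pi, and the proposal matrix Q are made-up placeholders, not part of the assignment:

```python
import numpy as np

rng = np.random.default_rng(seed=2)

# Illustrative placeholders: a 3-state chain with an arbitrary target pmf and proposal matrix.
pi = np.array([0.2, 0.3, 0.5])           # target stationary distribution (assumed)
Q = np.array([[0.0, 0.5, 0.5],           # proposal probabilities q_ij (assumed)
              [0.5, 0.0, 0.5],
              [0.5, 0.5, 0.0]])

def barker_step(i):
    """Propose j from row i of Q, then accept with the Barker probability a_ij of equation (1)."""
    j = rng.choice(len(pi), p=Q[i])
    a_ij = pi[j] * Q[j, i] / (pi[i] * Q[i, j] + pi[j] * Q[j, i])
    return j if rng.random() < a_ij else i

# Run a short chain and compare the empirical state frequencies to pi.
state, visits = 0, np.zeros(len(pi))
for _ in range(50_000):
    state = barker_step(state)
    visits[state] += 1
print(visits / visits.sum())             # should be close to pi
```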

3.    Let’s return to the beta distribution example. Let X ∼ beta(4,2). Use a Metropolis-Hastings algorithm and a multiplicative log-normal proposal to approximate E(X²). More specifically, given x_cur, we generate x_prop = x_cur e^Z, where Z ∼ N(0, σ²). (A code sketch of this sampler appears after part (c).)

(a)     What is the corresponding proposal density? Recall that if ln Y ∼ N(µ, σ²), then Y has a log-normal distribution with density

f_Y(y) = 1 / (y σ √(2π)) · exp( −(ln y − µ)² / (2σ²) ),   y > 0.

(b)    What is the Metropolis-Hastings ratio corresponding to the above proposal?

(c)     In addition to the E(X²) approximation, provide a trace plot and a histogram of the samples.
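
A minimal sketch of the sampler described in this problem, assuming an arbitrary proposal scale σ = 0.5, chain length 10,000, and burn-in of 1,000; none of these values is prescribed by the assignment:

```python
import numpy as np

rng = np.random.default_rng(seed=3)

def log_target(x):
    """Unnormalized log-density of beta(4, 2), valid on (0, 1): 3*ln(x) + ln(1 - x)."""
    return 3 * np.log(x) + np.log1p(-x)

sigma = 0.5          # proposal scale (arbitrary illustration)
n_iter = 10_000      # chain length (arbitrary illustration)
x = 0.5              # starting value
samples = np.empty(n_iter)

for t in range(n_iter):
    x_prop = x * np.exp(rng.normal(0.0, sigma))   # multiplicative log-normal proposal
    if 0.0 < x_prop < 1.0:                        # proposals outside (0, 1) are rejected
        # log M-H ratio: target ratio plus the Hastings correction ln(x_prop / x),
        # which comes from the asymmetry of the log-normal proposal density.
        log_ratio = log_target(x_prop) - log_target(x) + np.log(x_prop) - np.log(x)
        if np.log(rng.random()) < log_ratio:
            x = x_prop
    samples[t] = x

print((samples[1_000:] ** 2).mean())   # approximate E(X^2) after discarding 1,000 burn-in draws

# The trace plot and histogram asked for in part (c) could be drawn with, e.g., matplotlib:
# import matplotlib.pyplot as plt
# plt.plot(samples); plt.figure(); plt.hist(samples, bins=40); plt.show()
```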
