Exercise 1
A bivariate Gibbs sampler for a vector x = (x1, x2) draws iteratively from the conditional distributions in the following way:
- choose a starting value x2^(0)
- for each iteration i:
  - draw x1^(i) from p(x1 | x2^(i-1))
  - draw x2^(i) from p(x2 | x1^(i))
As in other MCMC algorithms, samples from the first few iterations are usually discarded. This is known as the "warmup" phase. Suppose that the conditional distributions are
x1 | x2 ∼ N( μ1 + ρ (σ1/σ2)(x2 − μ2), (1 − ρ²) σ1² ),
x2 | x1 ∼ N( μ2 + ρ (σ2/σ1)(x1 − μ1), (1 − ρ²) σ2² ),
where ρ, μ1, μ2, σ1, σ2 are real-valued parameters and |ρ| < 1.
1. (code) Implement a Gibbs sampler which takes as inputs the number of warmup draws warmup, the number of iterations iters, and the parameters rho, mu1, mu2, sigma1, sigma2 (see the sketch after this exercise).
2. (code) Set ρ = 0.3, μ1 = 0.5, μ2 = 0.1, σ1 = 1, σ2 = 1.5 and plot the distributions of x1 and x2.
3. (theory) What is the joint distribution of x = (x1, x2)?
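
For reference, a minimal sketch of such a sampler, assuming NumPy, the conditional normals reconstructed above, an arbitrary starting point at (mu1, mu2), and a fixed seed:

```python
import numpy as np

def gibbs(warmup, iters, rho, mu1, mu2, sigma1, sigma2, seed=0):
    """Bivariate Gibbs sampler using the conditional normals given above."""
    rng = np.random.default_rng(seed)
    x1, x2 = mu1, mu2                      # arbitrary starting values
    draws = np.empty((iters, 2))
    for i in range(warmup + iters):
        # draw x1 | x2 from its conditional normal
        m1 = mu1 + rho * sigma1 / sigma2 * (x2 - mu2)
        x1 = rng.normal(m1, sigma1 * np.sqrt(1 - rho**2))
        # draw x2 | x1 from its conditional normal
        m2 = mu2 + rho * sigma2 / sigma1 * (x1 - mu1)
        x2 = rng.normal(m2, sigma2 * np.sqrt(1 - rho**2))
        if i >= warmup:                    # discard warmup draws
            draws[i - warmup] = (x1, x2)
    return draws

draws = gibbs(warmup=1000, iters=5000, rho=0.3, mu1=0.5, mu2=0.1,
              sigma1=1.0, sigma2=1.5)
# for question 2, e.g. sns.histplot(draws[:, 0]) and sns.histplot(draws[:, 1])
```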
Exercise 2
(code)
Let θ1 and θ2 be real-valued parameters of the function
f(x, θ1, θ2) = e^(θ1 x) / (x + θ2).
We observe some noisy realizations (x_obs, y_obs) of this function and want to infer the posterior distribution of θ1 and θ2.
1. Generate 30 observations (x_obs, y_obs) as follows:
θ1 = 0.5
θ2 = 3
ε ∼ N(0, 0.3²)
x_obs ∼ U(0, 10)
y_obs = f(x_obs, θ1, θ2) + ε
and plot them using the seaborn sns.scatterplot function (a sketch of this step follows the exercise).
2. Given the observations from (1.) and the generative model
θ1 ∼ Exp(2.)
θ2 ∼ Exp(0.2)
γ ∼ U(0, 0.5)
ŷ = f(x, θ1, θ2)
y ∼ N(ŷ, γ)
use pyro NUTS to infer the posterior distribution of θ1, θ2, γ. Set warmup_steps=2000 and num_samples=100 (a sketch of this step also follows the exercise).
3. Look at convergence checks on the traces and discuss the quality of your estimates. Remember that you can extract posterior samples from an MCMC object using the get_samples() method (notebook 04).
4. Use sns.scatterplot to plot the observed pairs (x_obs, y_obs) and sns.lineplot to plot the learned function f with the posterior estimates of θ1, θ2.
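
A minimal sketch of the data generation in question 1, assuming NumPy and seaborn and the function form e^(θ1 x)/(x + θ2) reconstructed above; the seed is an arbitrary choice:

```python
import numpy as np
import seaborn as sns

def f(x, theta1, theta2):
    # the function defined above (as reconstructed): e^(theta1*x) / (x + theta2)
    return np.exp(theta1 * x) / (x + theta2)

rng = np.random.default_rng(42)
theta1_true, theta2_true = 0.5, 3.0
x_obs = rng.uniform(0, 10, size=30)                              # x_obs ~ U(0, 10)
y_obs = f(x_obs, theta1_true, theta2_true) + rng.normal(0, 0.3, size=30)  # add N(0, 0.3^2) noise

sns.scatterplot(x=x_obs, y=y_obs)
```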
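For questions 2-4, a hedged sketch of a Pyro model and NUTS run, reusing x_obs, y_obs and f from the previous snippet; the posterior-mean point estimates used for the line plot are one illustrative summary, not the only valid choice:

```python
import torch
import pyro
import pyro.distributions as dist
from pyro.infer import MCMC, NUTS

x = torch.tensor(x_obs, dtype=torch.float32)
y = torch.tensor(y_obs, dtype=torch.float32)

def model(x, y=None):
    # priors from the generative model above
    theta1 = pyro.sample("theta1", dist.Exponential(2.0))
    theta2 = pyro.sample("theta2", dist.Exponential(0.2))
    gamma = pyro.sample("gamma", dist.Uniform(0.0, 0.5))
    y_hat = torch.exp(theta1 * x) / (x + theta2)     # f(x, theta1, theta2)
    with pyro.plate("data", x.shape[0]):
        pyro.sample("obs", dist.Normal(y_hat, gamma), obs=y)

mcmc = MCMC(NUTS(model), num_samples=100, warmup_steps=2000)
mcmc.run(x, y)
mcmc.summary()                      # means, credible intervals, n_eff, r_hat diagnostics
samples = mcmc.get_samples()        # dict of posterior draws for theta1, theta2, gamma

# posterior-mean point estimates for the line plot of question 4
theta1_hat = samples["theta1"].mean().item()
theta2_hat = samples["theta2"].mean().item()
xs = np.linspace(0, 10, 200)
sns.scatterplot(x=x_obs, y=y_obs)
sns.lineplot(x=xs, y=f(xs, theta1_hat, theta2_hat), color="red")
```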