PML-Homework 4 Solved

Exercise 1
A bivariate Gibbs sampler for a vector $x = (x_1, x_2)$ draws iteratively from the conditional distributions in the following way:

- choose a starting value $x_2^{(0)}$
- for each iteration $i$:
  - draw $x_1^{(i)}$ from $p(x_1 \mid x_2^{(i-1)})$
  - draw $x_2^{(i)}$ from $p(x_2 \mid x_1^{(i)})$

As in other MCMC algorithms, samples from the first few iterations are usually discarded. This is known as the "warmup" phase. Suppose that the conditional distributions are

π‘₯1|π‘₯2 ∼  

                                                                                                                                                                         2   ,

π‘₯2|π‘₯1 ∼ 

where 𝜌, πœ‡1, πœ‡2, 𝜎1, 𝜎2 are real valued parameters and |𝜌| < 1.

1.  (code) Implement a Gibbs sampler which takes as inputs the number of warmup draws (warmup), the number of iterations (iters), and the parameters rho, mu1, mu2, sigma1, sigma2.

2.  (code) Set $\rho = 0.3$, $\mu_1 = 0.5$, $\mu_2 = 0.1$, $\sigma_1 = 1$, $\sigma_2 = 1.5$ and plot the distributions of $x_1$ and $x_2$.

3.  (theory) What is the joint distribution of $x = (x_1, x_2)$?
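A minimal sketch of points 1 and 2, assuming the conditionals are the standard bivariate-normal ones (function and variable names here are illustrative, not prescribed by the exercise):

```python
import numpy as np

def gibbs_sampler(warmup, iters, rho, mu1, mu2, sigma1, sigma2, seed=0):
    """Bivariate Gibbs sampler, assuming bivariate-normal conditionals."""
    rng = np.random.default_rng(seed)
    x2 = mu2  # starting value for x2
    sd1 = np.sqrt(1.0 - rho**2) * sigma1  # conditional std of x1 | x2
    sd2 = np.sqrt(1.0 - rho**2) * sigma2  # conditional std of x2 | x1
    samples = np.empty((iters, 2))
    for i in range(warmup + iters):
        # draw x1 | x2, then x2 | x1 using the freshly drawn x1
        x1 = rng.normal(mu1 + rho * sigma1 / sigma2 * (x2 - mu2), sd1)
        x2 = rng.normal(mu2 + rho * sigma2 / sigma1 * (x1 - mu1), sd2)
        if i >= warmup:  # discard the warmup draws
            samples[i - warmup] = (x1, x2)
    return samples

samples = gibbs_sampler(warmup=1000, iters=5000,
                        rho=0.3, mu1=0.5, mu2=0.1, sigma1=1.0, sigma2=1.5)
```

The two marginals can then be plotted with, e.g., sns.kdeplot(samples[:, 0]) and sns.kdeplot(samples[:, 1]).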

Exercise 2
(code)

Let πœƒ1 and πœƒ2 be real valued parameters of the function

 π‘“(π‘₯, πœƒ1, πœƒ2) = π‘’πœƒ1π‘₯π‘₯+ πœƒ2 .

We observe some noisy realizations $(x_{\mathrm{obs}}, y_{\mathrm{obs}})$ of this function and want to infer the posterior distribution of $\theta_1$ and $\theta_2$.

1.  Generate 30 observations $(x_{\mathrm{obs}}, y_{\mathrm{obs}})$ as follows:

πœƒ1 = 0.5

πœƒ2 = 3

𝜌 ∼ (0, 0.32)

π‘₯π‘œπ‘π‘  ∼ (0, 10)

π‘¦π‘œπ‘π‘  = 𝑓(π‘₯π‘œπ‘π‘ , πœƒ1, πœƒ2) + 𝜌

and plot them using seaborn's sns.scatterplot function.
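A sketch of the data generation, assuming the reconstructed form $f(x, \theta_1, \theta_2) = e^{\theta_1 x}/(x + \theta_2)$ (the source equation was garbled, so this form is an assumption) and a fixed seed for reproducibility:

```python
import numpy as np

THETA1_TRUE, THETA2_TRUE = 0.5, 3.0

def f(x, theta1, theta2):
    # f as reconstructed above: exp(theta1 * x) / (x + theta2)
    return np.exp(theta1 * x) / (x + theta2)

rng = np.random.default_rng(42)
x_obs = rng.uniform(0.0, 10.0, size=30)    # x_obs ~ U(0, 10)
noise = rng.normal(0.0, 0.3, size=30)      # additive N(0, 0.3^2) noise
y_obs = f(x_obs, THETA1_TRUE, THETA2_TRUE) + noise
```

The scatter plot is then sns.scatterplot(x=x_obs, y=y_obs).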

2.  Given the observations from (1.) and the generative model

πœƒ1 ∼ Exp(2.)

πœƒ2 ∼ Exp(0.2)

𝛾 ∼ (0, 0.5)

𝑦̂ = 𝑓(π‘₯, πœƒ1, πœƒ2)

𝑦 ∼ (𝑦̂ , 𝛾)

use Pyro's NUTS kernel to infer the posterior distribution of $\theta_1$, $\theta_2$, $\gamma$. Set warmup_steps=2000 and num_samples=100.

3.  Look at convergence checks on the traces and discuss the quality of your estimates. Remember that you can extract posterior samples from an MCMC object using the get_samples() method (notebook 04).

4.  Use sns.scatterplot to plot the observed couples $(x_{\mathrm{obs}}, y_{\mathrm{obs}})$ and sns.lineplot to plot the learned function $f$ with the posterior estimates of $\theta_1$ and $\theta_2$.
