PGM - Assignment 1

Exercises for Probabilistic Graphical Models

Sheet No.1





1           Probabilities
Points: 8
In this exercise, you will prove some basic but very important rules of probability theory.

1.    For any two events E1 and E2, prove

                                               p(E1 ∪ E2) = p(E1) + p(E2) − p(E1 ∩ E2)                              (1)

What if E1 and E2 are two disjoint events?
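
As a quick numerical sanity check of Eq. (1) (not a substitute for the proof), the identity can be verified on a small finite sample space; the events and the uniform measure below are arbitrary choices for illustration.

```python
# Check p(E1 ∪ E2) = p(E1) + p(E2) − p(E1 ∩ E2) on a uniform space of 10 outcomes.
omega = set(range(10))
E1, E2 = {0, 1, 2, 3, 4}, {3, 4, 5, 6}

p = lambda A: len(A) / len(omega)   # probability under the uniform measure on omega

assert abs(p(E1 | E2) - (p(E1) + p(E2) - p(E1 & E2))) < 1e-12
# If E1 and E2 are disjoint, the intersection term is 0 and p(E1 ∪ E2) = p(E1) + p(E2).
```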

2.    (Bayes’ law) Given the Kolmogorov definition for conditional probabilities

                                               p(E1 | E2) = p(E1 ∩ E2) / p(E2),    p(E2) > 0                              (2)

derive Bayes’ law:

                                               p(E1 | E2) = p(E2 | E1) p(E1) / p(E2)                                      (3)
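
The same kind of finite-space check works for Eqs. (2) and (3); the events are again arbitrary.

```python
# Verify that the Kolmogorov definition and Bayes' law give the same conditional.
omega = set(range(10))
E1, E2 = {0, 1, 2, 3, 4}, {3, 4, 5, 6}
p = lambda A: len(A) / len(omega)

kolmogorov = p(E1 & E2) / p(E2)                    # Eq. (2)
bayes = (p(E1 & E2) / p(E1)) * p(E1) / p(E2)       # p(E2 | E1) p(E1) / p(E2), Eq. (3)
assert abs(kolmogorov - bayes) < 1e-12
```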

3.    (Law of total probability) Let E1, ..., En be mutually disjoint events in the probability space Ω such that Ω = E1 ∪ ... ∪ En. Then show that, for any event B in the same space Ω,

                                               p(B) = p(B | E1) p(E1) + ... + p(B | En) p(En)                             (4)
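
A small check of Eq. (4), using an arbitrary three-block partition of a finite Ω.

```python
# Law of total probability: p(B) = Σ_i p(B | Ei) p(Ei) for a partition {Ei} of Ω.
omega = set(range(12))
partition = [{0, 1, 2, 3}, {4, 5, 6, 7}, {8, 9, 10, 11}]   # disjoint, union is Ω
B = {2, 3, 4, 9}
p = lambda A: len(A) / len(omega)

total = sum((p(B & Ei) / p(Ei)) * p(Ei) for Ei in partition)
assert abs(total - p(B)) < 1e-12
```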

4.    (Linearity of expectation) For any finite collection of discrete random variables X1,...,Xn with finite expectations, show that

                                               E[X1 + ... + Xn] = E[X1] + ... + E[Xn]                                     (5)
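
The useful part of Eq. (5) is that it needs no independence assumption. A brief empirical illustration with strongly dependent variables (the particular variables are made up):

```python
import numpy as np

# Linearity of expectation holds even when the variables are dependent.
rng = np.random.default_rng(0)
x1 = rng.integers(1, 7, size=1_000_000)    # a fair die
x2 = 7 - x1                                # completely dependent on x1
x3 = rng.integers(0, 2, size=1_000_000)    # a fair coin

print(np.mean(x1 + x2 + x3))               # ≈ 7.5
print(x1.mean() + x2.mean() + x3.mean())   # same value: E[X1] + E[X2] + E[X3]
```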

5.    Let X, Y, Z be three disjoint subsets of random variables. We say X and Y are conditionally independent given Z if and only if

                                                   pX,Y|Z(x,y | z) = pX|Z(x | z) pY|Z(y | z)                                  (6)

Show that X and Y are conditionally independent given Z if and only if the joint distribution for the three subsets of random variables factors in the following form:

                                                            pX,Y,Z(x,y,z) = h(x,z)g(y,z)                                          (7)

(Be careful to prove both directions!)
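
For intuition on the "⇐" direction, the snippet below builds a joint of the form of Eq. (7) from two arbitrary positive tables h and g and checks that Eq. (6) holds; the table sizes and values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)
h = rng.random((3, 4))                     # h(x, z)
g = rng.random((5, 4))                     # g(y, z)

joint = np.einsum('xz,yz->xyz', h, g)      # p(x, y, z) ∝ h(x, z) g(y, z)
joint /= joint.sum()                       # normalize to a proper distribution

p_z = joint.sum(axis=(0, 1))               # p(z)
p_xy_z = joint / p_z                       # p(x, y | z)
p_x_z = joint.sum(axis=1) / p_z            # p(x | z)
p_y_z = joint.sum(axis=0) / p_z            # p(y | z)

# Maximum deviation from the conditional-independence factorization, Eq. (6):
print(np.abs(p_xy_z - np.einsum('xz,yz->xyz', p_x_z, p_y_z)).max())   # ≈ 0
```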

2           Complexity analysis
Points: 6
Consider the three random variables X, Y, Z, all of which are binary.

•    How many states do you need in general to fully specify the joint distribution p(x,y,z)?

•    How many states are needed if the distribution factorizes as p(x,y,z) = p(x | y)p(y | z)p(z)?

•    How many states do you need if the variables are not binary but can take values in {1,2,...,N}? Consider both previous cases (a counting sketch follows after these questions).

•    How many states do you need to specify a distribution over all 8-bit grayscale images of size 1000 × 1000 pixels? There are random variables x1,x2,...,x1M with xi ∈ {0,...,255} for i = 1,...,1M, where 1M = 10^6.

•    Do you have an idea of how to represent the distribution more compactly?

Provide the number of states needed by your method.
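
A rough counting sketch for the questions above, counting raw table entries (whether you report raw entries or free parameters after removing normalization constraints is up to you; the helper names are only illustrative):

```python
import math

def joint_entries(K, n_vars=3):
    return K ** n_vars                      # one entry per joint state (x, y, z)

def factored_entries(K):
    return K * K + K * K + K                # tables for p(x | y), p(y | z), p(z)

for K in (2, 10):                           # binary case and an example N = 10
    print(K, joint_entries(K), factored_entries(K))

# Joint over a 1000 × 1000 image of 8-bit pixels: 256**(10**6) states.
print(int(10**6 * math.log10(256)) + 1)     # number of decimal digits of that count
```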

3           Chest Clinic Network
[Figure: chest clinic network]

The chest clinic network above concerns the diagnosis of lung disease (tuberculosis, lung cancer, both, or neither). In this model a visit to Asia is assumed to increase the probability of tuberculosis. We have the following binary variables:

•    x: positive X-ray
•    d: Dyspnea (shortness of breath)
•    e: Either tuberculosis or lung cancer
•    t: Tuberculosis
•    l: Lung cancer
•    b: Bronchitis
•    a: Visited Asia
•    s: Smoker

1.    (Points: 1) Write down the factorization of the distribution implied by the graph.

2.    (Points: 4) Are the following independence statements implied by the graph? (And how do you conclude this?) A programmatic d-separation check is sketched after the list below.

(a)    tuberculosis ⊥⊥ smoking | shortness of breath

(b)    tuberculosis ⊥⊥ smoking | bronchitis

(c)    lung cancer ⊥⊥ bronchitis | smoking

(d)    visit to Asia ⊥⊥ smoking | lung cancer

(e)    visit to Asia ⊥⊥ smoking | lung cancer, shortness of breath
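
Since the figure is not reproduced here, the sketch below assumes the usual chest clinic structure a→t, s→l, s→b, t→e, l→e, e→x, e→d, b→d; under that assumption, each statement can be checked programmatically. It uses nx.d_separated, available in networkx ≥ 2.8 (newer releases expose the same test as nx.is_d_separator).

```python
import networkx as nx

# Assumed edges of the chest clinic network (check them against the figure).
G = nx.DiGraph([('a', 't'), ('s', 'l'), ('s', 'b'),
                ('t', 'e'), ('l', 'e'),
                ('e', 'x'), ('e', 'd'), ('b', 'd')])

# Statement (a): t ⊥⊥ s | d.  True means the independence is implied by the graph.
print(nx.d_separated(G, {'t'}, {'s'}, {'d'}))
# Statements (b)-(e) follow by swapping in the corresponding variable sets.
```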

3.    (Bonus Points: 3) Calculate by hand the values for p(d); an enumeration check in code is sketched after the table. The Conditional Probability Table (CPT) is:

p(a = 1) = 0.01,                 p(s = 1) = 0.5
p(t = 1 | a = 1) = 0.05,         p(t = 1 | a = 0) = 0.01
p(l = 1 | s = 1) = 0.1,          p(l = 1 | s = 0) = 0.01
p(b = 1 | s = 1) = 0.6,          p(b = 1 | s = 0) = 0.3
p(x = 1 | e = 1) = 0.98,         p(x = 1 | e = 0) = 0.05
p(d = 1 | e = 1, b = 1) = 0.9,   p(d = 1 | e = 1, b = 0) = 0.7
p(d = 1 | e = 0, b = 1) = 0.8,   p(d = 1 | e = 0, b = 0) = 0.1

and p(e = 1 | t, l) = 0 if t = 0 and l = 0, while p(e = 1 | t, l) = 1 otherwise, i.e. e is the deterministic OR of t and l.
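
To check a hand calculation of p(d), a brute-force enumeration over the factorization is enough, assuming the structure above and the deterministic-OR table for e reconstructed from the sheet:

```python
from itertools import product

# CPT values from the table above.
p_a1, p_s1 = 0.01, 0.5
p_t1 = {1: 0.05, 0: 0.01}                      # p(t = 1 | a)
p_l1 = {1: 0.1, 0: 0.01}                       # p(l = 1 | s)
p_b1 = {1: 0.6, 0: 0.3}                        # p(b = 1 | s)
p_d1 = {(1, 1): 0.9, (1, 0): 0.7,              # p(d = 1 | e, b)
        (0, 1): 0.8, (0, 0): 0.1}

def bern(p1, v):                               # p(v) for a binary variable with p(1) = p1
    return p1 if v == 1 else 1.0 - p1

p_d_is_1 = 0.0
for a, s, t, l, b in product((0, 1), repeat=5):
    e = int(t or l)                            # deterministic OR of t and l
    p_d_is_1 += (bern(p_a1, a) * bern(p_s1, s) * bern(p_t1[a], t)
                 * bern(p_l1[s], l) * bern(p_b1[s], b) * p_d1[(e, b)])

print(p_d_is_1)                                # p(d = 1); p(d = 0) = 1 − p(d = 1)
# The factor p(x | e) sums to 1 over x and therefore drops out.
```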
