CS1.502 Assignment 1 Solution


INSTRUCTIONS
• Discussions with other students are not discouraged. However, all write-ups must be done individually and must present your own solutions.
• Be clear and precise in your writing.

Problem 1. (a) Let X and Y be two discrete random variables jointly distributed according to PXY, and let g(X,Y) be a function of X and Y. Show that
E[g(X,Y)] = Σ_{x∈X, y∈Y} g(x,y) PXY(x,y).
(b) Use part (a) to prove the linearity of expectation, i.e., E[X + Y] = E[X] + E[Y].
(1 mark)
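A quick numerical sanity check of Problem 1 (a minimal sketch, not a proof): the joint PMF below is a made-up example over {0,1}×{0,1}, chosen only for illustration; any valid PXY behaves the same way.

from math import isclose

# Example joint PMF P_XY(x, y); the probabilities sum to 1.
P_XY = {(0, 0): 0.1, (0, 1): 0.4, (1, 0): 0.3, (1, 1): 0.2}

def expectation(g, pmf):
    # E[g(X, Y)] = sum over (x, y) of g(x, y) * P_XY(x, y), as in part (a).
    return sum(g(x, y) * p for (x, y), p in pmf.items())

E_sum = expectation(lambda x, y: x + y, P_XY)   # E[X + Y]
E_X = expectation(lambda x, y: x, P_XY)         # E[X]
E_Y = expectation(lambda x, y: y, P_XY)         # E[Y]
print(E_sum, E_X + E_Y, isclose(E_sum, E_X + E_Y))  # both ≈ 1.1: consistent with linearity

The check only illustrates the identity on one distribution; the proof in part (b) must hold for every PXY.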
Problem 2. Consider two jointly distributed discrete random variables X and Y .
(a) Prove that H(Y|X) = 0 if and only if Y is a deterministic function of X, i.e., for all x with PX(x) > 0, there is only one possible value of y such that PXY(x,y) > 0.
(b) Let g(X) be a function of the random variable X. Show that H(g(X)) ≤ H(X).
(c) Show that
H(X|Y) ≤ H(X), with equality if and only if X and Y are independent,
using their respective definitions, i.e., without using the non-negativity of mutual information I(X;Y). [Hint. Prove that f : R+ → R defined by f(t) = t log t is strictly convex and use it.]
(d) Give an example of a probability distribution PXY for which there exists a y ∈ Y such that H(X|Y = y) > H(X).
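The following sketch illustrates the distinction in parts (c) and (d) numerically. The joint PMF is a made-up example (not necessarily the intended answer): given Y = 0, X is always 0; given Y = 1, X is uniform on {0,1}.

from math import log2

P_XY = {(0, 0): 0.75, (1, 0): 0.0, (0, 1): 0.125, (1, 1): 0.125}

def H(dist):
    # Entropy in bits of a dict of probabilities (zero-probability terms skipped).
    return -sum(p * log2(p) for p in dist.values() if p > 0)

P_X = {x: sum(p for (xx, _), p in P_XY.items() if xx == x) for x in (0, 1)}
P_Y = {y: sum(p for (_, yy), p in P_XY.items() if yy == y) for y in (0, 1)}

# H(X|Y=y) for each y, and the average H(X|Y) = sum_y P_Y(y) H(X|Y=y).
H_X_given_y = {}
for y in (0, 1):
    cond = {x: P_XY[(x, y)] / P_Y[y] for x in (0, 1)}
    H_X_given_y[y] = H(cond)
H_X_given_Y = sum(P_Y[y] * H_X_given_y[y] for y in (0, 1))

print(H(P_X), H_X_given_y, H_X_given_Y)
# ≈ 0.544 bits, {0: 0.0, 1: 1.0}, 0.25 bits:
# H(X|Y=1) > H(X) for one particular y, yet H(X|Y) ≤ H(X) on average.

The point: conditioning on a specific outcome y can increase uncertainty about X, but conditioning on Y as a whole never does.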
Problem 3. (a) If X − Y − Z forms a Markov chain, show that I(X;Y|Z) ≤ I(X;Y).
(b) Let X and Y be independent fair binary random variables taking values in {0,1}, and let Z = X + Y. Show that I(X;Y|Z) > I(X;Y).
(c) If X − Y − Z and X − Z − Y form Markov chains, show that I(X;Y) = I(X;Z).
(1 mark)
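A numeric illustration of Problem 3(b) (a sketch under the stated setup, not the required proof): with X, Y independent fair bits and Z = X + Y, conditioning on Z creates dependence between X and Y.

from math import log2
from itertools import product

# Joint PMF of (X, Y, Z) induced by independent fair bits and Z = X + Y.
P = {(x, y, x + y): 0.25 for x, y in product((0, 1), repeat=2)}

def H(probs):
    # Entropy in bits of a collection of probabilities.
    return -sum(p * log2(p) for p in probs if p > 0)

def marginal(keep):
    # Marginal PMF over the coordinates listed in keep, e.g. (0, 2) for (X, Z).
    out = {}
    for xyz, p in P.items():
        key = tuple(xyz[i] for i in keep)
        out[key] = out.get(key, 0.0) + p
    return out

H_X, H_Y, H_Z = (H(marginal((i,)).values()) for i in (0, 1, 2))
H_XY = H(marginal((0, 1)).values())
H_XZ = H(marginal((0, 2)).values())
H_YZ = H(marginal((1, 2)).values())
H_XYZ = H(P.values())

I_XY = H_X + H_Y - H_XY                      # I(X;Y) = 0 (independence)
I_XY_given_Z = H_XZ + H_YZ - H_Z - H_XYZ     # I(X;Y|Z) = 0.5 bit
print(I_XY, I_XY_given_Z)

So here I(X;Y|Z) = 0.5 > 0 = I(X;Y), showing that the inequality of part (a) can fail when X − Y − Z is not a Markov chain.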
Problem 4. Let X1, X2, ..., Xn, and Y be jointly distributed discrete random variables. (a) Prove the chain rule for conditional entropy, i.e.,
H(X1,X2,...,Xn|Y) = Σ_{i=1}^{n} H(Xi|X1,...,Xi−1,Y).
[Hint. Prove H(X1,X2|Y) = H(X1|Y) + H(X2|X1,Y) and use mathematical induction.]
(b) Prove the chain rule for mutual information, i.e.,
I(X1,X2,...,Xn;Y) = Σ_{i=1}^{n} I(Xi;Y|X1,...,Xi−1).
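A small numeric check of the two-variable identity in the hint for Problem 4(a), on an arbitrary randomly generated joint PMF over {0,1}^3 (the distribution is a made-up example). Writing each conditional entropy as a difference of joint entropies makes the identity telescope, which is also the idea behind the inductive proof.

from math import log2
import random
from itertools import product

random.seed(0)
# Random joint PMF over (x1, x2, y) in {0,1}^3, normalized to sum to 1.
weights = {k: random.random() for k in product((0, 1), repeat=3)}
total = sum(weights.values())
P = {k: v / total for k, v in weights.items()}

def H(probs):
    return -sum(p * log2(p) for p in probs if p > 0)

def marginal(keep):
    out = {}
    for k, p in P.items():
        key = tuple(k[i] for i in keep)
        out[key] = out.get(key, 0.0) + p
    return out

H_Y = H(marginal((2,)).values())        # H(Y)
H_X1Y = H(marginal((0, 2)).values())    # H(X1, Y)
H_X1X2Y = H(P.values())                 # H(X1, X2, Y)

lhs = H_X1X2Y - H_Y                      # H(X1, X2 | Y)
rhs = (H_X1Y - H_Y) + (H_X1X2Y - H_X1Y)  # H(X1|Y) + H(X2|X1, Y)
print(abs(lhs - rhs) < 1e-12)            # True for any joint PMF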
