Foundations of Computer and Data Science
Problem 1: Let χ1, χ2 be random variables which, 99% of the time, are independent and Normally (Gaussian) distributed, both with mean 0 and variance 1, and 1% of the time are independent and Normally distributed, both with mean 0 and variance σ² ≠ 1.
a) Compute the joint pdf of the two random variables.
b) Examine whether the two random variables are independent.
c) Give an example of two random variables that are uncorrelated but not independent.
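As a numerical sanity check (not part of the problem statement), the mixture model can be simulated; the value σ² = 9 and the sample size below are arbitrary assumptions. The simulation suggests the pair is uncorrelated yet dependent, since the squares are correlated:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200_000
sigma2 = 9.0  # assumed value for the rare-event variance sigma^2 != 1

# With prob. 0.99 both draws have variance 1; with prob. 0.01 both have variance sigma^2.
mix = rng.random(N) < 0.01
std = np.where(mix, np.sqrt(sigma2), 1.0)
x1 = rng.standard_normal(N) * std
x2 = rng.standard_normal(N) * std

# x1, x2 are uncorrelated ...
print(np.corrcoef(x1, x2)[0, 1])          # close to 0
# ... but not independent: their squares are positively correlated
print(np.corrcoef(x1**2, x2**2)[0, 1])    # clearly positive
```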
Problem 2: Let χ, ζ be random variables related through the equality
ζ = |χ + s|.
a) If the pdf of χ is fχ(x), compute the pdf of ζ when s is a deterministic quantity.
b) Repeat the previous question when s is a random variable independent of χ that takes only the two values 0 and 1, with probabilities 0.2 and 0.8 respectively.
c) Under the assumptions of question b), compute the posterior probability P(s = 0 | ζ = z).
Hint: For the computation of the pdf of a random variable, the simplest way is to start with the computation of the cdf and then take the derivative. For b), use total probability.
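A quick numerical check of the cdf-then-differentiate approach of part a) (not part of the problem statement); it assumes χ is standard normal and an arbitrary shift s = 0.7. For z ≥ 0, F_ζ(z) = P(−z − s ≤ χ ≤ z − s), so differentiating gives f_ζ(z) = fχ(z − s) + fχ(−z − s), which the histogram below matches:

```python
import numpy as np

rng = np.random.default_rng(1)
phi = lambda t: np.exp(-t**2 / 2) / np.sqrt(2 * np.pi)  # assumed f_x: standard normal pdf

s = 0.7                                   # arbitrary assumed deterministic shift
x = rng.standard_normal(1_000_000)
z = np.abs(x + s)

# Empirical density of z versus the formula f_z(z) = f_x(z - s) + f_x(-z - s), z >= 0
hist, edges = np.histogram(z, bins=100, range=(0, 4), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
theory = phi(centers - s) + phi(-centers - s)
print(np.max(np.abs(hist - theory)))      # small: histogram matches the formula
```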
Problem 3: You are given an unfair coin: every time you throw it, the probability of observing “head” is p ∈ (0,1) (and “tail” 1 − p), where p is unknown. Assume that you throw the coin N times and record the sequence of heads (H) and tails (T). If Nh is the number of heads you observed and N − Nh the number of tails, then:
a) Find the probability of the specific sequence you observed as a function of p, N, Nh.
b) Compute the maximum likelihood estimator (MLE) of p given such a sequence. Is the result familiar/expected?
c) Compute the average of your estimate and the mean square error from the true unknown value p. What do you conclude about your estimate as N → ∞?
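A simulation sketch for part c) (not part of the problem statement); the true p = 0.3 and the trial counts are arbitrary assumptions, and the estimator used is the relative frequency Nh/N, which part b) asks you to derive. The output suggests the estimate is unbiased and its mean square error shrinks as N grows:

```python
import numpy as np

rng = np.random.default_rng(2)
p_true = 0.3                      # assumed true head probability
trials = 10_000                   # Monte Carlo repetitions per N

for N in (10, 100, 1000):
    Nh = rng.binomial(N, p_true, size=trials)   # heads in N throws
    p_hat = Nh / N                # candidate MLE: relative frequency of heads
    mse = np.mean((p_hat - p_true) ** 2)
    print(N, p_hat.mean(), mse)   # mean stays near p_true, MSE decreases with N
```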
Problem 4: Consider the random data {x1,...,xN} and assume that they are Markov related as follows:
xn = αxn−1 + wn, n = 2,...,N,
where |α| < 1, {wn} are independent and identically distributed Gaussian random variables with mean 0 and variance 1, independent of x1. Random variable x1 is also Gaussian with mean 0 and variance .
a) Using the fact that linear combinations of Gaussians are also Gaussian, show that all xn are Gaussian.
b) Find the joint probability density function of {x1,...,xN}, assuming α is given.
c) Find the maximum likelihood estimator (MLE) of the parameter α.
Hint: For b), use the fact that {x1, x2 − αx1, ..., xN − αxN−1} are independent Gaussians.
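A simulation sketch (not part of the problem statement) under the assumptions α = 0.6 and x1 ~ N(0, 1); both values are arbitrary choices for illustration. Per the hint, the differences xn − αxn−1 are i.i.d. N(0,1), so maximizing the (conditional) likelihood over α amounts to least squares, giving the estimator below; whether the x1 term also depends on α affects the exact MLE, so treat this as a candidate:

```python
import numpy as np

rng = np.random.default_rng(3)
alpha_true = 0.6                                  # assumed |alpha| < 1
N = 100_000
w = rng.standard_normal(N)
x = np.empty(N)
x[0] = w[0]                                       # assumed x1 ~ N(0, 1)
for n in range(1, N):
    x[n] = alpha_true * x[n - 1] + w[n]           # Markov recursion x_n = a x_{n-1} + w_n

# Least-squares estimator suggested by the hint:
# alpha_hat = sum x_n x_{n-1} / sum x_{n-1}^2
alpha_hat = np.dot(x[1:], x[:-1]) / np.dot(x[:-1], x[:-1])
print(alpha_hat)                                  # close to alpha_true
```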
Problem 5: Assume that you have two hypotheses H0, H1. Under hypothesis H0 your observation vector X = [x1,...,xN]T is Gaussian with mean vector µ0 and covariance matrix Σ0, while under H1 it is again Gaussian with mean µ1 and covariance matrix Σ1. Assume that the two hypotheses are equiprobable (P(H0) = P(H1) = 0.5).
a) Find the likelihood ratio test (LRT) and its equivalent form obtained by taking the logarithm on both sides of the inequality.
b) To what does the latter form reduce when the two covariance matrices are equal (Σ0 = Σ1)?
c) What if, additionally, the two means are equal? Does this make sense?
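A numerical sketch of the equal-covariance case of part b) (not part of the problem statement); the means, the shared covariance, and the 2-dimensional setting are all arbitrary assumptions. With Σ0 = Σ1 = Σ and equal priors, the log-LRT becomes linear in x: decide H1 iff (µ1 − µ0)ᵀΣ⁻¹x exceeds ½(µ1ᵀΣ⁻¹µ1 − µ0ᵀΣ⁻¹µ0). The simulation checks that this rule beats guessing:

```python
import numpy as np

rng = np.random.default_rng(4)
mu0 = np.array([0.0, 0.0])                        # assumed mean under H0
mu1 = np.array([2.0, 1.0])                        # assumed mean under H1
Sigma = np.array([[1.0, 0.3], [0.3, 1.0]])        # assumed shared covariance (Sigma0 = Sigma1)
Sinv = np.linalg.inv(Sigma)

# Linear test from the log-LRT with equal covariances and equal priors
w_vec = Sinv @ (mu1 - mu0)
thresh = 0.5 * (mu1 @ Sinv @ mu1 - mu0 @ Sinv @ mu0)

L = np.linalg.cholesky(Sigma)
n = 100_000
x0 = mu0 + (L @ rng.standard_normal((2, n))).T    # samples under H0
x1s = mu1 + (L @ rng.standard_normal((2, n))).T   # samples under H1
err = 0.5 * np.mean(x0 @ w_vec > thresh) + 0.5 * np.mean(x1s @ w_vec <= thresh)
print(err)                                        # error probability well below 0.5
```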