Problem 1.
Prove that, if p(x, y) = f(x) g(y) for some function f on x only and some function g on y only, then x and y are independent.
Hint: Use p(x, y) = p(x) p(y) to prove independence, which is simply the definition. To obtain p(x) and p(y), use the definition of marginal probability. The rule of thumb is that, when we don't know how to start a proof, it's a good indicator that the proof is simple. Just follow the definitions.
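A quick numerical sanity check (not a proof, and not required for the problem) can build intuition for the discrete case: construct a joint table as f(x) g(y), normalize it, and compare it with the product of its marginals. The arrays f and g below are arbitrary nonnegative choices for this sketch.

```python
import numpy as np

# Build a joint over a small discrete grid as p(x, y) = f(x) g(y),
# normalize it, and verify it equals the product of its marginals.
f = np.array([1.0, 2.0, 3.0])        # arbitrary nonnegative f(x)
g = np.array([0.5, 1.5, 2.0, 1.0])   # arbitrary nonnegative g(y)

joint = np.outer(f, g)
joint /= joint.sum()                  # normalize to a probability table

p_x = joint.sum(axis=1)               # marginal over y
p_y = joint.sum(axis=0)               # marginal over x

assert np.allclose(joint, np.outer(p_x, p_y))  # p(x, y) = p(x) p(y)
```

Note that the normalization constant factors into f and g, which is exactly why marginalizing recovers the factorization.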
Problem 2.
Prove that the expectation is a linear operator:
\mathbb{E}_{x\sim p(x)}[a f(x) + b g(x)] = a\,\mathbb{E}_{x\sim p(x)}[f(x)] + b\,\mathbb{E}_{x\sim p(x)}[g(x)]
You may treat x as a discrete variable.
Hint: Again, use the definition of expectation, \mathbb{E}_{x\sim p(x)}[f(x)] = \sum_x p(x) f(x).
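As a sanity check on the statement (again, not a substitute for the proof), linearity can be verified numerically on a small discrete distribution. The pmf p, the functions f and g, and the constants a and b below are all arbitrary choices for this sketch.

```python
import numpy as np

# Check E[a f(x) + b g(x)] == a E[f(x)] + b E[g(x)] for a discrete x.
x = np.array([0, 1, 2, 3])
p = np.array([0.1, 0.2, 0.3, 0.4])   # a valid pmf: nonnegative, sums to 1

f = x ** 2                            # arbitrary function f(x)
g = np.cos(x)                         # arbitrary function g(x)
a, b = 2.0, -3.0                      # arbitrary constants

lhs = np.sum(p * (a * f + b * g))     # E[a f(x) + b g(x)]
rhs = a * np.sum(p * f) + b * np.sum(p * g)
assert np.isclose(lhs, rhs)
```

The proof itself is the same manipulation: distribute p(x) over the sum and pull the constants out.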
Problems 3 and 4 concern the following setting.
Let x be a continuous random variable uniformly distributed in the interval [a, b], where a and b are unknown parameters.
We have a dataset D = {x_1, ..., x_N}, where each data sample x_i is drawn iid from the above distribution, and we would like to estimate the parameters a and b.
Problem 3.
(a) Give the likelihood of the parameters.
(b) Give the maximum likelihood estimate of the parameters.
Problem 4.
(c) Prove that the MLE is biased in this case.
(d) Prove that the MLE is asymptotically unbiased as N \to \infty.
Hint: A parameter estimate being biased is stated in terms of the current dataset D, although we imagine that D could be repeatedly drawn in a repeatable trial. In other words, we do not assume N goes to infinity in (c). Being asymptotically unbiased means that, as N goes to infinity, the bias becomes smaller and converges to 0.
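A simulation can illustrate (c) and (d), though it proves neither. The true interval [a, b] = [2, 5], the trial count, and the seed below are arbitrary choices; the estimators used are the sample minimum and maximum, which is what the MLE works out to be for a uniform distribution. Averaging the estimates over many repeated datasets approximates their expectation, exposing the bias and its decay with N.

```python
import numpy as np

# Repeatedly draw datasets of size N from Uniform(a, b), estimate
# (a, b) by (min, max) on each dataset, and average to estimate bias.
rng = np.random.default_rng(0)
a_true, b_true = 2.0, 5.0   # arbitrary "unknown" ground truth
trials = 2000               # number of repeated datasets per N

bias = {}
for N in (5, 50, 500):
    samples = rng.uniform(a_true, b_true, size=(trials, N))
    a_hat = samples.min(axis=1)   # estimate of a on each dataset
    b_hat = samples.max(axis=1)   # estimate of b on each dataset
    bias[N] = (a_hat.mean() - a_true, b_hat.mean() - b_true)
    print(f"N={N:4d}  bias(a_hat)={bias[N][0]:+.4f}  "
          f"bias(b_hat)={bias[N][1]:+.4f}")
```

The sample minimum always overshoots a and the maximum always undershoots b for finite N (the bias has a definite sign), and both biases shrink toward 0 as N grows, matching (c) and (d).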