Problem 1
Chapter 2, Exercise 2 (p. 52).
Problem 2
Chapter 2, Exercise 3 (p. 52).
Problem 3
Chapter 2, Exercise 7 (p. 53).
Problem 4 (4 points)
Chapter 10, Exercise 1 (p. 413).
Problem 5
Chapter 10, Exercise 2 (p. 413).
Problem 6
Chapter 10, Exercise 4 (p. 414).
Problem 7
Chapter 10, Exercise 9 (p. 416).
Problem 8
Chapter 3, Exercise 4 (p. 120).
Problem 9
Chapter 3, Exercise 9 (p. 122). In parts (e) and (f), you need only try a few interactions and transformations.
Problem 10
Chapter 3, Exercise 14 (p. 125).
Problem 11
Let $x_1, \dots, x_n$ be a fixed set of input points and $y_i = f(x_i) + \epsilon_i$, where the $\epsilon_i$ are i.i.d. with $\mathbb{E}[\epsilon_i] = 0$ and $\mathrm{Var}(\epsilon_i) = \sigma^2 < \infty$. Prove that the MSE of a regression estimate $\hat{f}$ fit to $(x_1, y_1), \dots, (x_n, y_n)$ for a random test point decomposes into variance, squared bias, and irreducible error components.
Hint: You can apply the bias-variance decomposition proved in class.
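Not part of the proof, but a quick numerical sanity check can make the statement concrete. The sketch below simulates the setup under illustrative assumptions (a sine true function, noise level $\sigma = 0.5$, and a deliberately biased degree-one polynomial fit, none of which come from the problem) and compares the Monte Carlo MSE at one test point with variance + squared bias + $\sigma^2$.

```python
import numpy as np

# Illustrative simulation only: the true function f, the noise level sigma,
# and the degree-1 polynomial estimator are assumptions chosen for this sketch.
rng = np.random.default_rng(0)
f = lambda x: np.sin(x)          # assumed true regression function
sigma = 0.5                      # assumed noise standard deviation
x = np.linspace(0, 3, 20)        # fixed input points x1, ..., xn
x0, n_sims = 1.5, 5000           # test point and number of simulated fits

preds = np.empty(n_sims)
for s in range(n_sims):
    y = f(x) + rng.normal(0, sigma, size=x.size)   # y_i = f(x_i) + eps_i
    coef = np.polyfit(x, y, deg=1)                 # simple (biased) linear fit
    preds[s] = np.polyval(coef, x0)                # prediction at the test point

y0 = f(x0) + rng.normal(0, sigma, size=n_sims)     # independent test responses
mse = np.mean((y0 - preds) ** 2)
decomp = preds.var() + (preds.mean() - f(x0)) ** 2 + sigma ** 2
print(f"MSE: {mse:.4f}   variance + bias^2 + sigma^2: {decomp:.4f}")
```

The two printed numbers should agree up to Monte Carlo error, which is exactly the identity the problem asks you to prove.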
Problem 12
Consider the regression through the origin model (i.e. with no intercept):
$$y_i = \beta x_i + \epsilon_i, \qquad i = 1, \dots, n. \tag{1}$$
(a) (1 point) Find the least squares estimate for β.
(b) (2 points) Assume the $\epsilon_i$ are i.i.d. such that $\mathbb{E}[\epsilon_i] = 0$ and $\mathrm{Var}(\epsilon_i) = \sigma^2$. Find the standard error of the estimate.
(c) (2 points) Find conditions that guarantee that the estimator is consistent. n.b. An estimator $\hat{\beta}_n$ of a parameter $\beta$ is consistent if $\hat{\beta}_n \overset{p}{\to} \beta$, i.e. if the estimator converges to the parameter value in probability.
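For intuition (not a substitute for the derivation), the standard closed forms for this model are $\hat{\beta} = \sum_i x_i y_i / \sum_i x_i^2$ and, under the part (b) assumptions, $\mathrm{SE}(\hat{\beta}) = \sigma / \sqrt{\sum_i x_i^2}$. The short Python sketch below checks the plug-in version on simulated data; the values of $\beta$, $\sigma$, $n$, and the design are made-up assumptions.

```python
import numpy as np

# Numerical sketch for model (1): beta, sigma, n, and the x's are invented here.
rng = np.random.default_rng(1)
beta, sigma, n = 2.0, 1.0, 200
x = rng.uniform(0.5, 3.0, size=n)
y = beta * x + rng.normal(0, sigma, size=n)        # y_i = beta * x_i + eps_i

beta_hat = np.sum(x * y) / np.sum(x ** 2)          # least squares estimate (no intercept)
resid = y - beta_hat * x
sigma_hat2 = np.sum(resid ** 2) / (n - 1)          # noise variance estimate
se_hat = np.sqrt(sigma_hat2 / np.sum(x ** 2))      # plug-in standard error
print(f"beta_hat = {beta_hat:.4f}, SE = {se_hat:.4f}")
```

As $\sum_i x_i^2$ grows, the standard error shrinks, which is the quantity to examine when stating conditions for consistency in part (c).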