(Lange Exercise 7.6) Find by hand the Cholesky decomposition of the matrix.
(Lange Exercise 7.8) Suppose the matrix A = {a_ij} is banded in the sense that a_ij = 0 when |i − j| > d. Prove that the Cholesky decomposition B = {b_ij} also satisfies the band condition b_ij = 0 when |i − j| > d.
Small Bonus: what can we tell about sparse matrices in general?
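Before writing the proof, it can help to see the band property numerically. The sketch below (my own illustration, not part of the exercise) builds a tridiagonal positive definite matrix (d = 1) and checks that its lower Cholesky factor has the same bandwidth:

```r
# Numerical illustration of the band property (not a proof):
# a symmetric, diagonally dominant tridiagonal matrix is positive definite.
set.seed(1)
n <- 6
A <- diag(4, n)
for (i in 1:(n - 1)) A[i, i + 1] <- A[i + 1, i] <- 1   # band with d = 1
B <- t(chol(A))                  # lower-triangular factor, A = B %*% t(B)
outside <- abs(row(B) - col(B)) > 1
max(abs(B[outside]))             # entries outside the band are zero
```

The same experiment with a general sparse (but not banded) matrix shows why the bonus question is subtle: Cholesky factors of sparse matrices can suffer "fill-in" outside the original sparsity pattern.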
(Lange Exercise 7.11) If X = QR is the QR decomposition of X, where X has linearly independent columns, then show that the projection matrix
X(X^T X)^{-1} X^T = Q Q^T.
In addition, show that |det(X)| = |det(R)| when X is square, and in general that det(X^T X) = [det(R)]^2.
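A quick numerical sanity check of both identities (my own illustration; the exercise itself asks for a proof):

```r
# Verify X (X^T X)^{-1} X^T = Q Q^T and det(X^T X) = det(R)^2 on random data
set.seed(2)
X <- matrix(rnorm(20), nrow = 5, ncol = 4)   # full column rank (a.s.)
qrX <- qr(X)
Q <- qr.Q(qrX)
R <- qr.R(qrX)
P1 <- X %*% solve(crossprod(X)) %*% t(X)     # hat matrix via normal equations
P2 <- tcrossprod(Q)                          # Q %*% t(Q)
max(abs(P1 - P2))                            # near zero
c(det(crossprod(X)), det(R)^2)               # equal up to rounding
```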
(Lange Exercise 8.4, modified) Show that the reflection matrix is orthogonal and find its eigenvalues and eigenvectors.
(Lange Exercise 8.5, modified) Suppose λ is an eigenvalue of the orthogonal matrix O with corresponding eigenvector v. Show that if v has real-valued entries, then λ = ±1. Hint: use the fact that orthogonal matrices preserve Euclidean norms.
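A numerical illustration of what the exercise claims (my own example): eigenvalues of orthogonal matrices lie on the unit circle, and the real ones are ±1.

```r
# A 2x2 rotation: eigenvalues are complex with modulus 1
theta <- pi / 3
O <- matrix(c(cos(theta), sin(theta), -sin(theta), cos(theta)), 2, 2)
Mod(eigen(O)$values)       # both equal to 1

# A reflection (also orthogonal): eigenvalues are real, +1 and -1
Refl <- matrix(c(0, 1, 1, 0), 2, 2)
eigen(Refl)$values
```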
(Lange Exercise 9.3, modified) It can be shown that the matrix norm induced by the Euclidean (L2) norm of a matrix A is equal to the largest singular value of that matrix. Now, let A be an invertible m×m matrix with singular values σ_1, …, σ_m. Recall that the L2 condition number is defined as cond_2(A) = ||A||_2 ||A^{-1}||_2. Prove that
cond_2(A) = σ_max(A) / σ_min(A).
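The identity is easy to check numerically (my own illustration): base R's `norm(A, "2")` returns the spectral norm, so both sides of the claimed equality can be computed directly.

```r
# Compare the norm-based definition of cond_2 with the singular-value ratio
set.seed(3)
A <- matrix(rnorm(25), 5, 5)                 # invertible a.s.
s <- svd(A)$d                                # singular values, decreasing
cond2_sv     <- max(s) / min(s)              # sigma_max / sigma_min
cond2_direct <- norm(A, "2") * norm(solve(A), "2")
c(cond2_sv, cond2_direct)                    # equal up to rounding
```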
Simulation of multivariate normal random vectors. Write an R function that takes as input an n-dimensional numeric vector µ and an n×n positive definite matrix Σ and returns N realizations from the multivariate normal distribution MVN(µ, Σ), using the Cholesky decomposition.
Document this function and add it to your package.
Create a test case with n = 4 and N = 100 and use sample mean and sample covariance matrices to (somewhat informally) validate your function.
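A minimal sketch of such a simulator, under the standard construction µ + LZ with Σ = LLᵀ; the function name `rmvn_chol` is my own choice, not prescribed by the assignment:

```r
# Draw N realizations from MVN(mu, Sigma) via the Cholesky factor of Sigma
rmvn_chol <- function(N, mu, Sigma) {
  n <- length(mu)
  L <- t(chol(Sigma))                  # Sigma = L %*% t(L), L lower-triangular
  Z <- matrix(rnorm(n * N), nrow = n)  # n x N matrix of standard normals
  t(mu + L %*% Z)                      # affine transform; returns N x n matrix
}

# Informal validation with n = 4, N = 100 (larger N gives tighter agreement)
mu <- c(0, 1, -1, 2)
Sigma <- diag(4) + 0.5 * (1 - diag(4))   # exchangeable covariance, PD
draws <- rmvn_chol(100, mu, Sigma)
colMeans(draws)    # should be near mu
cov(draws)         # should be near Sigma
```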
Download the file csv. The file contains simulated data with a response vector and 5 covariates, including a dummy one for the intercept. In the tasks below, pay attention to the order of operations, so that your computations are performed efficiently. Obtain OLS estimates of the regression coefficients using a QR decomposition (implement in a function, document, and add to the package).
Obtain OLS estimates of the regression coefficients using the SVD (implement in a function, document, and add to the package).
Benchmark the computational efficiency of both implementations and comment on your results.
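Sketches of the two solvers on simulated data (the function names `ols_qr` and `ols_svd`, and the simulated X and y, are my own; the assignment's csv file should be used instead):

```r
# OLS via QR: solve R beta = Q^T y by back-substitution (no explicit inverse)
ols_qr <- function(X, y) {
  qrX <- qr(X)
  backsolve(qr.R(qrX), crossprod(qr.Q(qrX), y))
}

# OLS via SVD: beta = V D^{-1} U^T y
ols_svd <- function(X, y) {
  s <- svd(X)
  s$v %*% (crossprod(s$u, y) / s$d)
}

# Quick check on simulated data with an intercept column
set.seed(4)
X <- cbind(1, matrix(rnorm(100 * 4), 100, 4))
y <- X %*% c(1, 2, -1, 0.5, 3) + rnorm(100)
cbind(ols_qr(X, y), ols_svd(X, y))   # the two estimates should agree

# For the benchmark, something like bench::mark() or microbenchmark()
# (both are CRAN packages) can time the two calls side by side.
```

Note the order of operations: multiplying Qᵀy (a matrix–vector product) before back-solving avoids ever forming XᵀX or an explicit inverse.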
Hint: you may find it helpful to look at R snippets when working on the last two problems:
https://r-snippets.readthedocs.io/en/latest/la/index.html