CS596 - Midterm Exam - Solved

Foundations of Computer and Data Science 

Problem 1: Let V denote the space of all polynomials p(x) of order up to some fixed integer value n.

a) Show that V is a vector space. Specify the addition and scalar multiplication operations.
b) Is V finite dimensional? If yes, what is its dimension?
c) Define a straightforward basis.
d) Define at least three linear subspaces of V.

e) If $p(x) = p_0 + p_1 x + p_2 x^2 + \cdots + p_n x^n$, there is a one-to-one correspondence between $p(x)$ and the vector $[p_0\ p_1\ \cdots\ p_n]^\top$ of its coefficients. Using this correspondence, define an inner product for V and then use it to define a norm for polynomials of order up to n.
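
As a hedged illustration of part e), the following NumPy sketch identifies a polynomial with its coefficient vector and uses the ordinary dot product of coefficient vectors as the inner product; the helper names poly_inner and poly_norm are made up for this example and are not part of the exam.

```python
import numpy as np

# Illustration of Problem 1(e): identify a polynomial of order up to n with
# its coefficient vector [p0, p1, ..., pn] and take the ordinary dot product
# of the coefficient vectors as the inner product on V.
def poly_inner(p_coeffs, q_coeffs):
    """Inner product <p, q> of two polynomials given as coefficient vectors."""
    p = np.asarray(p_coeffs, dtype=float)
    q = np.asarray(q_coeffs, dtype=float)
    return float(p @ q)

def poly_norm(p_coeffs):
    """Norm induced by the inner product: ||p|| = sqrt(<p, p>)."""
    return np.sqrt(poly_inner(p_coeffs, p_coeffs))

# Example with n = 1: p(x) = 1 + 2x and q(x) = 3 - x.
p, q = [1.0, 2.0], [3.0, -1.0]
print(poly_inner(p, q))  # 1*3 + 2*(-1) = 1
print(poly_norm(p))      # sqrt(1^2 + 2^2) ~ 2.236
```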

Problem 2: If Q is a real symmetric matrix of dimensions $k \times k$, with eigenvalues $\rho_1 \ge \rho_2 \ge \cdots \ge \rho_k$, which are real, then we recall that we have already proved that for any real vector X we have

$$\rho_1\, X^\top X \ \ge\ X^\top Q X \ \ge\ \rho_k\, X^\top X.$$

a) Using the special eigen-decomposition of real symmetric matrices, extend the previous inequalities to complex vectors X as follows

$$\rho_1\, X^{*\top} X \ \ge\ X^{*\top} Q X \ \ge\ \rho_k\, X^{*\top} X,$$

where $X^*$ denotes the conjugate of $X$.
b) If A is a square matrix of dimensions $k \times k$ with real elements, denote by $\lambda_1, \ldots, \lambda_k$ its eigenvalues, which may be complex numbers (with the corresponding eigenvectors complex vectors), and by $\sigma_1 \ge \sigma_2 \ge \cdots \ge \sigma_k$ its singular values, which are real and nonnegative. Using part a), show that all eigenvalues $\lambda_i$ satisfy

$$\sigma_1 \ \ge\ |\lambda_i| \ \ge\ \sigma_k.$$

Hint: The $\sigma_i^2$ are the eigenvalues of the symmetric matrix $A^\top A$.
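
The inequality in part b) can be checked numerically. The sketch below draws a random real matrix with NumPy and verifies that every eigenvalue magnitude lies between the largest and smallest singular values; it is an illustration of the claim, not a proof.

```python
import numpy as np

# Numerical check of Problem 2(b): for a real square matrix A, every
# eigenvalue magnitude |lambda_i| lies between sigma_k and sigma_1.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))

eigvals = np.linalg.eigvals(A)                    # possibly complex
sigmas = np.linalg.svd(A, compute_uv=False)       # descending, nonnegative

sigma_max, sigma_min = sigmas[0], sigmas[-1]
assert np.all(np.abs(eigvals) <= sigma_max + 1e-12)
assert np.all(np.abs(eigvals) >= sigma_min - 1e-12)
print(sigma_min, np.sort(np.abs(eigvals)), sigma_max)
```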

Problem 3: A square matrix P is called a projection if $P^2 = P$.

a) Show that the eigenvalues of P are either 0 or 1.
b) Show that if P is a projection, so is $I - P$, where I is the identity matrix.
c) If P is also symmetric, $P^\top = P$, then P is called an orthogonal projection. Prove that for an orthogonal projection P and any vector X, the vectors $X - PX$ and $PX$ are orthogonal.
d) If the two matrices A, B have the same dimensions $m \times n$, show that $P = A(B^\top A)^{-1}B^\top$ is a projection matrix. What is the condition on the dimensions $m, n$ and on the product $B^\top A$ for this P to be well defined? When is this matrix an orthogonal projection?
e) If b is a fixed vector of length m and $\hat b$ some arbitrary vector, we are interested in minimizing the squared distance $\min_{\hat b} \|b - \hat b\|^2$, where $\|\cdot\|$ is the Euclidean norm. To avoid the trivial solution we constrain $\hat b$ to satisfy $\hat b = AX$, where A is a matrix of dimensions $m \times n$ with $m > n$ and X an arbitrary vector of length n. Show that the optimum $\hat b$ is the orthogonal projection of b under some proper projection matrix, which you must identify.
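
A small numerical illustration, assuming the special case B = A of part d), so that $P = A(A^\top A)^{-1}A^\top$: the sketch checks that this P is idempotent and symmetric and that the residual $b - Pb$ is orthogonal to $Pb$, consistent with parts c)-e). The dimensions and random data below are arbitrary.

```python
import numpy as np

# Illustration of Problem 3, under the assumption B = A in part d):
# P = A (A^T A)^{-1} A^T is then a symmetric projection, and the
# least-squares fit b_hat = P b leaves b - b_hat orthogonal to b_hat.
rng = np.random.default_rng(1)
m, n = 6, 3                       # m > n, A^T A assumed invertible
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

P = A @ np.linalg.inv(A.T @ A) @ A.T
b_hat = P @ b

print(np.allclose(P @ P, P))                 # P^2 = P  (projection)
print(np.allclose(P.T, P))                   # symmetric -> orthogonal projection
print(np.isclose((b - b_hat) @ b_hat, 0.0))  # residual orthogonal to b_hat
```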

Problem 4: As discussed in class, the space of all random variables constitutes a vector space. We can also define an inner product (also mentioned in class) between two random variables $x, y$ using the expectation of their product

$$\langle x, y \rangle = E[xy].$$

Consider now the random variables $x, z, w$. We are interested in linear combinations of the form $\hat x = az + bw$, where $a, b$ are real deterministic quantities.

a) Using the orthogonality principle, find the $\hat x^*$ (equivalently, the optimum coefficients $a^*, b^*$) that is closest to x in the sense of the norm induced by the inner product.
b) Compute the optimum (minimum) distance and the optimum approximation $\hat x^*$ in terms of $E[xz], E[xw], E[z^2], E[zw], E[w^2]$.
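
As a sketch of part a), the orthogonality principle requires the error $x - \hat x$ to be orthogonal to both z and w, which gives a 2x2 system in the second moments $E[z^2], E[zw], E[w^2], E[xz], E[xw]$. In the NumPy example below the expectations are replaced by sample averages over made-up data, so the particular signals and numbers are illustrative only.

```python
import numpy as np

# Sketch for Problem 4: <x - x_hat, z> = 0 and <x - x_hat, w> = 0 lead to
#   [E[z^2]  E[zw] ] [a]   [E[xz]]
#   [E[zw]   E[w^2]] [b] = [E[xw]]
# Expectations are estimated here by sample averages over synthetic data.
rng = np.random.default_rng(2)
N = 100_000
z = rng.standard_normal(N)
w = 0.5 * z + rng.standard_normal(N)               # correlated with z
x = 2.0 * z - 1.0 * w + 0.1 * rng.standard_normal(N)

G = np.array([[np.mean(z * z), np.mean(z * w)],
              [np.mean(z * w), np.mean(w * w)]])
c = np.array([np.mean(x * z), np.mean(x * w)])

a_opt, b_opt = np.linalg.solve(G, c)               # optimum coefficients
x_hat = a_opt * z + b_opt * w
print(a_opt, b_opt)                                # close to 2.0 and -1.0
print(np.mean((x - x_hat) ** 2))                   # minimum squared distance
```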
