1. Conceptual questions [20 + 5 points].
1. (5 points) Please prove that the first principal component direction $v$ corresponds to the eigenvector of the largest eigenvalue of the sample covariance matrix $C$:
$v = \arg\max_{\|v\|=1} v^T C v.$
2. (5 points) Based on your answer to the question above, explain how to further find the second principal component direction. (A numerical sanity check for items 1 and 2 is sketched after item 5.)
3. (5 points) [The statement of this question and its equations were garbled in the source], respectively. Please show the work for your derivations in full detail.
4. (5 points) Explain the three key ideas in ISOMAP (for manifold learning and non-linear dimensionality reduction). (A code sketch of the three steps appears after item 5.)
5. (Bonus 5 points) How do outliers affect the performance of PCA? You can create numerical examples to study and show this; one such example is sketched below.
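For items 1 and 2, a minimal numerical sanity check in Python with NumPy (not a substitute for the requested proofs, and using arbitrary synthetic data): the top eigenvector of $C$ should attain the largest Rayleigh quotient $v^T C v$ among unit vectors, and deflating $C$ by its top component should expose the second principal direction.

```python
# Sanity check: top eigenvector maximizes v^T C v; deflation gives the 2nd PC.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3)) @ np.diag([3.0, 1.0, 0.3])  # toy data
Xc = X - X.mean(axis=0)
C = Xc.T @ Xc / Xc.shape[0]                               # sample covariance

vals, vecs = np.linalg.eigh(C)   # eigenvalues in ascending order
v1 = vecs[:, -1]                 # first principal direction

# No random unit vector attains a larger Rayleigh quotient than v1.
V = rng.normal(size=(10000, 3))
V /= np.linalg.norm(V, axis=1, keepdims=True)
print((V @ C * V).sum(axis=1).max() <= v1 @ C @ v1)       # True

# Deflation: remove the top component; the new top eigenvector is the
# second principal direction, orthogonal to v1.
C2 = C - vals[-1] * np.outer(v1, v1)
v2 = np.linalg.eigh(C2)[1][:, -1]
print(np.isclose(abs(v2 @ vecs[:, -2]), 1.0), np.isclose(v1 @ v2, 0.0))
```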
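For item 4, a compact sketch of the usual three ISOMAP steps (nearest-neighbor graph, graph shortest paths as geodesic distances, classical MDS), assuming a generic data matrix `X`; `kneighbors_graph` and `shortest_path` are standard scikit-learn/SciPy utilities.

```python
# A sketch of the ISOMAP pipeline, not a full implementation.
import numpy as np
from scipy.sparse.csgraph import shortest_path
from sklearn.neighbors import kneighbors_graph

def isomap_sketch(X, n_neighbors=10, n_components=2):
    # Step 1: nearest-neighbor graph with Euclidean edge weights.
    G = kneighbors_graph(X, n_neighbors, mode="distance")
    # Step 2: geodesic distances approximated by graph shortest paths.
    D = shortest_path(G, directed=False)
    # Step 3: classical MDS on the geodesic distance matrix.
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    B = -0.5 * J @ (D ** 2) @ J           # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(B)
    idx = np.argsort(vals)[::-1][:n_components]
    # Assumes the leading eigenvalues are positive and the graph is connected.
    return vecs[:, idx] * np.sqrt(vals[idx])
```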
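For the bonus item, one hedged numerical example of the suggested kind: a single extreme point can rotate the first principal direction. The data distribution and the outlier location are arbitrary choices.

```python
# One outlier rotating the first principal direction.
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(100, 2)) @ np.array([[3.0, 0.0], [0.0, 0.5]])

def first_pc(X):
    Xc = X - X.mean(axis=0)
    _, vecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
    return vecs[:, -1]

v_clean = first_pc(X)                   # roughly the x-axis
X_out = np.vstack([X, [[0.0, 50.0]]])   # add a single extreme outlier
v_out = first_pc(X_out)                 # now pulled toward the y-axis
print(abs(v_clean @ v_out))             # far from 1: the direction changed
```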
2. PCA: Food consumption in European countries [20 points].
The data food-consumption.csv contains 16 countries in Europe and their consumption of 20 food items, such as tea, jam, coffee, and yogurt. We will perform principal component analysis to explore the data. In this question, please implement PCA by writing your own code (you may use basic packages, e.g., for numerical linear algebra and reading data, in your file).
First, we will perform PCA on the data by treating each country's food consumption as its "feature" vector. In other words, we will find weight vectors to combine the 20 food-item consumptions for each country.
(a) (10 points) For this problem of performing PCA on countries by treating each country's food consumption as its "feature" vector, explain how the data matrix is set up in this case (i.e., what the rows and the columns of the matrix correspond to). Now extract the first two principal components for each data point (thus, we will represent each data point using a two-dimensional vector). Draw a scatter plot of the two-dimensional representations of the countries using their two principal components. Mark the countries on the plot (you can do this by hand if you want). Please explain any pattern you observe in the scatter plot. (A code sketch covering parts (a) and (b) appears after part (b).)
(b) (10 points) Now, we will perform PCA on the data by treating country consumptions as "feature" vectors for each food item. In other words, we will now find weight vectors to combine country consumptions for each food item, performing PCA the other way. Project the data to obtain their two principal components (thus, again each data point, one per food item, can be represented using a two-dimensional vector). Draw a scatter plot of the food items. Mark the food items on the plot (you can do this by hand if you want). Please explain any pattern you observe in the scatter plot, as sketched below.
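A minimal own-code PCA sketch covering parts (a) and (b), assuming food-consumption.csv has one row per country with the country name in the first column and the 20 food items in the remaining columns (the exact file layout is an assumption).

```python
# Own-code PCA applied two ways: countries as data points, then food items.
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

def pca_2d(X):
    """Center, eigendecompose the sample covariance, project onto top 2 PCs."""
    Xc = X - X.mean(axis=0)
    C = Xc.T @ Xc / Xc.shape[0]
    _, vecs = np.linalg.eigh(C)     # eigenvalues ascending
    W = vecs[:, [-1, -2]]           # top-2 principal directions
    return Xc @ W

df = pd.read_csv("food-consumption.csv")
labels = df.iloc[:, 0]              # country names (assumed first column)
X = df.iloc[:, 1:].to_numpy(float)  # countries x food items

for Z, names, title in [(pca_2d(X), labels, "countries"),
                        (pca_2d(X.T), df.columns[1:], "food items")]:
    plt.figure()
    plt.scatter(Z[:, 0], Z[:, 1])
    for (x, y), name in zip(Z, names):
        plt.annotate(name, (x, y))  # mark each point with its label
    plt.title(f"Top-2 PCs: {title}")
plt.show()
```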
3. Order of faces using ISOMAP [25 points].
(c) (10 points) Perform PCA (you can now use your implementation written in Question 2) on the images and project them onto the top 2 principal components. Again show them on a scatter plot. Explain whether or not you see a more meaningful projection using ISOMAP than PCA. (A comparison sketch follows.)
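A hedged sketch for part (c), assuming `images` is the n-by-p matrix of vectorized face images built in the earlier parts of this question, and `pca_2d` is the own-code helper from the Question 2 sketch; `sklearn.manifold.Isomap` is used only for the ISOMAP side of the comparison.

```python
# Side-by-side scatter plots of the two 2-D projections of the face images.
import matplotlib.pyplot as plt
from sklearn.manifold import Isomap

Z_pca = pca_2d(images)                          # own-code PCA projection
Z_iso = Isomap(n_neighbors=10, n_components=2).fit_transform(images)

fig, axes = plt.subplots(1, 2, figsize=(10, 4))
for ax, Z, name in [(axes[0], Z_pca, "PCA"), (axes[1], Z_iso, "ISOMAP")]:
    ax.scatter(Z[:, 0], Z[:, 1], s=8)
    ax.set_title(name)
plt.show()
```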
4. Eigenfaces and simple face recognition [25 points].
This question is a simplified illustration of using PCA for face recognition. We will use a subset of data from the famous Yale Face dataset.
Remark: You will have to downsample each image by a factor of 4 to turn it into a lower-resolution image as a preprocessing step (e.g., reduce a picture of size 16-by-16 to 4-by-4). In this question, you can implement your own code or call packages.
First, given a set of images for each person, we generate the eigenfaces using these images. You will treat each picture of the same person as one data point for that person. Note that you will first vectorize each image, which was originally a matrix. Thus, the data matrix for each person has one row per vectorized picture. You will find weight vectors to combine the pictures to extract the different "eigenfaces" that correspond to the first few principal components of that person's pictures. (A preprocessing sketch follows.)
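A sketch of the preprocessing under stated assumptions: the .gif paths for one subject are collected in a hypothetical list `paths`, images are read with PIL, and the factor-4 downsampling keeps every 4th pixel in each dimension (block-averaging would be an equally valid choice).

```python
# Build the per-subject data matrix: downsample, vectorize, stack.
import numpy as np
from PIL import Image

def load_subject(paths):
    rows, shape = [], None
    for p in paths:
        img = np.asarray(Image.open(p).convert("L"), dtype=float)
        img = img[::4, ::4]          # downsample by a factor of 4
        shape = img.shape            # remember shape for reshaping later
        rows.append(img.flatten())   # vectorize the picture
    return np.vstack(rows), shape    # data matrix: one row per picture
```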
(a) (10 points) Perform the analysis on the Yale face dataset for Subject 1 and Subject 2, respectively, using all the images EXCEPT for the two pictures named subject01-test.gif and subject02-test.gif. Plot the first 6 eigenfaces for each subject. When visualizing, please reshape the eigenvectors into proper images. Please explain whether you can see any patterns in the top 6 eigenfaces. (A sketch follows.)
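A sketch for part (a) under the same assumptions, where `load_subject` is the hypothetical helper above and `train_paths_subject1` a hypothetical list of Subject 1's non-test files: the eigenfaces are the top right singular vectors of the centered data matrix, reshaped back into images.

```python
# Compute and display the top 6 eigenfaces for one subject.
import numpy as np
import matplotlib.pyplot as plt

A, shape = load_subject(train_paths_subject1)
Ac = A - A.mean(axis=0)                        # center the pictures
# Rows of Vt are the principal directions (eigenfaces) of the covariance.
_, _, Vt = np.linalg.svd(Ac, full_matrices=False)

fig, axes = plt.subplots(1, 6, figsize=(12, 2))
for i, ax in enumerate(axes):
    ax.imshow(Vt[i].reshape(shape), cmap="gray")
    ax.set_title(f"eigenface {i + 1}")
    ax.axis("off")
plt.show()
```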
(b) (10 points) Now we will perform a simple face recognition task.
Face recognition through PCA proceeds as follows. Given the test images subject01-test.gif and subject02-test.gif, first downsample each by a factor of 4 (as before) and vectorize it. Take the top eigenface of Subject 1 and Subject 2, respectively. Then calculate the projection residual of the two vectorized test images with respect to the vectorized eigenfaces:
$s_{ij} = \|(\text{test image})_j - (\text{eigenface}_i)(\text{eigenface}_i)^T(\text{test image})_j\|_2^2.$
Report all four scores: $s_{ij}$, $i = 1, 2$, $j = 1, 2$. Explain how to recognize the faces of the test images using these scores. (A sketch of this computation follows.)
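A sketch of the four residual scores, assuming `eigenfaces[i]` holds the top eigenface of Subject i+1 and `tests[j]` the vectorized, downsampled test image j+1 (both hypothetical names from the sketches above).

```python
# Projection residuals of each test image against each subject's eigenface.
import numpy as np

scores = np.zeros((2, 2))
for i in range(2):
    v = eigenfaces[i] / np.linalg.norm(eigenfaces[i])  # unit-norm eigenface
    for j in range(2):
        x = tests[j]
        # Squared norm of the part of x not explained by eigenface i.
        scores[i, j] = np.linalg.norm(x - v * (v @ x)) ** 2
print(scores)  # a small s_ij suggests test image j belongs to subject i
```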
(c) (5 points) Comment on whether your face recognition algorithm works well, and discuss how you would improve it if possible.
5. To subtract or not to subtract, that is the question [10 points]. In PCA, we subtract the mean to form the sample covariance matrix
$C = \frac{1}{m}\sum_{i=1}^{m}(x^i - \mu)(x^i - \mu)^T$
before finding the weight vectors, where $\mu = \frac{1}{m}\sum_{i=1}^{m} x^i$ is the sample mean. For instance, we let $w_1$ be such that
$C w_1 = \lambda_1 w_1,$
where $\lambda_1$ is the largest eigenvalue of $C$, and $w_1$ is the corresponding eigenvector.
Now suppose Prof. X insists on not subtracting the mean, and uses the eigenvectors of
$\tilde{C} = \frac{1}{m}\sum_{i=1}^{m} x^i (x^i)^T$
to form the weight vectors. For instance, she lets $\tilde{w}_1$ be such that
$\tilde{C}\tilde{w}_1 = \tilde{\lambda}_1 \tilde{w}_1,$
where $\tilde{\lambda}_1$ is the largest eigenvalue of $\tilde{C}$.
Now the question is: are they the same (with and without subtracting the mean)? Is $w_1$ equal or not equal to $\tilde{w}_1$? Use a mathematical argument to justify your answer. (A numerical experiment is sketched below.)
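A numerical experiment (not the requested mathematical argument, and on arbitrary synthetic data) that may guide the answer: the data mean is placed far from the origin in a direction different from the top-variance direction.

```python
# Compare the top eigenvector with and without mean subtraction.
import numpy as np

rng = np.random.default_rng(3)
# Variance is largest along y, but the mean sits far away along x.
X = rng.normal(size=(300, 2)) @ np.diag([0.5, 2.0]) + np.array([10.0, 0.0])
mu = X.mean(axis=0)

C = (X - mu).T @ (X - mu) / X.shape[0]   # centered covariance
C_tilde = X.T @ X / X.shape[0]           # uncentered second-moment matrix

w1 = np.linalg.eigh(C)[1][:, -1]
w1_tilde = np.linalg.eigh(C_tilde)[1][:, -1]
print(abs(w1 @ w1_tilde))  # equal directions would give exactly 1
```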