COEN240-Homework 6 Solved

Problem 1. You are given a face image database of 10 subjects, each with 10 images of 112 × 92 pixels. Convert each image to a vector of length D = 112 × 92 = 10304.
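The flattening step above can be sketched as follows. This is a minimal sketch assuming the images have already been loaded as 112 × 92 NumPy arrays (the loading itself, e.g. with PIL or imageio, depends on your file format); `images_to_matrix` and the zero-image placeholders are illustrative names, not part of the assignment.

```python
import numpy as np

def images_to_matrix(images):
    """Flatten each 112x92 image into a length-10304 row vector."""
    return np.stack([img.reshape(-1) for img in images]).astype(float)

# placeholder stand-in for the real data: 100 images (10 subjects x 10 each)
imgs = [np.zeros((112, 92)) for _ in range(100)]
X = images_to_matrix(imgs)
print(X.shape)  # one row per image, D = 10304 columns
```

Each row of the resulting matrix is one data point, which is the layout sklearn's estimators expect.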

1.1 Apply the principal-component analysis (PCA) method to the data set for face feature extraction. Use different rank values d = 1, 2, 3, 6, 10, 20, and 30 (i.e., find d principal components). Project the face images onto the rank-d subspace (i.e., onto the d principal components) and apply the nearest-neighbor classifier in the projection space. Plot the recognition accuracy rate (number of correct classifications / total number of test cases × 100%) versus the different d values.
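The PCA-plus-nearest-neighbor pipeline for one value of d can be sketched as below. `pca_nn_accuracy` is an illustrative helper (not part of the assignment), and the random toy data stands in for the real train/test split of face images.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier

def pca_nn_accuracy(X_train, y_train, X_test, y_test, d):
    pca = PCA(n_components=d).fit(X_train)   # learn d principal components
    Z_train = pca.transform(X_train)         # project onto the rank-d subspace
    Z_test = pca.transform(X_test)
    nn = KNeighborsClassifier(n_neighbors=1).fit(Z_train, y_train)
    return nn.score(Z_test, y_test)          # fraction correctly classified

# toy demo: 10 classes, 10 samples each, class-dependent means
rng = np.random.default_rng(0)
y = np.repeat(np.arange(10), 10)
X = rng.normal(size=(100, 50)) + y[:, None]
train = np.arange(100) % 10 < 8              # 8 train / 2 test per class
acc = pca_nn_accuracy(X[train], y[train], X[~train], y[~train], d=3)
print(acc)
```

`score` already returns the fraction of correct classifications, so multiplying by 100 gives the accuracy rate asked for in the plot.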

1.2 Use Fisher's Linear Discriminant (FLD) method to find the projection directions that reduce the face image dimension, followed by a nearest-neighbor classifier to perform face recognition. Before applying FLD, first use PCA to reduce the dimensionality of the face images to d0 = 40. The final reduced dimension of the images is d = 1, 2, 3, 6, 10, 20, 30.

 

Use the PCA and LinearDiscriminantAnalysis classes from sklearn. Mean subtraction is not needed, since both classes center the data internally. The following code snippet is for your reference:

 

from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

pca0 = PCA(n_components=d0)
pca0_operator = pca0.fit(L)  # L is the training data set; each row is one training image
L0 = pca0_operator.transform(L)  # reduced-dimension data: rows are data points

# input to lda is the reduced-dimension data from pca:
lda = LinearDiscriminantAnalysis(n_components=d)  # FLD / LDA
lda_operator = lda.fit(filled by yourself)
train_proj_lda = lda_operator.transform(filled by yourself).transpose()  # columns are examples

Run 20 independent experiments. In each experiment, randomly choose 8 images per class to form the training set; the remaining images form the test set. Plot the recognition accuracy rate (number of correct classifications / total number of test cases × 100%) versus the different d values for both the PCA and FLD methods on the same figure with different colors, show the legends for both curves, and label the x-axis and y-axis. Comment on the results.
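The full 20-run experiment can be sketched as below. The random toy data is a stand-in for the real face matrix X and labels y; all variable names are illustrative. One caveat worth noting: sklearn's LinearDiscriminantAnalysis caps n_components at n_classes − 1 (here 9), so for d = 10, 20, 30 the FLD projection is clamped in this sketch.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier

# toy stand-in for the face data: 10 classes, 10 samples each, 64 features
rng = np.random.default_rng(0)
y = np.repeat(np.arange(10), 10)
X = rng.normal(size=(100, 64)) + 2.0 * y[:, None]

ds = [1, 2, 3, 6, 10, 20, 30]
d0, n_runs = 40, 20  # toy data has >= d0 features, so PCA(d0) is valid
acc_pca = np.zeros((n_runs, len(ds)))
acc_fld = np.zeros((n_runs, len(ds)))

for run in range(n_runs):
    # randomly choose 8 images per class for training
    train_mask = np.zeros(len(y), dtype=bool)
    for c in range(10):
        idx = np.flatnonzero(y == c)
        train_mask[rng.choice(idx, size=8, replace=False)] = True
    Xtr, ytr = X[train_mask], y[train_mask]
    Xte, yte = X[~train_mask], y[~train_mask]

    for j, d in enumerate(ds):
        # PCA + 1-NN
        pca = PCA(n_components=d).fit(Xtr)
        nn = KNeighborsClassifier(n_neighbors=1).fit(pca.transform(Xtr), ytr)
        acc_pca[run, j] = nn.score(pca.transform(Xte), yte)

        # PCA to d0, then FLD (clamped to n_classes - 1 = 9), then 1-NN
        pca0 = PCA(n_components=d0).fit(Xtr)
        Ltr, Lte = pca0.transform(Xtr), pca0.transform(Xte)
        lda = LinearDiscriminantAnalysis(n_components=min(d, 9)).fit(Ltr, ytr)
        nn = KNeighborsClassifier(n_neighbors=1).fit(lda.transform(Ltr), ytr)
        acc_fld[run, j] = nn.score(lda.transform(Lte), yte)

mean_pca = acc_pca.mean(axis=0)  # one mean accuracy per d, averaged over runs
mean_fld = acc_fld.mean(axis=0)
print(mean_pca.shape, mean_fld.shape)
```

Plotting the two mean-accuracy curves against ds with matplotlib's `plot` (different colors), `legend`, `xlabel`, and `ylabel` then completes the required figure.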
