1. Pose Estimation [1]
A calibration block with one circular pocket on its top face is shown in Fig. 1(a) (dimensions in cm). The object coordinate system XoYoZo is defined at one of the lower corners, with its axes aligned with the three orthogonal edges of the block, as shown in Fig. 1(b). An image of the component is captured by a camera at an unknown position and unknown orientation; the image is similar to Fig. 1(b).

[Fig. 1(a): calibration block; Fig. 1(b): object coordinate system and captured image]
Camera parameters:

    f [cm]         0.79
    dx' [cm]       6 × 10⁻⁴
    dy' [cm]       6 × 10⁻⁴
    Cx [pixel]     −
    Cy [pixel]     −
    Ncx, Ncy, Ncz  −
a. Use the Hough Transform to identify the 9 straight edges in ‘block.png’; you may use MATLAB’s built-in HT functions. Find the 9 unit normal vectors n = (af, bf, c)T/|(af, bf, c)T| as defined in [1]. (Note: in the line equation au + bv + c = 0 from [1], (u, v) are physical coordinates on the 2D image plane measured from its center; if your coefficients are obtained in pixel coordinates, perform the appropriate conversion to recover the true ‘a’, ‘b’, and ‘c’.)
b. From the circle–ellipse correspondence, the normal vector at the circle center is found as [Ncx, Ncy, Ncz]T. Together with the results of 1a, carry out pose estimation using line correspondences and the orthogonality constraint. Show the values of [R] and T.
Suggested functions: hough.m, houghpeaks.m, houghlines.m. The nominal value of Tz is 120 cm.
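The pixel-to-physical conversion in the note of part 1a can be sketched as follows (Python/NumPy used only for illustration; the Hough detection itself is left to the suggested MATLAB functions or an equivalent, and the parameter names dxp, dyp, cx, cy are stand-ins for dx', dy', Cx, Cy from the table):

```python
import numpy as np

def line_normal(a_px, b_px, c_px, f, dxp, dyp, cx, cy):
    """Convert a line a_px*u_px + b_px*v_px + c_px = 0 given in pixel
    coordinates to physical image-plane coordinates, then return the
    unit normal n = (a*f, b*f, c)^T / |(a*f, b*f, c)^T| as in [1].

    Substituting u_px = u/dx' + Cx and v_px = v/dy' + Cy gives
    a = a_px/dx', b = b_px/dy', c = a_px*Cx + b_px*Cy + c_px."""
    a = a_px / dxp
    b = b_px / dyp
    c = a_px * cx + b_px * cy + c_px
    n = np.array([a * f, b * f, c])
    return n / np.linalg.norm(n)
```

Applying this to each of the 9 detected lines yields the 9 unit normals needed for the line-correspondence pose estimation in part 1b.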
2. Artificial Neural Network (ANN) based on back-propagation (BP) learning
Design a BP-ANN to recognize the characters O and T on a binary 4×4 square grid.
a. Define your ANN structure by specifying the number of inputs, hidden layers, layer nodes, and outputs. Provide a schematic of your chosen ANN structure.
b. Derive the weight update rule for your ANN, assuming a unipolar sigmoid function for each processing element.
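As a checkpoint for part (b), the derivation with a unipolar sigmoid should reduce to the standard generalized delta rule (notation assumed here: η learning rate, t_k targets, o outputs, w_{jk} the weight from node j to node k, squared-error cost):

```latex
% Unipolar sigmoid and its derivative
\varphi(\mathrm{net}) = \frac{1}{1+e^{-\mathrm{net}}}, \qquad
\varphi'(\mathrm{net}) = \varphi(\mathrm{net})\bigl(1-\varphi(\mathrm{net})\bigr) = o\,(1-o)

% Output-layer delta, for E = \tfrac12 \sum_k (t_k - o_k)^2
\delta_k = (t_k - o_k)\, o_k \,(1 - o_k)

% Hidden-layer delta, back-propagated through the outgoing weights
\delta_j = o_j \,(1 - o_j) \sum_k \delta_k \, w_{jk}

% Weight update rule
\Delta w_{ij} = \eta \, \delta_j \, o_i
```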
c. Train with the data in Fig. 2(a). Show the convergence curve (mean squared error vs. number of epochs). Save the node weights in a “.mat” file.
d. Test with the data in Fig. 2(b) by loading the node weights from your “.mat” file. Show the output values and the classification results.
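A minimal training sketch in Python/NumPy is shown below, under loud assumptions: the O and T patterns are hypothetical stand-ins for Fig. 2(a) (which is not reproduced here), the 16-4-1 structure is only one possible answer to part (a), and saving with NumPy's `savez` stands in for MATLAB's “.mat” file:

```python
import numpy as np

rng = np.random.default_rng(0)
sig = lambda x: 1.0 / (1.0 + np.exp(-x))   # unipolar sigmoid

# Hypothetical stand-ins for the Fig. 2(a) patterns on a 4x4 binary grid
O = np.array([[1,1,1,1],[1,0,0,1],[1,0,0,1],[1,1,1,1]], float).ravel()
T = np.array([[1,1,1,1],[0,1,1,0],[0,1,1,0],[0,1,1,0]], float).ravel()
X = np.stack([O, T])           # 2 samples x 16 inputs
t = np.array([[0.0], [1.0]])   # target: 0 -> 'O', 1 -> 'T'

W1 = rng.normal(0, 0.5, (16, 4)); b1 = np.zeros(4)   # assumed 16-4-1 net
W2 = rng.normal(0, 0.5, (4, 1));  b2 = np.zeros(1)
eta = 0.5
for epoch in range(5000):
    h = sig(X @ W1 + b1)                  # hidden activations
    y = sig(h @ W2 + b2)                  # network output
    d2 = (t - y) * y * (1 - y)            # output delta (sigmoid derivative)
    d1 = (d2 @ W2.T) * h * (1 - h)        # back-propagated hidden delta
    W2 += eta * h.T @ d2; b2 += eta * d2.sum(0)
    W1 += eta * X.T @ d1; b1 += eta * d1.sum(0)

mse = float(((t - y) ** 2).mean())        # track this per epoch for part (c)
np.savez("weights.npz", W1=W1, b1=b1, W2=W2, b2=b2)  # .mat analogue
```

For part (d), the saved weights are reloaded and the test patterns of Fig. 2(b) are pushed through the same two forward-pass lines.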
3. Color
I. Artificial Color Contrast (ACC): Consider two color samples, [225, 88, 96] and [149, 135, 134], representing the target and the noise in Fig. 3 respectively. They are to be separated in color space using the DoG operation discussed in class so that classification can be performed more easily:
h_i(x, y) = G_c ∗ f_j(x, y) − G_s ∗ f_k(x, y)

where f_j(x, y), j = 1, 2, 3, corresponds to the R, G, B component images respectively, and f_k(x, y) is a linear combination of the RGB component images to be designed. (Fig. 3: Chicken)
a. Derive the following equations for f_k(x, y) equal to f_j − (R − G) and f_j − (R + G − B):

    h1(x, y) = G_c ∗ R − G_s ∗ [G]              ha(x, y) = G_c ∗ R − G_s ∗ [B − G]
    h2(x, y) = G_c ∗ G − G_s ∗ [2G − R]         hb(x, y) = G_c ∗ G − G_s ∗ [B − R]
    h3(x, y) = G_c ∗ B − G_s ∗ [B − (R − G)]    hc(x, y) = G_c ∗ B − G_s ∗ [2B − (R + G)]
b. Perform the ACC transformation (σc = 1, σs = 10) on the sample color patterns (100×100 pixels each) with the following channel combinations: 1-2-3, 1-2-c, 1-b-3, 1-b-c, a-2-3, a-2-c, a-b-3, a-b-c.
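The core of the ACC transformation can be sketched with a plain-NumPy separable Gaussian filter (a minimal illustration; in practice MATLAB's `imgaussfilt` or `fspecial` would do the blurring, and the center/surround inputs fed to `acc_channel` follow the equations derived in part (a)):

```python
import numpy as np

def gauss1d(sigma):
    """Normalized 1-D Gaussian kernel truncated at 3 sigma."""
    r = int(3 * sigma)
    x = np.arange(-r, r + 1)
    k = np.exp(-x**2 / (2 * sigma**2))
    return k / k.sum()

def gauss_blur(img, sigma):
    """Separable Gaussian filtering with reflect padding, rows then columns."""
    k = gauss1d(sigma)
    pad = len(k) // 2
    conv = lambda m: np.convolve(np.pad(m, pad, mode='reflect'), k, 'valid')
    out = np.apply_along_axis(conv, 0, img)
    out = np.apply_along_axis(conv, 1, out)
    return out

def acc_channel(f_center, f_surround, sc=1.0, ss=10.0):
    """One ACC channel: h(x,y) = G_c * f_j(x,y) - G_s * f_k(x,y)."""
    return gauss_blur(f_center, sc) - gauss_blur(f_surround, ss)
```

For example, channel h1 uses f_center = R with f_surround = G, while channel ha uses f_center = R with f_surround = B − G.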
II. Color-based Image Segmentation: Color is important information; instances of the same object commonly share a dominant color. The L-a-b color system is a color-opponent space with lightness dimension L and color-opponent dimensions a and b. Color-based image segmentation can be performed by clustering the points in the a-b domain, followed by post-processing. Follow the steps below to segment the target in the RGB image ‘Chicken.jpg’.
Step 1. Convert the pixels from the RGB to the Lab color system.
Step 2. Apply k-means clustering on data in a-b domain with cluster number (k=3).
Step 3. Erode the segmented image to filter out small fragments.
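The clustering core of Step 2 can be sketched as Lloyd's k-means on the (a, b) values (a NumPy illustration only; the RGB-to-Lab conversion of Step 1 and the erosion of Step 3 are left to library routines such as MATLAB's `rgb2lab` and `imerode`, and the naive deterministic initialization below is just for the sketch):

```python
import numpy as np

def kmeans_ab(points, k=3, iters=50):
    """Plain Lloyd's k-means on an N x 2 array of (a, b) values.
    Naive deterministic init: every (N // k)-th point becomes a center."""
    points = np.asarray(points, float)
    centers = points[:: max(1, len(points) // k)][:k].astype(float)
    for _ in range(iters):
        # distance of every point to every center, then nearest-center labels
        d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):                 # skip empty clusters
                centers[j] = points[labels == j].mean(axis=0)
    return labels, centers
```

The resulting label image (one label per pixel) is then reshaped to the image size, and the cluster containing the target is kept before Step 3's erosion.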
III. Principal component analysis (PCA): Use the RGB image ‘Chicken.jpg’ for the following.
a. Determine the covariance matrix of the data.
b. Derive the components (eigenvectors) with eigenvalues arranged in a descending order.
c. Obtain the maximum and minimum values of the three component matrices. Show these three matrices as images, linearly mapping the range [minimum, maximum] to 0–255.
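Parts a–c together can be sketched in a few lines of NumPy (an illustration of the pipeline, not the required MATLAB submission; the image would be loaded from ‘Chicken.jpg’ in practice):

```python
import numpy as np

def pca_components(img):
    """PCA on the N x 3 pixel cloud of an H x W x 3 RGB image.
    Returns eigenvalues in descending order, the eigenvector matrix
    (components as columns), and the three component images linearly
    mapped from their [min, max] to the 0-255 range."""
    X = img.reshape(-1, 3).astype(float)
    Xc = X - X.mean(axis=0)                  # center the data
    C = np.cov(Xc, rowvar=False)             # (a) 3x3 covariance matrix
    w, V = np.linalg.eigh(C)                 # ascending for symmetric C
    order = np.argsort(w)[::-1]              # (b) sort descending
    w, V = w[order], V[:, order]
    P = Xc @ V                               # project onto the components
    lo, hi = P.min(axis=0), P.max(axis=0)    # (c) per-component min/max
    scaled = (P - lo) / np.where(hi > lo, hi - lo, 1.0) * 255.0
    return w, V, scaled.reshape(img.shape)
```

Each channel of the returned array is one component image, already stretched so its minimum maps to 0 and its maximum to 255.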