Model Matching (10 pts): This problem examines the detection of faces by model matching. A face is modeled as a collection of circles and circular arcs, as shown below:

[Figure: a face model composed of circles and circular arcs]
The basic idea is that face detection begins by finding circles and circular arcs in the image, followed by matching against a stored model. Be sure to give short justifications for each answer below.
a. How should the model be represented?
b. How should images be processed to detect the features? (One illustrative approach is sketched after this problem.)
c. How should matching be performed?
d. Is your answer to c. invariant to translation, rotation, and scale?
e. Can it handle partial occlusion? If so, how? If not, suggest an extension to your scheme that can.
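Not part of the assignment itself, but as one illustrative answer to part b: circles and circular arcs are commonly detected with a Hough transform. Below is a minimal Python sketch using OpenCV's cv2.HoughCircles; the filename face.png and all parameter values are assumptions for illustration, not prescribed by the problem.

```python
import cv2
import numpy as np

# Minimal sketch: detect circles with the circular Hough transform.
# "face.png" and all parameter values below are illustrative assumptions.
img = cv2.imread("face.png", cv2.IMREAD_GRAYSCALE)
img = cv2.medianBlur(img, 5)  # suppress noise before accumulator voting

circles = cv2.HoughCircles(
    img,
    cv2.HOUGH_GRADIENT,
    dp=1,          # accumulator resolution = image resolution
    minDist=20,    # minimum distance between detected centers
    param1=100,    # upper Canny edge threshold
    param2=30,     # accumulator vote threshold for a detection
    minRadius=5,
    maxRadius=100,
)

if circles is not None:
    for x, y, r in np.round(circles[0]).astype(int):
        print(f"circle at ({x}, {y}), radius {r}")
```

Detected circles (eyes, mouth, head outline) would then feed the matching stage of part c.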
Interpretation Tree (5 pts): We have a choice of matching detected image elements (edges) to the model or model elements to the image. Let $E$ be the set of detected image edges and $M$ the set of model edges. In the first case, matching image edges to model edges, we generate a tree of depth $|E|$ and breadth $|M|$, with tree size $|M|^{|E|}$. In the second case, matching model edges to image edges, we generate a tree of size $|E|^{|M|}$. We expect many more image elements than model elements: a cluttered scene may contain many candidate image edges, while the model has only a few edges.

a. Which approach is preferable, matching image edges to model edges or model edges to image edges? You might consider the case where there are 12 image edges and 5 model edges, for example. (A tree-size comparison for these numbers is sketched after part b.)
b. One advantage of the interpretation tree approach is that it is possible to match an unknown object in the image to a model even if the object is partially occluded. We do this by allowing an object element to match a “null element” in the model. Does this change your answer to part a.? How and why? Or why not?
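As a quick check of the arithmetic suggested in part a, here is a minimal Python sketch comparing the two worst-case tree sizes for the example numbers (12 image edges, 5 model edges); the variable names are mine, not the problem's.

```python
# Worst-case interpretation-tree sizes for |E| = 12 image edges
# and |M| = 5 model edges (the example from part a).
E, M = 12, 5

image_to_model = M ** E  # each of |E| image edges tries all |M| model labels
model_to_image = E ** M  # each of |M| model edges tries all |E| image edges

print(f"|M|^|E| = {image_to_model:,}")  # 244,140,625
print(f"|E|^|M| = {model_to_image:,}")  # 248,832
```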
Binary Image Matching (2 pts): Let $I_1$ and $I_2$ be binary images. Show that

$$|I_1 - I_2|^2 = \text{the number of pixels where } I_1 \neq I_2,$$

where $|I|^2 = \sum_{ij} I_{ij}^2$ is the sum of the squared pixel values of $I$.
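A numerical sanity check of the identity (not a substitute for the proof), as a minimal NumPy sketch over randomly generated binary images:

```python
import numpy as np

# For binary pixels, (I1 - I2)^2 is 1 exactly where the images differ,
# so summing it counts the differing pixels.
rng = np.random.default_rng(0)
I1 = rng.integers(0, 2, size=(8, 8))
I2 = rng.integers(0, 2, size=(8, 8))

lhs = np.sum((I1 - I2) ** 2)       # |I1 - I2|^2
rhs = np.count_nonzero(I1 != I2)   # number of pixels where I1 != I2
assert lhs == rhs
print(lhs, rhs)
```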
Classification (5 pts): Suppose we have a 2-class classification problem with class means $\vec{\mu}_A$ and $\vec{\mu}_B$. Assuming that both classes are equally likely, show that the Nearest Mean classifier decision boundary is the hyperplane perpendicular to, and midway along, the line segment connecting $\vec{\mu}_A$ to $\vec{\mu}_B$. You do not need to assume any particular distribution for classes A and B.
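As a hint at the shape of the argument (one possible route, not the only one), the boundary consists of points equidistant from the two means, and that condition can be rearranged into a hyperplane equation:

```latex
\|\vec{x}-\vec{\mu}_A\|^2 = \|\vec{x}-\vec{\mu}_B\|^2
\;\Longleftrightarrow\;
2(\vec{\mu}_B-\vec{\mu}_A)\cdot\vec{x} = \|\vec{\mu}_B\|^2 - \|\vec{\mu}_A\|^2
\;\Longleftrightarrow\;
(\vec{\mu}_B-\vec{\mu}_A)\cdot\left(\vec{x}-\tfrac{1}{2}(\vec{\mu}_A+\vec{\mu}_B)\right) = 0,
```

a hyperplane with normal $\vec{\mu}_B - \vec{\mu}_A$ passing through the midpoint $\tfrac{1}{2}(\vec{\mu}_A + \vec{\mu}_B)$.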