25 points
3D reconstruction for 3 cases: full calibration, essential matrix, fundamental matrix
Write a Python notebook that creates a stereo image pair of a simple object described by vertices in world coordinates (e.g., a simple house). The right and left cameras should have different views, and the object should be in the field of view of both. Draw each of these views using Line2D to draw the 2D lines between the projected vertices; do not use any 3D plotting commands. Then write a Python notebook to reconstruct the 3D object using the three methods below.
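One way to set up the stereo pair is sketched below. All the specifics here are illustrative choices, not requirements: the house vertices, the intrinsic matrix K, and the two camera poses (rotations about the y-axis with hand-picked translations) are assumptions you would replace with your own.

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend; drop this line when running interactively
import matplotlib.pyplot as plt
from matplotlib.lines import Line2D

# Illustrative "house" vertices in world coordinates: a box plus a roof ridge.
X = np.array([
    [0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0],    # floor
    [0, 0, 1], [1, 0, 1], [1, 1, 1], [0, 1, 1],    # ceiling
    [0.5, 0, 1.5], [0.5, 1, 1.5],                  # roof ridge
], dtype=float)
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (4, 5), (5, 6), (6, 7), (7, 4),
         (0, 4), (1, 5), (2, 6), (3, 7), (4, 8), (5, 8), (6, 9), (7, 9), (8, 9)]

# Assumed intrinsics (focal length and principal point are made up).
K = np.array([[800, 0, 320], [0, 800, 240], [0, 0, 1]], dtype=float)

def camera_matrix(angle_deg, t):
    """P = K [R | t], with R a rotation about the world y-axis."""
    a = np.deg2rad(angle_deg)
    R = np.array([[np.cos(a), 0, np.sin(a)],
                  [0, 1, 0],
                  [-np.sin(a), 0, np.cos(a)]])
    return K @ np.hstack([R, np.asarray(t, float).reshape(3, 1)])

P_left = camera_matrix(+10, [-0.2, -0.5, 6.0])
P_right = camera_matrix(-10, [-0.8, -0.5, 6.0])

def project(P, X):
    """Project Nx3 world points to Nx2 pixel coordinates."""
    Xh = np.hstack([X, np.ones((len(X), 1))])  # homogeneous coordinates
    x = (P @ Xh.T).T
    return x[:, :2] / x[:, 2:3]                # divide out the third coordinate

fig, axes = plt.subplots(1, 2, figsize=(10, 4))
for ax, P, title in [(axes[0], P_left, "left view"),
                     (axes[1], P_right, "right view")]:
    x = project(P, X)
    for i, j in edges:
        ax.add_line(Line2D([x[i, 0], x[j, 0]], [x[i, 1], x[j, 1]]))
    ax.set_title(title)
    ax.autoscale()
    ax.invert_yaxis()  # image coordinates grow downward
```

The wireframe is drawn edge by edge with Line2D, as the assignment requires, so no 3D plotting commands are involved at this stage.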
1. Assume full calibration, so the right and left camera matrices are known.
2. Assume only the intrinsic parameters are known (i.e., the matrix K is known). You should assume that you are given corresponding point pairs from the right and left images. This part of the assignment takes as input the matrix K and the corresponding point pairs in pixel coordinates.
3. Assume only the corresponding point pairs are known. This part of the assignment takes these point pairs as input.
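The core routines shared by the three cases might look like the sketch below. The function names (`triangulate`, `eight_point`, `projective_cameras`) and the input conventions (Nx2 arrays of pixel correspondences) are illustrative assumptions, not prescribed by the assignment.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one correspondence pair (x1 in view 1, x2 in view 2)."""
    A = np.vstack([x1[0] * P1[2] - P1[0],
                   x1[1] * P1[2] - P1[1],
                   x2[0] * P2[2] - P2[0],
                   x2[1] * P2[2] - P2[1]])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]                       # null vector of A, a homogeneous 3D point
    return X[:3] / X[3]

def eight_point(x1, x2):
    """Normalized 8-point estimate of the fundamental matrix F (x2^T F x1 = 0)."""
    def normalize(x):
        c = x.mean(axis=0)
        s = np.sqrt(2) / np.linalg.norm(x - c, axis=1).mean()
        T = np.array([[s, 0, -s * c[0]], [0, s, -s * c[1]], [0, 0, 1]])
        return np.hstack([x, np.ones((len(x), 1))]) @ T.T, T
    p1, T1 = normalize(x1)
    p2, T2 = normalize(x2)
    A = np.column_stack([p2[:, i] * p1[:, j] for i in range(3) for j in range(3)])
    _, _, Vt = np.linalg.svd(A)
    F = Vt[-1].reshape(3, 3)
    U, S, Vt = np.linalg.svd(F)      # enforce the rank-2 constraint
    F = U @ np.diag([S[0], S[1], 0]) @ Vt
    return T2.T @ F @ T1             # undo the normalization

# Case 1: P_left, P_right known -> triangulate directly (metric reconstruction).
# Case 2: only K known -> E = K.T @ F @ K, decompose E into [R | t] (t up to scale).
# Case 3: only correspondences -> canonical projective camera pair from F:
def projective_cameras(F):
    """P1 = [I | 0], P2 = [[e']_x F | e'] (valid only up to a projective transform)."""
    _, _, Vt = np.linalg.svd(F.T)    # left epipole e': F.T @ e' = 0
    e2 = Vt[-1]
    ex = np.array([[0, -e2[2], e2[1]],
                   [e2[2], 0, -e2[0]],
                   [-e2[1], e2[0], 0]])
    P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = np.hstack([ex @ F, e2.reshape(3, 1)])
    return P1, P2
```

With these pieces, case 1 triangulates against the known camera matrices, case 2 recovers relative pose from E = K&#94;T F K (choosing among the four decompositions the one that puts points in front of both cameras), and case 3 triangulates against the canonical projective pair.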
For each of the three reconstructions, plot the reconstructed object using 3D plotting commands. Comment on the recovered shape in each of the three cases; in particular, discuss the ambiguity remaining in each approach. Hint: Lecture06 reconstructionexample.zip provides much of the needed information, but in MATLAB.
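When discussing the remaining ambiguity, it can help to verify it numerically. In the uncalibrated case, for any invertible 4x4 matrix H, the warped reconstruction (P H&#8315;&#185;, H X) reprojects to exactly the same pixels as (P, X), so the images alone determine the scene only up to a projective transform. A small check of this fact (the particular P, X, and H below are arbitrary illustrative values):

```python
import numpy as np

rng = np.random.default_rng(1)
# An arbitrary camera looking at arbitrary homogeneous points with positive depth.
P = np.hstack([np.eye(3), np.array([[0.0], [0.0], [5.0]])])
X = np.hstack([rng.standard_normal((6, 3)), np.ones((6, 1))])
# An arbitrary, well-conditioned 4x4 projective transform.
H = rng.standard_normal((4, 4)) + 4 * np.eye(4)

def pixels(P, X):
    x = (P @ X.T).T
    return x[:, :2] / x[:, 2:3]

# (P H^-1) applied to (H X) reproduces the original image measurements,
# so no image-based criterion can distinguish the two reconstructions.
same = np.allclose(pixels(P, X), pixels(P @ np.linalg.inv(H), X @ H.T))
print(same)  # True: the projective ambiguity is invisible in the images
```

In the essential-matrix case the analogous residual ambiguity is a similarity (global scale, since t is recovered only up to scale), and with full calibration the reconstruction is metric.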
Note that because there are methods to upgrade a projective reconstruction to a metric reconstruction, the projective ambiguity is not a major problem in modern applications. For further reading, see the following papers: "Modeling the World from Internet Photo Collections", Snavely et al., IJCV 2008; "Visual Modeling with a Hand-Held Camera", Pollefeys et al., IJCV 2004; "DTAM: Dense Tracking and Mapping in Real-Time", Newcombe et al., ICCV 2011; "ORB-SLAM: A Versatile and Accurate Monocular SLAM System", Mur-Artal et al., IEEE Transactions on Robotics 2015; "ORB-SLAM3: An Accurate Open-Source Library for Visual, Visual-Inertial and Multi-Map SLAM", Campos et al., IEEE Transactions on Robotics 2021.