TY - GEN
T1 - 3D reconstruction by a mobile robot using multi-baseline omni-directional motion stereo based on GPS/DR compound navigation system
AU - Meguro, Jun Ichi
AU - Takiguchi, Jun Ichi
AU - Amano, Yoshiharu
AU - Hashizume, Takumi
PY - 2007
Y1 - 2007
N2 - In this paper, a unique dense 3D shape reconstruction method featuring a GPS/DR-coupled omni-directional multi-baseline motion stereo system is presented. The epipolar plane equation between two Omni-Directional Vision (ODV) images is computed from the GPS/DR position and posture information. The precise epipolar lines can then be obtained robustly by projecting the intersection line between the epipolar plane and the ODV image plane. A robust matching method is also presented, featuring hybrid use of feature-based and area-based matching, together with bi-directional matching that keeps only mutually consistent matching points to improve robustness. A multi-baseline voting process is used to reduce the distance error of motion stereo. Hundreds of dense range images are unified based on the precise position/posture information to generate a continuous dense 3D outdoor model. The range estimation accuracy within a 10 [m] range is 140 [mm], which is comparable to that of a laser radar. The proposed omni-directional stereo vision is robust to environmental complexity and provides accurate distance estimation for richly textured objects.
AB - In this paper, a unique dense 3D shape reconstruction method featuring a GPS/DR-coupled omni-directional multi-baseline motion stereo system is presented. The epipolar plane equation between two Omni-Directional Vision (ODV) images is computed from the GPS/DR position and posture information. The precise epipolar lines can then be obtained robustly by projecting the intersection line between the epipolar plane and the ODV image plane. A robust matching method is also presented, featuring hybrid use of feature-based and area-based matching, together with bi-directional matching that keeps only mutually consistent matching points to improve robustness. A multi-baseline voting process is used to reduce the distance error of motion stereo. Hundreds of dense range images are unified based on the precise position/posture information to generate a continuous dense 3D outdoor model. The range estimation accuracy within a 10 [m] range is 140 [mm], which is comparable to that of a laser radar. The proposed omni-directional stereo vision is robust to environmental complexity and provides accurate distance estimation for richly textured objects.
UR - http://www.scopus.com/inward/record.url?scp=43049167064&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=43049167064&partnerID=8YFLogxK
U2 - 10.1109/CCA.2006.286165
DO - 10.1109/CCA.2006.286165
M3 - Conference contribution
AN - SCOPUS:43049167064
SN - 0780397959
SN - 9780780397958
T3 - Proceedings of the IEEE International Conference on Control Applications
SP - 1807
EP - 1812
BT - Proceedings of the 2006 IEEE International Conference on Control Applications
T2 - Joint 2006 IEEE Conference on Control Applications (CCA), Computer-Aided Control Systems Design Symposium (CACSD) and International Symposium on Intelligent Control (ISIC)
Y2 - 4 October 2006 through 6 October 2006
ER -