TY - GEN
T1 - Localization based on multiple visual-metric maps
AU - Sujiwo, Adi
AU - Takeuchi, Eijiro
AU - Morales, Luis Yoichi
AU - Akai, Naoki
AU - Ninomiya, Yoshiki
AU - Edahiro, Masato
PY - 2017/12/7
Y1 - 2017/12/7
N2 - This paper presents the fusion of monocular camera-based metric localization, IMU, and odometry in the dynamic environments of public roads. We build multiple vision-based maps and use them simultaneously in the localization phase. In the mapping phase, visual maps are built using ORB-SLAM together with accurate metric positioning from LiDAR-based NDT scan matching. This external positioning is used to correct the scale drift inherent in all vision-based SLAM methods. In the localization phase, these embedded positions are used to estimate the vehicle pose in metric global coordinates using only a monocular camera. Furthermore, to increase system robustness, we also propose the use of multiple maps and sensor fusion with odometry and IMU via a particle filter. Experimental tests were performed on public roads covering up to 170 km at different times of day to evaluate and compare the localization results of the vision-only, GNSS, and sensor fusion methods. The results show that the sensor fusion method offers lower average errors than GNSS and better coverage than the vision-only method.
AB - This paper presents the fusion of monocular camera-based metric localization, IMU, and odometry in the dynamic environments of public roads. We build multiple vision-based maps and use them simultaneously in the localization phase. In the mapping phase, visual maps are built using ORB-SLAM together with accurate metric positioning from LiDAR-based NDT scan matching. This external positioning is used to correct the scale drift inherent in all vision-based SLAM methods. In the localization phase, these embedded positions are used to estimate the vehicle pose in metric global coordinates using only a monocular camera. Furthermore, to increase system robustness, we also propose the use of multiple maps and sensor fusion with odometry and IMU via a particle filter. Experimental tests were performed on public roads covering up to 170 km at different times of day to evaluate and compare the localization results of the vision-only, GNSS, and sensor fusion methods. The results show that the sensor fusion method offers lower average errors than GNSS and better coverage than the vision-only method.
KW - Robot Vision Systems
KW - Simultaneous Localization and Mapping
UR - http://www.scopus.com/inward/record.url?scp=85042364358&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85042364358&partnerID=8YFLogxK
U2 - 10.1109/MFI.2017.8170431
DO - 10.1109/MFI.2017.8170431
M3 - Conference contribution
AN - SCOPUS:85042364358
T3 - IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems
SP - 212
EP - 219
BT - MFI 2017 - 2017 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 13th IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems, MFI 2017
Y2 - 16 November 2017 through 18 November 2017
ER -