TY - GEN
T1 - A Modest System of Feature-based Stereo Visual Odometry
AU - Gates, William
AU - Jati, Grafika
AU - Intan P, Riskyana Dewi
AU - Pratama, Mahardhika
AU - Jatmiko, Wisnu
N1 - Funding Information:
We would like to thank BRIN and the Universitas Indonesia grant for supporting this research.
Publisher Copyright:
© 2021 IEEE.
PY - 2021
Y1 - 2021
N2 - Tracking incremental movements and accurately analyzing the environment for localization is one of the challenges of autonomous robots. One approach to overcoming this challenge is to estimate the translational and rotational movements of a paired (stereo) camera by analyzing the images it captures, a technique known as Visual Odometry. In this paper, we build a simple feature-based stereo visual odometry system. The system consists of six main parts: detecting features and computing their descriptors using Oriented FAST and Rotated BRIEF (ORB); matching features by brute force based on the Hamming distance between feature descriptors; tracking features using Lucas-Kanade optical flow; triangulating point features using linear triangulation; estimating translation and rotation by solving the Perspective-n-Point (PnP) problem with a combination of Efficient PnP (EPnP) and Random Sample Consensus (RANSAC); and updating the position and orientation estimates. Our system achieves an average translation root mean squared error of 5.1284% and an average rotation error of 0.027 deg/m on the KITTI public odometry dataset, running at 18.88 frames per second on a single-core computer environment with a clock speed of 2.7 GHz.
AB - Tracking incremental movements and accurately analyzing the environment for localization is one of the challenges of autonomous robots. One approach to overcoming this challenge is to estimate the translational and rotational movements of a paired (stereo) camera by analyzing the images it captures, a technique known as Visual Odometry. In this paper, we build a simple feature-based stereo visual odometry system. The system consists of six main parts: detecting features and computing their descriptors using Oriented FAST and Rotated BRIEF (ORB); matching features by brute force based on the Hamming distance between feature descriptors; tracking features using Lucas-Kanade optical flow; triangulating point features using linear triangulation; estimating translation and rotation by solving the Perspective-n-Point (PnP) problem with a combination of Efficient PnP (EPnP) and Random Sample Consensus (RANSAC); and updating the position and orientation estimates. Our system achieves an average translation root mean squared error of 5.1284% and an average rotation error of 0.027 deg/m on the KITTI public odometry dataset, running at 18.88 frames per second on a single-core computer environment with a clock speed of 2.7 GHz.
KW - localization
KW - motion estimation
KW - stereo vision
KW - visual odometry
UR - http://www.scopus.com/inward/record.url?scp=85124354529&partnerID=8YFLogxK
U2 - 10.1109/IWBIS53353.2021.9631865
DO - 10.1109/IWBIS53353.2021.9631865
M3 - Conference contribution
AN - SCOPUS:85124354529
T3 - Proceedings - IWBIS 2021: 6th International Workshop on Big Data and Information Security
SP - 47
EP - 52
BT - Proceedings - IWBIS 2021
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 6th International Workshop on Big Data and Information Security, IWBIS 2021
Y2 - 23 October 2021 through 26 October 2021
ER -
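
The abstract above summarizes a six-step pipeline: ORB detection and description, brute-force Hamming matching, Lucas-Kanade tracking, linear triangulation, EPnP inside RANSAC for pose estimation, and pose accumulation. Below is a minimal sketch of one such update step using OpenCV in Python. The parameter values, the projection matrices P_left/P_right, the intrinsics K, and the helper name stereo_vo_step are illustrative assumptions and not the authors' implementation.

# Minimal sketch of one feature-based stereo VO update, assuming OpenCV and
# rectified stereo images. All thresholds and counts are illustrative.
import cv2
import numpy as np

orb = cv2.ORB_create(nfeatures=2000)                   # ORB detector + descriptor
bf = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)  # brute-force Hamming matcher

def stereo_vo_step(left_prev, right_prev, left_cur, K, P_left, P_right, R_w, t_w):
    """One VO update: detect, match, track, triangulate, solve PnP, accumulate pose."""
    # 1-2) Detect ORB features in the previous stereo pair and match left <-> right.
    kp_l, des_l = orb.detectAndCompute(left_prev, None)
    kp_r, des_r = orb.detectAndCompute(right_prev, None)
    matches = bf.match(des_l, des_r)
    pts_l = np.float32([kp_l[m.queryIdx].pt for m in matches])
    pts_r = np.float32([kp_r[m.trainIdx].pt for m in matches])

    # 3) Track the left-image features into the current left frame with LK optical flow.
    pts_cur, status, _ = cv2.calcOpticalFlowPyrLK(
        left_prev, left_cur, pts_l.reshape(-1, 1, 2), None)
    ok = status.ravel() == 1
    pts_l, pts_r, pts_cur = pts_l[ok], pts_r[ok], pts_cur.reshape(-1, 2)[ok]

    # 4) Linear triangulation of the matched stereo points (previous frame).
    X_h = cv2.triangulatePoints(P_left, P_right, pts_l.T, pts_r.T)  # 4xN homogeneous
    X = np.ascontiguousarray((X_h[:3] / X_h[3]).T)                  # Nx3 points

    # 5) EPnP inside RANSAC: 3D points (previous frame) vs. their tracked 2D positions.
    _, rvec, tvec, inliers = cv2.solvePnPRansac(
        X, pts_cur, K, None, flags=cv2.SOLVEPNP_EPNP,
        reprojectionError=3.0, iterationsCount=100)
    R, _ = cv2.Rodrigues(rvec)

    # 6) Accumulate the global pose (camera motion is the inverse of the PnP transform).
    t_w = t_w + R_w @ (-R.T @ tvec)
    R_w = R_w @ R.T
    return R_w, t_w

In this sketch, R_w and t_w would be initialized to np.eye(3) and np.zeros((3, 1)) and threaded through successive frames; the PnP solution maps 3D points expressed in the previous camera frame to the current image, so the accumulated pose is updated with the inverse of that transform.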