Real-Time Visual Odometry Covariance Estimation for Unmanned Air Vehicle Navigation

Authors

Anderson, Michael
Brink, Kevin
Willis, Andrew

Issue Date

2019-03-04

Type

Article

Language

en_US

Abstract

Demand is growing for unmanned air vehicles (UAVs) with greater autonomy, including the ability to navigate without GPS information, such as indoors. In this work, a novel visual odometry algorithm is developed and flight tested. It uses sequential pairs of red, green, blue, depth (RGBD) camera images to estimate the UAV’s change in position (delta pose), which can be used to aid a navigation filter. Unlike existing related techniques, it uses a novel perturbation approach to estimate the uncertainty of the odometry measurement dynamically in real time, a technique that is applicable to a wide range of sensor preprocessing tasks aimed at generating navigation-relevant measurements. Real-time estimates of the delta pose and its covariance allow these estimates to be efficiently fused with other sensors in a navigation filter. Indoor flight testing was performed with motion capture, which demonstrated that the odometry and covariance estimates are accurate when appropriately scaled. Flights also demonstrated the algorithm used in a navigation filter to improve a velocity estimate, which represents a significant improvement over the state of the art for RGBD odometry.
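The perturbation idea in the abstract can be illustrated with a minimal sketch: re-run the odometry estimator on noise-perturbed copies of its input and take the sample covariance of the resulting delta-pose estimates. This is a hypothetical, simplified illustration (a centroid-based translation estimator stands in for a full RGBD odometry pipeline), not the authors' implementation:

```python
import numpy as np

def estimate_delta_pose(src, dst):
    """Toy odometry: translation aligning two point clouds (centroid
    difference). Stands in for a full RGBD delta-pose solver."""
    return dst.mean(axis=0) - src.mean(axis=0)

def perturbation_covariance(src, dst, noise_sigma=0.01, n_samples=100, rng=None):
    """Estimate measurement covariance by re-running the estimator on
    noise-perturbed inputs and taking the sample covariance."""
    rng = np.random.default_rng(rng)
    nominal = estimate_delta_pose(src, dst)
    samples = []
    for _ in range(n_samples):
        src_p = src + rng.normal(0.0, noise_sigma, src.shape)
        dst_p = dst + rng.normal(0.0, noise_sigma, dst.shape)
        samples.append(estimate_delta_pose(src_p, dst_p))
    # Rows are samples, columns are the three translation components.
    return nominal, np.cov(np.array(samples), rowvar=False)

# Usage: two clouds offset by a known translation.
rng = np.random.default_rng(0)
src = rng.uniform(-1.0, 1.0, (500, 3))
dst = src + np.array([0.1, 0.0, 0.05])
delta, cov = perturbation_covariance(src, dst, noise_sigma=0.02)
```

The returned 3x3 covariance can be fed to a navigation filter alongside the delta-pose measurement, which is the fusion role the abstract describes.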

Description

Received 13 August 2018; accepted 19 December 2018; published online 4 March 2019.

Citation

TY  - JOUR
T1  - Real-Time Visual Odometry Covariance Estimation for Unmanned Air Vehicle Navigation
AU  - Anderson, Michael L.
AU  - Brink, Kevin M.
AU  - Willis, Andrew R.
Y1  - 2019/03/04
PY  - 2019
DA  - 2019/06/01
DO  - 10.2514/1.G004000
T2  - Journal of Guidance, Control, and Dynamics
JF  - Journal of Guidance, Control, and Dynamics
SP  - 1272
EP  - 1288
VL  - 42
IS  - 6
PB  - American Institute of Aeronautics and Astronautics
SN  - 0731-5090
AB  - Demand is growing for unmanned air vehicles (UAVs) with greater autonomy, including the ability to navigate without GPS information, such as indoors. In this work, a novel visual odometry algorithm is developed and flight tested. It uses sequential pairs of red, green, blue, depth (RGBD) camera images to estimate the UAV's change in position (delta pose), which can be used to aid a navigation filter. Unlike existing related techniques, it uses a novel perturbation approach to estimate the uncertainty of the odometry measurement dynamically in real time, a technique that is applicable to a wide range of sensor preprocessing tasks aimed at generating navigation-relevant measurements. Real-time estimates of the delta pose and its covariance allow these estimates to be efficiently fused with other sensors in a navigation filter. Indoor flight testing was performed with motion capture, which demonstrated that the odometry and covariance estimates are accurate when appropriately scaled. Flights also demonstrated the algorithm used in a navigation filter to improve a velocity estimate, which represents a significant improvement over the state of the art for RGBD odometry.
UR  - https://doi.org/10.2514/1.G004000
Y2  - 2024/12/06
ER  -

Publisher

American Institute of Aeronautics and Astronautics

Journal

Journal of Guidance, Control, and Dynamics

Volume

42

Issue

6

DOI

10.2514/1.G004000

ISSN

0731-5090