Multi-camera visual SLAM for autonomous navigation of micro aerial vehicles
Topics:
Visual Odometry; Odometry
DOI:
10.1016/j.robot.2017.03.018
Publication Date:
2017-04-12
ABSTRACT
In this paper, we present a visual simultaneous localization and mapping (SLAM) system that integrates measurements from multiple cameras to achieve robust pose tracking for autonomous navigation of micro aerial vehicles (MAVs) in unknown complex environments. We analyze the iterative optimizations used for pose tracking and map refinement in visual SLAM with multiple cameras; the analysis ensures the soundness and accuracy of each optimization update. In the final implementation, a well-known monocular visual SLAM system is extended to use two cameras with non-overlapping fields of view (FOVs). The resulting visual SLAM system enables autonomous navigation of an MAV in complex scenarios. The underlying theory extends readily to configurations with more cameras when the onboard computational capability allows it. For operation in large-scale environments, we modify the resulting visual SLAM system into a constant-time, robust visual odometry. To form a full visual SLAM system, we further implement an efficient back-end for loop closing. The back-end maintains a keyframe-based global map, which is also used for loop-closure detection. An adaptive-window pose-graph optimization method is proposed to refine the keyframe poses of the global map and thus correct the pose drift that is inherent in visual odometry. We demonstrate the efficiency of the proposed visual SLAM system onboard MAVs in experiments with both autonomous and manual flights. The pose-tracking results are compared with ground-truth data provided by an external tracking system.

HIGHLIGHTS
A SLAM system integrating measurements from multiple cameras for MAVs is proposed.
No overlap in the respective fields of view of the multiple cameras is required.
Robust pose tracking can be achieved in complex environments.
A mathematical analysis of the iterative optimizations in visual SLAM is provided.
The efficiency of the proposed visual SLAM system is demonstrated onboard MAVs.
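The abstract refers to iterative optimizations for multi-camera pose tracking and to an adaptive-window pose-graph optimization in the back-end. As a hedged illustration only (the notation below is chosen here and is not taken from the paper), the multi-camera pose-tracking step can be sketched as a joint reprojection-error minimization over all cameras rigidly mounted on the MAV body, with no requirement that their fields of view overlap:

\[
\hat{T}_{WB} \;=\; \arg\min_{T_{WB}} \sum_{c=1}^{C} \sum_{j \in \mathcal{M}_c} \rho\!\left( \left\| \mathbf{u}_{c,j} - \pi_c\!\left( T_{BC_c}^{-1}\, T_{WB}^{-1}\, \mathbf{p}_j^{W} \right) \right\|^{2} \right)
\]

where \(T_{WB}\) is the MAV body pose in the world frame, \(T_{BC_c}\) the fixed extrinsic calibration of camera \(c\) with respect to the body, \(\pi_c\) its projection model, \(\mathbf{p}_j^{W}\) a mapped 3-D point observed by camera \(c\) at pixel \(\mathbf{u}_{c,j}\), and \(\rho\) a robust cost function. Under the same assumed notation, the adaptive-window pose-graph back-end can be viewed as refining the keyframe poses \(T_i\) inside a selected window by minimizing relative-pose residuals

\[
\sum_{(i,k)\in\mathcal{E}} \left\| \log\!\left( \Delta T_{ik}^{-1}\, T_i^{-1} T_k \right)^{\vee} \right\|_{\Sigma_{ik}}^{2},
\]

where \(\Delta T_{ik}\) is a measured relative transform from visual odometry or a loop closure and \(\Sigma_{ik}\) its covariance; the exact cost functions, window-selection rule, and robust kernels used in the paper may differ.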