Comparison of Visual and LiDAR SLAM Algorithms using NASA Flight Test Data

Simultaneous Localization and Mapping (SLAM) is a promising technique that provides localization information and precise mapping of the physical environment without requiring much prior knowledge of the surroundings. SLAM may play a vital role in aeronautics and aerospace, where vehicles and aircraft must operate in complex environments in which traditional localization services may be degraded or unavailable. This paper compares several off-the-shelf 3D SLAM algorithms based on vision and LiDAR, namely ORB-SLAM, ORB-SLAM2, LOAM, A-LOAM, and F-LOAM, on NASA UAS (Unmanned Aircraft System) flight test data. The NASA ARC UAS flight test demonstrates preliminary SLAM algorithm results, which serve as a stepping stone toward simulated AAM (Advanced Air Mobility) concepts. Conducting an AFRC UAS flight test of a simulated AAM approach and landing with SLAM algorithms provides an Alternative Precision Navigation and Timing solution based on landmarks and fiducials distributed in the landing zone. The algorithms use the flight telemetry data as ground truth for a baseline comparison. The performance comparison criteria include robustness, accuracy, re-localization, response to environmental changes, and real-time effectiveness; these criteria are currently assessed qualitatively, with quantitative evaluation planned for future work.
Document ID
20220018459
Acquisition Source
Ames Research Center
Document Type
Presentation
Authors
Keerthana Kannan (Wyle (United States) El Segundo, California, United States)
Anjan Chakrabarty (Wyle (United States) El Segundo, California, United States)