Adaptive Multi-Sensor Fusion Based Object Tracking for Autonomous Urban Air Mobility Operations

Autonomous operations are a crucial aspect of Urban Air Mobility and other emerging aviation markets. To enable this autonomy, systems must be able to independently build an accurate and detailed understanding of the own-vehicle state as well as the surrounding environment. This includes detecting and avoiding moving objects in the sky, which can be cooperative (aircraft, UAM vehicles, etc.) as well as non-cooperative (smaller drones, birds, ...). This paper focuses on the object tracking component, which relies on adaptive multi-sensor fusion that accounts for the specific properties and limitations of different sensor types. Results show the impact of dropouts of individual sensors on the accuracy of the tracking results for this adaptive sensor fusion approach.
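The abstract mentions fusing sensors with different noise properties and coasting through individual sensor dropouts. A minimal sketch of that general idea (not the paper's implementation) is a linear Kalman filter that sequentially fuses two hypothetical position sensors and simply skips the update step for a sensor that has dropped out, letting the estimate propagate on the motion model; the sensor noise levels and dropout window below are illustrative assumptions:

```python
# Sketch only: 1D constant-velocity Kalman filter fusing two assumed
# sensors with different measurement noise. A dropout is handled by
# skipping that sensor's update, so the track coasts on the model.
import numpy as np

DT = 0.1
F = np.array([[1.0, DT], [0.0, 1.0]])   # constant-velocity state transition
Q = np.diag([1e-4, 1e-3])               # process noise (assumed values)
H = np.array([[1.0, 0.0]])              # both sensors observe position only

def predict(x, P):
    """Propagate state and covariance one time step."""
    return F @ x, F @ P @ F.T + Q

def update(x, P, z, r):
    """Fuse one scalar position measurement z with variance r."""
    S = H @ P @ H.T + r                 # innovation covariance
    K = P @ H.T / S                     # Kalman gain
    x = x + K @ (np.atleast_1d(z) - H @ x)
    P = (np.eye(2) - K @ H) @ P
    return x, P

rng = np.random.default_rng(0)
x_true = np.array([0.0, 1.0])           # true position and velocity
x_est, P = np.zeros(2), np.eye(2)

for k in range(200):
    x_true = F @ x_true
    x_est, P = predict(x_est, P)
    # sensor 1: accurate, but drops out for part of the run (e.g. occlusion)
    if not 80 <= k < 120:
        z1 = x_true[0] + rng.normal(0, 0.05)
        x_est, P = update(x_est, P, z1, 0.05**2)
    # sensor 2: noisier, but always available
    z2 = x_true[0] + rng.normal(0, 0.5)
    x_est, P = update(x_est, P, z2, 0.5**2)

err = abs(x_est[0] - x_true[0])
print(f"final position error: {err:.3f} m")
```

During the dropout window the filter relies on the noisier sensor alone, so the estimate's covariance grows and accuracy degrades, which is the qualitative effect the paper studies for its adaptive fusion approach.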
Document ID
20210025369
Acquisition Source
Ames Research Center
Document Type
Conference Paper
Authors
Thomas Lombaerts (Wyle (United States) El Segundo, California, United States)
Kimberlee H Shish (Ames Research Center Mountain View, California, United States)
Gordon Keller (University of California, Santa Cruz Santa Cruz, California, United States)
Vahram Stepanyan (Wyle (United States) El Segundo, California, United States)
Nicholas Cramer (Ames Research Center Mountain View, California, United States)
Corey Ippolito (Ames Research Center Mountain View, California, United States)
Date Acquired
December 2, 2021
Subject Category
Aircraft Stability And Control
Meeting Information
Meeting: AIAA SciTech Forum
Location: San Diego, CA
Country: US
Start Date: January 3, 2022
End Date: January 7, 2022
Sponsors: American Institute of Aeronautics and Astronautics
Funding Number(s)
CONTRACT_GRANT: 80ARC020D0010
Distribution Limits
Public
Copyright
Public Use Permitted.
Technical Review
Single Expert
Keywords
sensor fusion, object tracking, urban air mobility, autonomous operations