NTRS - NASA Technical Reports Server
PreSound: UAV Diagnostic System Enabled by Vibration-Based Machine Learning

A low-weight, inexpensive small unmanned aerial system (sUAS) that takes off, performs a mission, lands, and safely stows and recharges itself has myriad future applications, ranging from agricultural imaging to last-mile package delivery. Likewise, Urban Air Mobility (UAM) systems will enable people to take air taxis from point to point in cities, rapidly moving commuters long distances without concern for road traffic and congestion. Fully electric aviation systems will be cleaner and quieter than ground transport. Cities could eliminate cars and buses and convert roads to higher-capacity bike and pedestrian throughways. Yet for sUAS as well as UAM, system reliability and assurance are limiting factors in deploying affordable autonomous flight systems. For this bright future of aviation to be realized, aircraft must be able to autonomously and accurately self-diagnose health issues both before takeoff and during flight. The GreenSight PreSound system is designed to identify defects on aircraft through intelligent analysis of vibration. It accomplishes this by measuring structural vibrations induced by the vehicle's own propellers and analyzing that data with a machine learning model that determines whether a defect is present.

The PreSound system is designed to require no human oversight and to operate across a wide array of vehicles through re-training of the model for each target aircraft. PreSound has been developed and has seen early success in limited testing using data collected from the GreenSight Dreamer sUAS, a 5 lb quadrotor vehicle designed for aerial imaging applications. The final detection model, trained on data with props spinning at 50% throttle, achieves excellent performance with over 99% average accuracy in detecting blade damage using a single FFT vector input. It demonstrates the ability to generalize to new types of blade damage, correctly classifying a different type of blade damage with 98% accuracy. Full test pulses were classified with 100% accuracy, and in live testing, all sets of data during blade movement were classified accurately with over 95% confidence. When trained on in-flight data, the same model achieves an average accuracy of 85% in distinguishing between undamaged and blade-damaged states in flight. The authors believe these accuracies demonstrate the potential of this approach to improve unmanned flight safety, with significant benefits in accelerating Advanced Air Mobility (AAM) and UAM aviation applications.
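The report's abstract describes the pipeline only at a high level (propeller-induced vibration, converted to an FFT feature vector, fed to a machine learning classifier that flags blade damage). As a rough, hedged illustration of that idea, the sketch below builds an FFT magnitude vector from a window of vibration samples and trains a small classifier on synthetic data. The sample rate, window length, synthetic signals, and the dense-network classifier are all assumptions for demonstration, not GreenSight's published design; the report's keywords suggest convolutional neural networks would be used in practice.

```python
# Illustrative sketch only: parameters and model choice are assumptions,
# not the PreSound implementation described in the report.
import numpy as np
from sklearn.neural_network import MLPClassifier

SAMPLE_RATE_HZ = 1000      # assumed accelerometer sample rate
WINDOW_SAMPLES = 1024      # assumed analysis window length

def fft_feature_vector(accel_window: np.ndarray) -> np.ndarray:
    """Convert one window of vibration samples into a single FFT magnitude vector."""
    windowed = accel_window * np.hanning(len(accel_window))
    spectrum = np.abs(np.fft.rfft(windowed))
    return spectrum / (np.linalg.norm(spectrum) + 1e-12)  # normalize overall scale

# Synthetic stand-in data: two vibration classes (undamaged vs. blade-damaged),
# modeled here as a propeller tone with an extra fault component plus noise.
rng = np.random.default_rng(0)
t = np.arange(WINDOW_SAMPLES) / SAMPLE_RATE_HZ

def synth_window(damaged: bool) -> np.ndarray:
    base = np.sin(2 * np.pi * 120.0 * t)                        # prop fundamental
    fault = 0.5 * np.sin(2 * np.pi * 37.0 * t) if damaged else 0.0
    return base + fault + 0.1 * rng.standard_normal(WINDOW_SAMPLES)

X = np.array([fft_feature_vector(synth_window(d))
              for d in ([False] * 200 + [True] * 200)])
y = np.array([0] * 200 + [1] * 200)

# Small dense network as a placeholder binary damage classifier.
clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0).fit(X, y)
print("training accuracy:", clf.score(X, y))
```

Re-training the same pipeline for a different target aircraft, as the abstract describes, would amount to collecting new vibration windows from that vehicle and refitting the classifier on the resulting FFT vectors.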
Document ID
20240005578
Acquisition Source
Langley Research Center
Document Type
Contractor Report (CR)
Authors
Kennan Arlen
(GreenSight)
Gwenda Law
(GreenSight)
Andrew DeLollis
(GreenSight)
J. Gregory McDaniel
(Boston University)
Sheryl Grace
(Boston University)
Date Acquired
May 3, 2024
Publication Date
May 1, 2024
Subject Category
Aeronautics (General)
Funding Number(s)
WBS: 629660.04.81.07.30
Distribution Limits
Public
Copyright
Portions of document may include copyright protected material.
Technical Review
NASA Technical Management
Keywords
UAS
Machine Learning
Convolutional Neural Networks
Unmanned Aerial Vehicles