
Record Details

NeMO-Net: The Neural Multi-Modal Observation and Training Network for Global Coral Reef Assessment
NTRS Full-Text: PDF, 209 KB
Author and Affiliation:
Chirayath, Ved (NASA Ames Research Center, Moffett Field, CA, United States)
Abstract: In the past decade, coral reefs worldwide have experienced unprecedented stresses due to climate change, ocean acidification, and anthropogenic pressures, instigating massive bleaching and die-off of these fragile and diverse ecosystems. Furthermore, remote sensing of these shallow marine habitats is hindered by ocean wave distortion, refraction, and optical attenuation, invariably yielding data products with low resolution and low signal-to-noise ratio (SNR). However, recent advances in UAV and Fluid Lensing technology have allowed us to capture multispectral 3D imagery of these systems at sub-cm scales from above the water surface, giving us an unprecedented view of their growth and decay. Exploiting the fine-scale features of these datasets, machine learning methods such as MAP, PCA, and SVM can not only accurately classify the living cover and morphology of these reef systems (below 8 percent error), but can also map the spectral space between airborne and satellite imagery, augmenting and improving the classification accuracy of previously low-resolution datasets. We are currently implementing NeMO-Net, the first open-source deep convolutional neural network (CNN) and interactive active learning and training software to accurately assess the present and past dynamics of coral reef ecosystems through determination of percent living cover and morphology. NeMO-Net will be built upon the QGIS platform to ingest UAV, airborne, and satellite datasets from various sources and sensor capabilities, and through data fusion determine coral reef ecosystem makeup globally at unprecedented spatial and temporal scales. To achieve this, we will exploit virtual data augmentation, semi-supervised learning, and active learning through a tablet platform that allows users to manually label uncertain or difficult-to-classify datasets. The project will make use of Python's extensive libraries for machine learning, as well as extending integration to GPU and High-End Computing Capability (HECC) on the Pleiades supercomputing cluster, located at NASA Ames. The project is supported by NASA's Earth Science Technology Office (ESTO) Advanced Information Systems Technology (AIST-16) Program.
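
Illustrative sketch (not the NeMO-Net implementation, which is not published in this record): a minimal Keras CNN that classifies image patches of reef imagery into living-cover/morphology classes, with the kind of virtual data augmentation the abstract mentions. The class labels, patch size, band count, and network architecture below are assumptions chosen only to make the example self-contained and runnable.

```python
# Hypothetical patch classifier; all constants are placeholders, not NeMO-Net's actual settings.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 4   # e.g. live coral, dead coral, sand, algae -- assumed label set
PATCH_SIZE = 64   # assumed tile size for sub-cm UAV imagery patches
NUM_BANDS = 3     # RGB here; multispectral inputs would use more bands

def build_patch_classifier():
    """A deliberately small CNN; the real NeMO-Net architecture is not specified here."""
    model = models.Sequential([
        layers.Input(shape=(PATCH_SIZE, PATCH_SIZE, NUM_BANDS)),
        layers.Conv2D(32, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(128, 3, activation="relu", padding="same"),
        layers.GlobalAveragePooling2D(),
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Virtual data augmentation: random flips and rotations applied on the fly.
augment = tf.keras.Sequential([
    layers.RandomFlip("horizontal_and_vertical"),
    layers.RandomRotation(0.25),
])

if __name__ == "__main__":
    # Synthetic stand-in data; real training would use labeled UAV/airborne/satellite patches.
    x = np.random.rand(32, PATCH_SIZE, PATCH_SIZE, NUM_BANDS).astype("float32")
    y = np.random.randint(0, NUM_CLASSES, size=(32,))
    model = build_patch_classifier()
    model.fit(augment(x, training=True), y, epochs=1, batch_size=8)
```

In an active-learning workflow like the one described, patches for which the softmax output is most uncertain would be routed to human annotators (e.g. via the tablet interface) and their labels fed back into training; that loop is omitted here for brevity.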
Publication Date: Dec 11, 2017
Document ID:
20170012136
(Acquired Dec 19, 2017)
Subject Category: EARTH RESOURCES AND REMOTE SENSING
Report/Patent Number: ARC-E-DAA-TN46256
Document Type: Conference Paper
Meeting Information: American Geophysical Union (AGU) 2017 Fall Meeting; 11-15 Dec. 2017; New Orleans, LA; United States
Meeting Sponsor: American Geophysical Union; Washington, DC, United States
Financial Sponsor: NASA Ames Research Center; Moffett Field, CA, United States
Organization Source: NASA Ames Research Center; Moffett Field, CA, United States
Description: 1p; In English
Distribution Limits: Unclassified; Publicly available; Unlimited
Rights: No Copyright; Work of the U.S. Government - Public use permitted
NASA Terms: CLIMATE CHANGE; ECOSYSTEMS; SATELLITE IMAGERY; REMOTE SENSING; OCEAN SURFACE; MULTISENSOR FUSION; CLASSIFICATIONS; EARTH SCIENCES; EDUCATION; MORPHOLOGY; OCEANS
Other Descriptors: NEMO-NET; TRAINING NETWORK; CORAL REEF
Availability Notes: Abstract Only