NTRS - NASA Technical Reports Server
IRIS: High-fidelity Perception Sensor Modeling for Closed-Loop Planetary Simulations

Perception plays a key role in autonomous and semi-autonomous planetary exploration vehicles. For instance, landers can use computer vision techniques to identify safe landing locations, aerial vehicles use cameras as navigation sensors, and planetary rovers use them for localization and hazard detection. Engineering simulation of such systems requires accurate modeling of perception and vision sensors for simulating autonomy scenarios. In addition, modeling sensors for landers, aerial vehicles, and ground vehicles requires the ability to handle large, high-resolution terrains; accurate modeling of illumination; high-fidelity rendering via ray/path tracing; and the inclusion of sensor characteristics. Vision sensor models strive to simulate sensor reality by using physics principles to model the interaction of light with objects. Furthermore, high frame-rate performance is highly desirable for in-the-loop simulations involving vehicle dynamics and control software. In this paper we describe a new sensor modeling capability called Inter-planetary Rendering for Imaging and Sensors (IRIS) that meets these requirements for the real-time, high-fidelity simulation of vision sensors for planetary aerospace and robotics applications.
Document ID
20230005732
Acquisition Source
Jet Propulsion Laboratory
Document Type
Preprint (Draft being sent to journal)
Authors
Elmquist, Asher
Young, Aaron
Jain, Abhinandan
Gaut, Aaron
Aiazzi, Carolina
Date Acquired
January 3, 2022
Publication Date
January 3, 2022
Publication Information
Publisher: Pasadena, CA: Jet Propulsion Laboratory, National Aeronautics and Space Administration, 2022
Distribution Limits
Public
Copyright
Other