NTRS - NASA Technical Reports Server
Visual End-Effector Position Error Compensation for Planetary Robotics

This paper describes a vision-guided manipulation algorithm that improves arm end-effector positioning to subpixel accuracy while meeting the highly restrictive imaging and computational constraints of a planetary robotic flight system. Analytical, simulation-based, and experimental analyses of the algorithm's effectiveness and sensitivity to camera and arm model error are presented, along with results on several prototype research systems and "ground-in-the-loop" technology experiments on the Mars Exploration Rover (MER) vehicles. A computationally efficient and robust subpixel end-effector fiducial detector, instrumental to the algorithm's ability to achieve high accuracy, is also described along with its validation results on MER data.
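The abstract's subpixel fiducial detection can be illustrated with a generic technique; the sketch below is not the paper's actual detector, but a minimal intensity-weighted-centroid localizer of the kind commonly used to recover a bright fiducial's image position to a fraction of a pixel. The function name, threshold parameter, and synthetic test image are all illustrative assumptions.

```python
import numpy as np

def subpixel_centroid(image, threshold=0.5):
    """Estimate a bright fiducial's image center to subpixel accuracy.

    Pixels above `threshold` are treated as fiducial pixels, and their
    intensity-weighted centroid gives a continuous (row, col) estimate
    rather than a whole-pixel one.  This is a generic illustration, not
    the detector described in the paper.
    """
    img = np.asarray(image, dtype=float)
    weights = np.where(img > threshold, img, 0.0)  # keep only bright pixels
    total = weights.sum()
    if total == 0.0:
        raise ValueError("no pixels above threshold")
    rows, cols = np.indices(img.shape)
    cy = (rows * weights).sum() / total  # subpixel row coordinate
    cx = (cols * weights).sum() / total  # subpixel column coordinate
    return cy, cx

# Synthetic fiducial: a Gaussian spot centered off the pixel grid.
yy, xx = np.indices((24, 24))
spot = np.exp(-((yy - 12.3) ** 2 + (xx - 7.6) ** 2) / (2 * 1.5 ** 2))
cy, cx = subpixel_centroid(spot, threshold=0.1)
```

On this synthetic spot the recovered `(cy, cx)` lands well under a pixel from the true center `(12.3, 7.6)`, which is the essential property a detector needs before its output can drive end-effector position error compensation.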
Document ID
20080022183
Acquisition Source
Jet Propulsion Laboratory
Document Type
Reprint (Version printed in journal)
Authors
Bajracharya, Max
(Jet Propulsion Lab., California Inst. of Tech., Pasadena, CA, United States)
DiCicco, Matthew
(Jet Propulsion Lab., California Inst. of Tech., Pasadena, CA, United States)
Backes, Paul
(Jet Propulsion Lab., California Inst. of Tech., Pasadena, CA, United States)
Nickels, Kevin
(Jet Propulsion Lab., California Inst. of Tech., Pasadena, CA, United States)
Date Acquired
August 24, 2013
Publication Date
January 29, 2007
Publication Information
Publication: Journal of Field Robotics
Volume: 24
Issue: 5
Subject Category
Cybernetics, Artificial Intelligence And Robotics
Distribution Limits
Public
Copyright
Other
Keywords
vision-guided manipulation algorithm
vision-guided manipulation

Available Downloads

There are no available downloads for this record.