NTRS - NASA Technical Reports Server

Decision Making in a High-Tech World: Automation Bias and Countermeasures

Automated decision aids and decision support systems have become essential tools in many high-tech environments. In aviation, for example, flight management system computers not only fly the aircraft but also calculate fuel-efficient paths, detect and diagnose system malfunctions and abnormalities, and recommend or carry out decisions. Air traffic controllers will soon be using decision support tools to help them predict and detect potential conflicts and to generate clearances. Fields as disparate as nuclear power plant operation and medical diagnostics are becoming similarly automated. Ideally, the combination of a human decision maker and an automated decision aid should result in a high-performing team, maximizing the advantages of additional cognitive and observational power in the decision-making process. In reality, however, the presence of these aids often short-circuits the way that even very experienced decision makers have traditionally handled tasks and made decisions, and it introduces opportunities for new decision heuristics and biases. Recent research on the use of automated aids has indicated the presence of automation bias, that is, errors made when decision makers rely on automated cues as a heuristic replacement for vigilant information seeking and processing. This tendency can produce automation commission errors, in which decision makers inappropriately follow an automated directive, and automation omission errors, in which humans fail to take action or to notice a problem because an automated aid fails to inform them. Evidence of the tendency to make automation-related omission and commission errors has been found in pilot self-reports, in flight-simulation studies with pilots, and in non-flight decision-making contexts with student samples.
Considerable research has found that increasing social accountability can successfully ameliorate a broad array of cognitive biases and resultant errors. Whether these effects generalize to performance situations has not yet been empirically established. The two studies to be presented represent concurrent efforts, with student and professional pilot samples, to determine the effects of accountability pressures on automation bias and on the verification of the accurate functioning of automated aids. Students (Experiment 1) and commercial pilots (Experiment 2) performed simulated flight tasks using automated aids. In both studies, participants who perceived themselves as accountable for their strategies of interaction with the automation were significantly more likely to verify its correctness, and they committed significantly fewer automation-related errors than those who did not report this perception.
Document ID
20020041010
Document Type
Conference Paper
Authors
Mosier, Kathleen L. (San Jose State Univ., CA, United States)
Skitka, Linda J. (NASA Ames Research Center, Moffett Field, CA, United States)
Burdick, Mark R. (Illinois Univ., Chicago, IL, United States)
Heers, Susan T. (Monterey Technologies, Inc., Moffett Field, CA, United States)
Rosekind, Mark R.
Date Acquired
August 20, 2013
Publication Date
January 1, 1996
Subject Category
Administration and Management
Meeting Information
Presentation at Society for Judgment and Decision Making Conference (Chicago, IL)
Funding Number(s)
CONTRACT_GRANT: NCC2-798
CONTRACT_GRANT: NCC2-837
CONTRACT_GRANT: NAS2-832
Distribution Limits
Public
Copyright
Work of the US Gov. Public Use Permitted.