NTRS - NASA Technical Reports Server

Learning from Automation Surprises and "Going Sour" Accidents: Progress on Human-Centered Automation

Advances in technology and new levels of automation on commercial jet transports have had many effects. There have been positive effects from both an economic and a safety point of view. The technology changes on the flight deck have also had reverberating effects on many other aspects of the aviation system and on different aspects of human performance. Operational experience, research investigations, incidents, and occasionally accidents have shown that new and sometimes surprising problems have arisen as well.

What are these problems with cockpit automation, and what should we learn from them? Do they represent over-automation or human error? Or is there a third possibility: that they represent coordination breakdowns between operators and the automation? Are the problems just a series of small independent glitches revealed by specific accidents or near misses? Do these glitches represent a few small areas where there are cracks to be patched in what is otherwise a record of outstanding designs and systems? Or do these problems provide us with evidence about deeper factors that we need to address if we are to maintain and improve aviation safety in a changing world? How do the reverberations of technology change on the flight deck provide insight into generic issues about developing human-centered technologies and systems (Winograd and Woods, 1997)?

Based on a series of investigations of pilot interaction with cockpit automation (Sarter and Woods, 1992; 1994; 1995; 1997a; 1997b), supplemented by surveys, operational experience, and incident data from other studies (e.g., Degani et al., 1995; Eldredge et al., 1991; Tenney et al., 1995; Wiener, 1989), we too have found that the problems that surround crew interaction with automation are more than a series of individual glitches. These difficulties are symptoms that indicate deeper patterns and phenomena concerning human-machine cooperation and paths towards disaster. In addition, we find the same kinds of patterns behind results from studies of physician interaction with computer-based systems in critical care medicine (e.g., Moll van Charante et al., 1993; Obradovich and Woods, 1996; Cook and Woods, 1996). Many of the results and implications of this kind of research are synthesized and discussed in two comprehensive volumes, Billings (1996) and Woods et al. (1994).

This paper summarizes the pattern that has emerged from our research, related research, incident reports, and accident investigations. It uses this new understanding of why problems arise to point to new investment strategies that can help us deal with the perceived "human error" problem, make automation more of a team player, and maintain and improve safety.
Document ID
19980016965
Acquisition Source
Langley Research Center
Document Type
Contractor Report (CR)
Authors
Woods, David D.
(Ohio State Univ., Columbus, OH, United States)
Sarter, Nadine B.
(Illinois Univ., Urbana-Champaign, IL, United States)
Date Acquired
September 6, 2013
Publication Date
January 19, 1998
Subject Category
Man/System Technology And Life Support
Report/Patent Number
NAS 1.26:207061
NASA/CR-1998-207061
Funding Number(s)
CONTRACT_GRANT: NCC1-209
CONTRACT_GRANT: NCC2-592
Distribution Limits
Public
Copyright
Work of the US Gov. Public Use Permitted.