
Posted 23 Jan 1998

PREVENTING FLIGHT CREW ERRORS: PRIMARY DATA MUST DRIVE ANALYSES

 
Ludwig Benner, Jr., P.E.
Ludwig Benner & Associates
12101 Toreador Lane
Oakton, VA 22124
(703) 758-4800

Ira J. Rimson, P.E.
The Validata Corporation
8223 Mosquero NE
Albuquerque NM 87109
(As of 07/01/96)

Abstract

Two recent major transport crashes again focus public and professional interest on human factors. Both occurred within highly safety-sensitive organizations. Why were the relevant human behaviors exhibited in these accidents not identified and changed before the crashes occurred?

The authors draw on research by Professor Diane Vaughan in her book The Challenger Launch Decision, published in February 1996, to commend a new approach to the Office of System Safety's Flight Crew Accident and Incident Human Factors project. Vaughan's work persuasively relates how reliance on the findings of the prior Presidential and Congressional investigations tainted her early views of the Challenger launch decision. Once Vaughan discovered that the earlier investigations were unable to answer basic questions about the launch decision, she reopened and completed the investigation by working with participants in the launch decision process, and was "turned around" by the additional new decision and action data once they were placed in the context of the time.

The authors conclude that OSS should get investigations of mishaps "turned around" by changing its project goals, approaches and methods, and they propose specific actions.

Update

Since the June 1995 FAA Office of System Safety (OSS) Workshop on Flight Crew Accident and Incident Human Factors, two major transport aircraft crashes have focused public and professional attention on human factors. Both the American Airlines Flight 965 crash enroute to Cali, Colombia, and the U.S. Air Force T-43A (USAF B737) crash enroute to Dubrovnik, Croatia, occurred within highly safety-sensitive organizations with reputations for strong commitments to human performance improvement. Both mishaps are currently under investigation by experienced investigative authorities. It is not yet clear whether these investigations can or will produce greater insights than previous investigations, whose deficiencies in human factors data development prompted the OSS's initiatives and this workshop.

About 10 years ago, the National Aeronautics and Space Administration (NASA) was generally recognized as one of the most highly safety-sensitive organizations in the world. Its reputation for leadership in risk assessment and safety decision making was severely undermined by the Challenger shuttle accident in 1986. The Challenger accident was investigated by both a Presidential Commission and a Congressional Committee. Those investigations created a documentary record that became the basis for the historically accepted explanation of this historic event: production pressures and managerial wrongdoing.[1] Challenger is remembered as a technical failure to which the NASA organization contributed. The House Committee's findings blamed individuals, suggesting they were unqualified for their positions, thus indicating the possibility that the decision to launch was the result of management incompetence.[2]

A book published in February 1996 (Vaughan) is relevant and instructive for this OSS Workshop. Vaughan's initial review of the public records from the Challenger investigations led her to believe that the accident resulted from organizational misconduct - an area of specific scholarly interest for her. But as she tried to understand the process that led to the launch decision, she discovered that the data developed by the two investigations were inadequate to answer a fundamental question: why was the decision to launch made? This question, and questions about the connections among the economic strain at NASA, rule violations, and decision making about the Solid Rocket Booster, had not been asked during the investigations. Vaughan discovered that organizational and managerial issues influential on the mental sets of the participants were similarly ignored.

To find out why the launch decision was made, she had to reopen the investigation and complete it herself by conducting additional interviews with the people who participated in the decisions. Using the new data obtained from these original sources, she discovered that her initial impressions and interpretations - and those of the Presidential Commission and House Committee - were wrong. She describes in detail how she reached her conclusions. The discovery that stood the other investigations on their head was:

It was not amorally calculating managers violating rules that were responsible for the tragedy. It was conformity. (Emphasis added; Vaughan, p. 386.)

Does this reversal of the conventional wisdom hold a message for the investigation and human factors communities? You bet it does! If investigations under such illustrious authorities as the President and Congress of the United States can uncritically accept the assumptions and mind-sets which led to their fallacious conclusions and prevention initiatives, surely we should not expect anything better from the common day-to-day investigative efforts of traditional authorities. Three elements of Vaughan's approach turned the investigation around:

  1. She went directly to the original (primary) data sources to find the data needed to finish the investigation, because the prior investigations had not pursued them.

  2. She looked for and identified direct connections, or links, between the actions and decisions of the primary data sources and the ways those actions and decisions were "programmed" by the actions and decisions of managerial, technical, financial, public relations or other persons, both within the parent organization and within influential outside organizations; and

  3. She used primary source data to construct a "decision stream context" to position the decisions and actions within a time continuum, from which the development of the organization's "worldview" could be vividly portrayed (sketched below).
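
To make these three elements concrete, here is a minimal sketch in Python. It is purely illustrative: the Record structure, actors, times and actions are invented placeholders, not Vaughan's data or method, and a real investigation would capture far richer records.

    from dataclasses import dataclass, field

    @dataclass
    class Record:
        """One action or decision captured from a primary (original) source."""
        when: str                    # timestamp of the action or decision
        actor: str                   # who acted or decided
        action: str                  # what was done or decided
        influenced_by: list = field(default_factory=list)  # indices of earlier Records

    # Invented placeholder records -- not data from Vaughan's study.
    stream = [
        Record("1986-01-27 17:00", "Engineer A", "raises cold-temperature concern"),
        Record("1986-01-27 20:00", "Manager B", "asks for data supporting the concern", [0]),
        Record("1986-01-27 23:00", "Manager B", "accepts launch rationale", [1]),
    ]

    # The "decision stream context": every action positioned in a time continuum,
    # with links showing how earlier decisions "programmed" later ones.
    for rec in sorted(stream, key=lambda r: r.when):
        links = "; ".join(f"{stream[i].actor} {stream[i].action}" for i in rec.influenced_by)
        print(f"{rec.when}  {rec.actor} {rec.action}" + (f"  <- {links}" if links else ""))

Even this toy version shows why primary sources matter: the links and timestamps that make the stream interpretable can only come from the participants themselves.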

     

Once these interactions and their contextual worldview were understood, Vaughan was able to identify prevention initiatives. In the NASA/Challenger aftermath these included (1) organizational acceptance of normalized deviance; (2) organizational and managerial cultures of production; and (3) structural secrecy. Vaughan not only identified these problems, she demonstrated how their correction would have prevented the decisions which led to the disastrous outcome.

Lessons Learned

Vaughan's "Lessons Learned" can well be adopted as object lessons for this OSS initiative.

  1. Decision making is a process, which must be examined in the context of a "stream of decisions."[3]

  2. Only original data sources are capable of describing, unfiltered, what they did and why they did it. Original sourcing helps avoid the "retrospective fallacy"[4] that results from investigators being influenced by prior knowledge of untested relevance, such as models, taxonomies, paradigms and other "conventional wisdom", as they structure their investigations and analyses.

  3. Original source data reveal subtle contextual effects of organizational and managerial "worldview" which profoundly influence behavior, yet may be inaccessible to traditional investigative methods. These influences are often generalized throughout an organization, and offer fertile opportunities for preventive initiatives.[5] This conclusion supports previous recommendations that investigative findings must be driven by data obtained from original sources about observed actions and decisions, rather than by trying to shoehorn data into preconceived models, taxonomies and paradigms.

  4. From the data acquired from the participants - the original sources - in this decision, Vaughan was able to draw generalized conclusions that have broad applicability. It is possible to discover and postulate prevention strategies from episodic events if source data about what happened and why it happened are properly developed during the investigation. Data about observed actions and decisions provided by the original sources should drive the findings, rather than data being fitted into predetermined human factors models, taxonomies and categories.[6]

  5. Understanding the culture of the task environment at the time of the incident under investigation, and its influence on participants' decisions and actions, is imperative for full comprehension of "context."

     

Proposed Actions for OSS

We addressed the need for new investigation paradigms in our paper for the 1995 Workshop.[7] We recommend that the Workshop adopt the following specific positions.

  1. Adopt the position that an incident or accident is a process during which people and objects interact to produce undesired states or outcomes. This position will compel investigators to seek reasonably complete data in the context of the stream of actions at the time they occurred, so that they can describe that process adequately.

  2. Focus project efforts on discovering and documenting specific crew member or others' actions that, if changed, would reduce the risks of future incidents. Oblige investigators to acquire value-free data from original sources, by establishing as a desired investigation work product a scientifically verifiable description of what happened during the incident,[8] including related contextual interactions which influenced actions and decisions during the incident.

  3. Separate investigation functions from analysis functions. Investigators should acquire and document the data, chronicle and verify participant actions and decisions, and prepare a comprehensive, valid and timely description of the incident process. Analysts then use the investigators' descriptions (work products) to identify, define and document specific prevention opportunities, problems, issues or needs demonstrated by the incident.

  4. Adopt investigation and analysis approaches likely to ensure that the context of the decision making and actions is adequately identified, documented and reported, favoring approaches which avoid the retrospective fallacy.[9]

  5. Resolve to get prevention ideas to improve human performance from episodic incidents or accidents, rather than waiting for a "sufficient" number of incidents to occur to identify statistically inferred associations, the ethics of which seem dubious in this area, given the cost of additional accidents in human lives and suffering.

  6. Develop a glossary of terms to describe the actors and actions involved in mishaps, so that self-reporters can help prepare useful documentation of their episodic experiences when reporting them to data collectors.[10] A glossary offers the additional benefit of providing consistency for subsequent analyses, and for matching patterns of event sets across incidents[11] (a minimal sketch follows this list).
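
To illustrate what such a glossary buys an analyst, here is a minimal sketch in Python. All vocabulary entries, actors, actions and incident reports below are invented examples, not an actual OSS or NASDAC vocabulary; the point is only that a controlled vocabulary lets differently-worded reports be reduced to comparable event sets and matched across incidents.

    # Invented glossary: free-text variants -> controlled terms.
    GLOSSARY = {
        "co-pilot": "first officer",
        "fo": "first officer",
        "pushed the nose down": "pitch-down input",
        "descended": "pitch-down input",
    }

    def normalize(term: str) -> str:
        """Map a reported term onto the controlled vocabulary, if listed."""
        term = term.lower().strip()
        return GLOSSARY.get(term, term)

    def event_set(report):
        """Reduce a report's (actor, action) pairs to a comparable set."""
        return frozenset((normalize(actor), normalize(action)) for actor, action in report)

    # Two invented reports describing the same events in different words.
    incident_1 = [("FO", "pushed the nose down"), ("Captain", "took control")]
    incident_2 = [("co-pilot", "descended"), ("Captain", "took control")]

    # Matching event sets across incidents: the inconsistent-terms obstacle
    # noted in footnote [11] disappears once reports share one vocabulary.
    print(event_set(incident_1) & event_set(incident_2))

Both invented reports reduce to the same two event pairs, so the shared pattern is detectable even though the raw narratives use different wording.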

     

Summary

The key to reducing human error (erroneous actions or decisions) is to change human behavior (the actions or decisions themselves). That requires identifying the behaviors, actions and decisions to be changed, and their intended replacements. Undesired behavior can be identified only from original source data about specific actions and decisions, properly documented and analyzed to recognize effective prevention strategies.

References

Benner, Ludwig Jr.; 10 MES Investigation Guides. Ludwig Benner & Associates, Oakton, VA, 1979; 2nd Edition, 1994.

Benner, Ludwig Jr. & Ira J. Rimson; "Quality Management for Accident Investigators". forum 24:3, October 1991 (Part 1); 25:1, March 1992 (Part 2). International Society of Air Safety Investigators (ISASI), Sterling, VA.

Federal Aviation Administration, Office of System Safety; Proceedings of Workshop on Flight Crew Accident and Incident Human Factors, June 21-23, 1995.

Hendrick, Kingsley & L. Benner, Jr.; Investigating Accidents with STEP. Marcel Dekker, New York, 1987.

Hendrick, Kingsley, L. Benner & R. Lawton; "A Methodological Approach to the Search for Indirect Human Elements in Accident Investigations." Proceedings of the Fourth International Symposium on Aviation Psychology (R. S. Jensen, Ed.), The Ohio State University, Columbus; April 29, 1987.

Reason, James T.; Human Error. Cambridge University Press, 1990.

Rimson, Ira J.; "Standards for the Conduct of Aircraft Accident Investigations". forum 23:4, February 1991. ISASI, Sterling, VA.

Vaughan, Diane; The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA. University of Chicago Press, Chicago, IL, 1996.

Footnotes

[1] See Vaughan, p. 8: "Unraveling the history of the decision making... in its televised hearing, the Commission laid the groundwork for what became the historically accepted explanation of the Challenger launch decision: production pressure and managerial wrongdoing."

[2] Ibid., p. 11, for differences between the Commission and Committee findings. Neither emphasized the technical difficulties in reaching decisions, so the public could recognize the problems.

[3] Ibid., p. 73, about situating controversial actions in the "stream of actions" in which they occurred. See also pp. 243-247 for discussion of the decision stream context for examining decisions.

[4] Ibid., p. 393: "Retrospection corrects history, altering the past to make it consistent with the present, implying that errors should have been anticipated." Compare to J. Reason's comments about "falling prey to the fundamental attribution error (blaming people and ignoring situational factors)" and "...the retrospective observer should be aware of the beam of hindsight in his own eye." (Reason, p. 216)

[5] Ibid., p. 393: "Understanding organizational failure depends on ...going beyond secondary sources, relying instead on personal expertise based on original sources that reveal the complexity, the culture of the task environment, and the meanings of actions to insiders at the time."

[6] See OSS 1995 Workshop Proceedings, Wise and Wise, A-105, for discussion of a priori questions in the context of conducting the investigation of decision making.

[7] See OSS 1995 Workshop Proceedings, Benner & Rimson, A-21-22. Vaughan's work shows how they might be implemented.

[8] See Benner & Rimson 1992 for an example of non-statistical validation of a description of what happened, including decisions and actions by decision makers.

[9] See ethnographic research methods (Vaughan), STEP (Hendrick and Benner), events overlays (Hendrick, Benner and Lawton), or multilinear events sequencing methods (Benner 1994). The last provides for "progressive" testing of data as they are acquired during an investigation, to determine what data are still needed, and the sufficiency of the explanation of what happened.

[10] See OSS 1995 Workshop Proceedings, p. 7, reporting the need for narrative descriptions of events and access with a narrative search capability.

[11] A personal communication with NASDAC staff disclosed that an obstacle to data integration by NASDAC has been inconsistency of terms across reports processed.