
BOOK REVIEW


Reviewed by Ludwig Benner, Jr., WO2202

  Posted 1 April 97

The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA, by Diane Vaughan. University of Chicago Press, Chicago, IL, 1996. ISBN 0-226-85275-3.


Published in ISASI Forum 29:2, June 1996
Copyright 1996 by the International Society of Air Safety Investigators
Reproduced by permission of ISASI
Permission to reprint material in this work, without fee, is hereby granted contingent on giving full and appropriate credit to the author and to the International Society of Air Safety Investigators on any reprints. For permission to reproduce and disseminate the material in other forms, please contact ISASI at its Sterling, VA USA office:

ISASI
Technology Trading Park
Five Export Drive
Sterling, VA 20164-4421 USA

Telephone: (703) 430-9668 FAX: (703) 450-1745


Two recent air transport crashes have again focused public and professional scrutiny on human factors in aviation. Both occurred within organizations with reputations for high safety sensitivity. Why weren't the relevant human behaviors modified before the accidents happened? That question has been asked continually for decades, within and outside the aviation community. Professor Vaughan's book exposes investigation failures, using a famous accident to provide new insights into the changes in investigation methodology needed to address it.

Vaughan's vehicle for finding the investigation failures is the National Aeronautics and Space Administration's Challenger shuttle launch accident of January 28, 1986. Ten years ago NASA was recognized as one of the most safety-sensitive organizations in the world. Its reputation for leadership in risk assessment and operational safety judgment was severely damaged by the Challenger launch accident. NASA itself, a special U.S. House of Representatives subcommittee, and a Special Commission appointed by the President of the United States each investigated the accident, creating a massive documentary record from which emerged the generally accepted explanation of the accident: it was caused by a combination of production pressures and managerial wrongdoing. Challenger was remembered as a technical failure to which the management of both NASA and its contractor, Morton Thiokol, contributed. The House committee's findings blamed individuals, suggesting that they were unqualified for their positions and authority, and implying strongly that the launch decision arose out of managerial incompetence.

Vaughan initially approached the Challenger accident as a promising example of organizational misconduct, an area of her scholarly expertise. As she tried to understand why NASA managers, who had all the information prior to the launch and were specifically warned against it, made the launch decision anyway, she discovered that the investigative records could not answer the question: it had not been asked during the prior investigations. Neither had questions about NASA's economic constraints, accepted deviations from internal technical rules, the checkered decision making surrounding the controversial Solid Rocket Booster, and other organizational and management issues that contributed to the fateful launch decision.

Vaughan therefore had to reopen the investigations and obtain new data directly from the primary sources to understand the launch decision. She conducted interviews with the individuals who participated in the original "stream of decisions". By situating their actions and decisions in their chronological and cultural context, Vaughan identifies "...an incremental descent into poor judgment" that refutes the traditional blame assigned to managerial wrongdoing. Her startling conclusion:


It was not amorally calculating managers violating rules that were responsible for the tragedy. It was conformity. (p. 386, emphasis added)

Vaughan's description of the relationships among management and organizational decisions and actions, their cultural context, and their disastrous consequences has special significance for investigators. She demonstrates specifically how earlier investigators' failure to "ask the right questions" and their conventional wisdom led to erroneous interpretations of their data; how attempts to use Fault Tree methods led to perceptions of cover-ups; and how misunderstandings of organizational language and culture led investigators to allege "violations". The "Politics of Blame" turned investigators' attention, and the media spotlight, to operator error.

Vaughan's research provides valuable insights into decision-making mechanisms that can help investigators looking into organizational and managerial influences on accident processes - if they want to use them. It was NASA managers' uncritical acceptance of deviance from established expectations and limits, and their ultimate normalization of those deviations, that led inexorably to approval of the launch. Ironically, the limits themselves turned out to lack scientific basis: they were extracted from irrelevant applications data and never tested. As one NASA decision maker later testified, "I was referencing a nonexistent data base."

Vaughan's insistence on searching out primary data sources helped her avoid the typical investigator's Retrospective Fallacy, wherein "retrospection corrects history, altering the past to make it consistent with the present, implying that errors should have been anticipated." It is easy to judge actions as deviant after the outcome is known, even when the participants considered them normal at the time. (Vaughan's approach also deals effectively with Reason's caution in Human Error that "...the retrospective observer should be aware of the beam of hindsight bias in his own [eye].")

Among the sobering findings of Vaughan's re-investigation is the discovery that many aspects of the behavior of the Solid Rocket Booster (SRB) joint were either not known, or not recognized, prior to the Challenger launch. When confronted with this apparent anomaly in supposedly scientific decision making, one key manager responded, "...it is difficult for me to answer this way, but I was not smart enough to know it beforehand." Investigation of the SRB system, its operation, and its behavior during launch was crucial to Vaughan's findings.

What are the lessons for investigators? The following insights are particularly significant for improving the efficacy of investigative results in a quest for preventive initiatives:

  1. Go to the primary sources involved in decisions to find out what happened, and what institutional forces were expressed and taken into account.
  2. Look for and identify direct connections, or links, among actions and decisions by the primary sources, and determine how those actions and decisions were shaped by operational, technical, financial, and other managerial influences - both from within the parent organization and from influential outside organizations. Then construct "decision stream" contexts within which to position decisions and actions chronologically during the accident process, and by which the development of the parent organization's "worldview" may be vividly portrayed.
  3. Explore incomplete or erroneous system definition, knowledge of system operation, and knowledge of system performance for their role in decisions made during accidents.
  4. Make sure you select a methodology that will enable you to discover and ask the right questions of those who influenced the decisions.


Vaughan's comments on the subject of Causation are also noteworthy. They ought to inspire new thinking in the air safety investigation community:

All causal explanations have implications for control. The benefit of explanations that locate the immediate cause of organizational failure in individual decision makers is that quick remedies are possible. Responsible individuals can be fired, transferred or retired. New rules that regulate decision making can be instituted. Having made these changes, the slate is clean. The myth of managerial wrongdoing made the strategy for control straightforward: fix the technology and change the managerial cast of characters, implement decision controls, and proceed.... (p. 392)


Recommendation: MUST READING for all investigators claiming or desiring competence in investigating behavior and decision making in accidents.


