The Investigation Process Research Resource Site
A pro bono site with hundreds of resources for investigators
Last updated 8/8/09

INVESTIGATING INVESTIGATIONS

To advance the state of the art of investigations through investigation process research.



Launched Aug 26 1996.

   
INVESTIGATION RECOMMENDATION ISSUES

This section contains papers and comments, old and recent, by experienced investigators, investigation managers, and other knowledgeable individuals about safety recommendations flowing from accident and incident investigations.

Downloading, reproduction, and dissemination of the contents of this section are encouraged, provided the authors are duly credited in any reproduced material.

Contributions are invited. Contact the Administrator by e-mail at iprr.org; enter "Recommendations" in the subject line so the message isn't treated as spam. Better yet, use the AIPRE forum for comments.

Papers for discussion

  1. Frank Taylor: "Airworthiness requirements: accident investigation & safety recommendations" (1998)
  2. Ludwig Benner: Ranking Safety Recommendation Effectiveness (1992)

I'll add more papers with recommendation system critiques as I get them.

Discussions of recommendation-related topics posted 10 years ago by Stewart, Rimson, me, and others are left on this page for viewing at the links below.
  • Guidance for comments to this section

  • Junk Science and recommendations

  • Investigation Objectives vs Recommendations

  • Investigation Utility

  • Root Cause Analysis and intervention

  • Investigation-based safety management research recommendations

    This section is updated as comments are provided.

    Junk science influences on Recommendations.

    Posted 16 Feb. 1998
    Saw an interesting comment about investigations of silicone breast implants that resonated because of problems I have experienced with other kinds of investigations. The whole study was once published at http://x2.dejanews.com/getdoc.xp?AN=323646778.1&CONTEXT=887575039.2105213910&hitnum=7; the title is
    How to rig studies to get the result you want (Ed. note: no longer accessible 3/24/06)

    As you read the extract, consider "accident" researchers who study accident investigation reports, and substitute the interpreters involved in the "investigation" and recommendation development processes for the medical interpreters.

    "Every study published so far to defend the "safety of Breast Implants" is a retroactive study of the records, not based on data collected by the researchers. Have you ever played a parlor game called "telephone?" The game involves lining up several players and whispering a statement in the first player's ear, which they in turn whisper into the next player's ear. The last person then tells what he thinks he was told. The results can be hilarious because the message is warped by each player's interpretation.

    This is the method used by every study quoted. The "Mayo" study is
    a researcher's interpretation of
    a nurse's interpretation of
    a transcriptionist's interpretation of
    a medical record's interpretation of
    a medical transcriptionist's interpretation of
    a doctor's interpretation of
    a patient's interpretation of their symptoms."

    Hmmmmm..... By the time one adds the ever-increasing ambiguity, abstraction, and conjecture introduced at each level of interpretation to compensate for problems with the data all investigators work with, it is easy to understand why researchers are making such slow progress in reducing mishap risks.

    Review Diane Vaughan's inquiry into the Challenger launch decisions and how use of original data upset prior findings if you want more evidence of this kind of junk science.
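    The "telephone" chain quoted above can be illustrated with a toy simulation. This is a sketch only: the retention and distortion probabilities are invented for illustration, not taken from any study. Each interpretation layer drops some facts and warps others, so the fraction of intact facts decays multiplicatively with the number of interpreters.

```python
import random

def interpret(facts, retain_p=0.9, distort_p=0.05, rng=random):
    """One interpretation layer: each fact survives with probability
    retain_p; a surviving fact is garbled with probability distort_p."""
    out = []
    for fact in facts:
        if rng.random() > retain_p:
            continue                        # fact dropped at this layer
        if rng.random() < distort_p:
            fact = fact + " (distorted)"    # fact carried on, but warped
        out.append(fact)
    return out

def fidelity_after(layers, n_facts=100, seed=1):
    """Fraction of the original facts still present and undistorted
    after the given number of interpretation layers."""
    rng = random.Random(seed)
    facts = [f"fact-{i}" for i in range(n_facts)]
    for _ in range(layers):
        facts = interpret(facts, rng=rng)
    intact = sum(1 for f in facts if "(distorted)" not in f)
    return intact / n_facts

# Seven layers, mirroring the Mayo-study chain quoted above.
# On average roughly 0.9**7 * 0.95**7, i.e. about a third of the
# original facts survive intact.
print(fidelity_after(7))
```

    The exact numbers are arbitrary; the point is the compounding. Even modest per-layer losses leave a seventh-hand account that preserves only a minority of the original observations.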


    Topic: Investigation-based safety management research (posted 11/9/96; project completed; see report).

    Go to Project listing for research project description.

    Comment :

    Your work has the potential to lead to advances in the state of the art by demonstrating results that could be achieved with alternative thinking and methods. Given your time constraints, however, that may be difficult.

    I would be pleased to post your inquiry on the web site and highlight it in the What's New section. Do I have your permission to do so?

    Some results of prior research may be of interest:

    1. You may wish to view the draft commentary on the Root Cause Analysis fad by clicking on 5 in the row of numbers at the end of the Feedback Reports, section 4 of the IRR. The major problems with RCA are subjective judgments, which impair quality control and replicability of findings, and ambiguity about which behaviors must be changed and what better behaviors would be.

    2. Vaughan's book, referenced in Section 6 under Books, is must reading if you are going to address management shortcomings. It is imperative that you use original sources for your data about who did what and its causal relationships to what happened, and avoid the retrospective fallacy that she describes so well.

    3. The DoE program referenced in Section 6 under Internet sites describes the conceptual framework of a management learning system that may be of particular value. L Benner 11/5/96

    Comment :

    Your inquiry from, and answer to, Ms Priestley are noted with some bemusement. Why is it that "Safety Management" practitioners insist on attempting to administer cures without first diagnosing the illness? Ms Priestley and others seem determined to make baby soup with the bath water, without first determining how the poor tyke got into the pot in the first place. I J Rimson

    Response

    Mr Rimson asks, "why is it that 'safety management' practitioners insist on attempting to administer cures without first diagnosing the illness?"

    With respect to his comment, I think he has misunderstood our intentions, and perhaps I should make myself clearer.

    The introduction of safety management systems is usually undertaken with the intention of prevention rather than cure. However, we do not live in a perfect world and some incidents will occur. The requirement for accident investigation and learning from past mistakes is therefore a fundamental part of any safety management system. I do not see a conflict between the two areas as Mr Rimson seems to. If in the past accident investigation has not delved deep enough to identify the problems that need curing then this is a different matter.

    The problem to address is not safety management per se, but rather to develop accident investigation techniques that are accessible and practical for personnel within industry to use. To quote Mr John Kingston-Howlett from his EuroMORT site: 'most organisations fail to learn through their accident and near miss experience and an important reason for this is a lack of systematic investigation methods and post investigation change process'.

    We are seeking to 'diagnose the illness' by developing an RCA tool that will structure accident investigations to probe beyond the typical accident classifications to identify the 'latent failures' within the organisation.

    By investigating near miss and property damage events we will also be trying to identify problems in the system before they lead to harm. To this extent we may be attempting to administer cures before the symptoms manifest themselves too seriously.

    If we can develop a methodology which can be adopted by safety personnel within organisations that encourages them to systematically examine the organisational issues surrounding an incident, then I feel that we will have made some progress in encouraging industry to seek more detailed information upon which to make their diagnosis.

    Yes, we are seeking to map root causes to a safety management model, the purpose of which is twofold: firstly, to provide a first port of call for advice and guidance on how to go about addressing the issues uncovered, and secondly, for the particular purpose of our study, which is examining where the most cost-effective interventions may be made.

    The method should provide a framework for identifying causal relationships and a framework for seeking guidance to put issues right; that is not to say that it will constrain people. K Priestley (4 Nov 96)


     

    Topic: Investigation Utility

    Comments: Charles Hoes, former president of the System Safety Society. (posted 12 Oct 96)

    I went back to it this morning and read some of the materials (the first time I just skimmed through it to see what topics were included). I still haven't taken the time to ponder the issues raised, but it is much more thought provoking than I had anticipated.

    I have done very little "investigation" work during my career (with the exception that I periodically get involved in expert witness cases where I have to make up my mind about what happened, and in a few instances where accidents have happened on projects that I was involved with). Upon reflection, I have to admit that it didn't seem to be related to my profession as a system safety engineer. I obviously understand that the results of accident investigations have the potential for providing "lessons learned" information that I could then use in my work. While this is an obvious value of the investigations, I have generally been unable to use many of the results of investigations because they either are unavailable, are not written in a way that I find useful for my work, or are wrong. (I am convinced that well over 75% of the investigations that I have reviewed are wrong in that they didn't accurately reflect the events and ended up with incorrect "causes" of the problem.) I find that I usually use the investigation report as a guide to the pure fact that an event occurred. I then use my own knowledge, and the advice of other knowledgeable people, to figure out what probably really happened and how best to incorporate that information into future design decisions.

    Upon re-reading my last paragraph I realized that I do participate in "accident investigations," but that participation is unfortunately hampered because it usually comes after all evidence is gone. I am clearly looking for something different than what the original investigators were working on or I would have found their reports to be useful for my work.

    The reason that I bring all of this up is that the questions posed on the web site seem to be closely related to this problem of not finding the results of traditional accident investigations useful for the design and development of future systems. (I have never been involved in a "big" investigation such as I am sure follows airliner accidents, so I can't comment on whether or not those results would be useful. I have only worked on projects that are "invisible" to the public or press.)

    Charles Hoes

    Comments? Send to Administrator at iprr.org

    Comments:

    Thank you for letting me have the URL of your site. Having reviewed it, I absolutely agree that we have common interests. I was very interested by your "wish list" and agree strongly with your identification of areas of need. Oddly, I have recognised similar concerns through my PhD research on the study of safety management systems; in other words, the problems you note are, in my view, generalised to the whole gamut of studying safety management in organisational systems.

    I have various plans for post-PhD research (two weeks until I submit the thesis). One is to study the cognitive process of investigation, with the primary aim of learning about the acquisition and application of investigators' conceptual models of the systems they seek to understand. Further, your comment on data-languages is well taken, and has been the subject of considerable discussion in my current work: the need for a consensual language (AND set of concepts) is much greater than seems to be credited in the literature and amongst practitioners. There is rather too much bouncing from fad to fad, rather too much searching for holy grails, and generally more heat than light generated. Thus it is very good to see the calm voice of reason evident in your wish list!

    John Kingston-Howlett, Aston University in Birmingham (UK) (posted 12 Oct 96)

    Comments? Send to Administrator at iprr.org


     

    Topic: Root cause analysis evaluation (8/23/96)

    Michal Tamuz at the University of Texas inquires whether we know of any cites that discuss the disadvantages (ed. note: or advantages) of using root cause analysis. Can anyone out there help with either research report references or analytical anecdotal feedback?

    Comments

    It is interesting to visit the IPRR every few days to find out what is happening. After reading Michal Tamuz's inquiry about the possible disadvantages of using root cause analysis, I immediately thought of my major complaint in this area: that there are so many different ideas about what a root cause is. Many proponents of root cause analysis consider such things as operator error, missing guards, and other point-of-operation factors to be root causes. If they are root causes, why do any analysis, since these can be easily identified without any complex methodology? To me, root causes are those system defects that allow these point-of-operation problems.

    One other thought: the discussion about failures of investigations was interesting, and I agree. A second part of this problem is that, regardless of the quality of the investigation, proper corrective actions are often not implemented, or lessons learned are not disseminated to prevent similar occurrences elsewhere. (See also Section 2, item 11, Action on Recommendations. Ed.)

    Leon Horman, ex-SSDC researcher

    Comments:

    (Re screening comments) If the site is supposed to support quasi-academic-mindedness, then the thin-skinned shouldn't apply. Nonetheless, there is definitely a place for "exchanges of views," even if disparate, provided certain rules of civility are followed: (1) no ad hominem arguments; (2) criticism should be constructive, if possible, and in any case should cite sufficiently authoritative sources to support one's arguments; (3) be prepared to have your criticism stuffed up your nose a nickel at a time by somebody who has (a) read, (b) researched, (c) investigated or (d) sat before more official inquiries than you have. The point being that (re Tamuz) there's no reason to engage in unflattering remarks. As you note, merely mentioning the questions we know they can't answer (e.g., how does Root Cause identify the specific interventions which can effectuate a behavioral change which will obviate recurrence of the mishap?) will evoke the usual Root Cause practitioner's response: "Gaaargh?", accompanied by intra-nostril digitation. I always like the approach of stating all the things which a good investigative methodology should do, then smiling and asking, "Does MORT cover those bases?"

    I think a good solid strategy/policy statement on the masthead would let folks know what the rules were at the top. Remember, there's so little investigation research anyway that it's pretty unlikely anyone has studied more than 10% of it. Who knew about Johnson and Zotov and Lepisto and other people's work before this?

    Petri Nets came up at the SSS meeting in Albuquerque in August, and the general consensus seems to be that they don't do the job better than other models, besides being unwieldy for any but very simple applications. E.g., they might be useful for analyzing interactions in the "final critical minutes" but wouldn't be much use in the "Years Before"/"Months Before"/"Days Before" reconstruction in Dmitri's paradigm.

    I wouldn't reject anything out of hand because we just don't have that much experience with alternative models yet. I think that whatever emerges as an improved approach will necessarily be a composite at the outset. The BIG THING is to establish some sort of evaluative protocol which will sort out the apples from the chaff.

    I. J. Rimson, Editor, ISASI Forum (Posted 12 Oct 96)

    Comment

    Re the RCA review: you can sign on to the IRR and view or download an unpublished draft review. Have you found any research papers that form the basis for RCA? I have seen a paper from the DuPont/Savannah River facility by Mark Paradise (I think) that was the first RCA work I know about. Have you referenced this or other works?

    SINTEF in Norway is trying to apply RCA for recommendation development after developing a solid description of what happened using the STEP investigation methodology.

    L Benner: (Administrator) (Posted 12 Oct 96)

    Comments? Send to Administrator at www.iprr.org


    Topic: Investigation objectives vs recommendations

    Comment by Jim Stewart, related to comments by Charles Hoes under Investigation Research Needs

    One of our weaknesses is the focus on, and over-emphasis of, investigation of failures or losses as a way of expending limited resources (tongue in cheek). We put the majority of money and resources into preventing the second accident and little into preventing the first, from an investigative point of view. We seem to maintain the dim view that we need an event before we investigate something (like a system). In my case, as the head of System Safety for Transport Canada, I had 74 people and a $5M budget for prevention, while the Transportation Safety Board of Canada had on the order of 250 people and $20M. We computed once that each recommendation we had received from the TSB had cost the Canadian government in the area of $1M since 1984 - a simplistic way of viewing the issue, I know, but effective nevertheless.
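    Stewart's ~$1M-per-recommendation figure can be checked with back-of-envelope arithmetic. The comment gives the annual budget and the per-recommendation result, but not the recommendation count or exact span of years, so the count below is derived and the 12-year span (1984 to roughly when this was posted) is an assumption:

```python
# Back-of-envelope check of the figures in the comment above.
# The recommendation count is NOT stated there; it is implied by
# the stated budget and the ~$1M-per-recommendation result.
annual_budget = 20_000_000           # TSB budget per year, from the comment
years = 12                           # assumed: 1984 through ~1996
cost_per_recommendation = 1_000_000  # the ~$1M figure quoted

total_spend = annual_budget * years
implied_recommendations = total_spend / cost_per_recommendation

print(f"total spend: ${total_spend:,}")                           # $240,000,000
print(f"implied recommendations: {implied_recommendations:.0f}")  # 240, i.e. ~20/year
```

    The simplification Stewart concedes is visible here: the arithmetic charges the Board's entire budget to recommendations, ignoring everything else an investigative agency produces.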

    I often just wanted a good factual report with no recommendations and no cause statement, which would allow (nay, require) the accountable people to take action in the best possible way to fix a problem. If they did nothing, then the accountability was clearly on their heads. When investigative agencies develop causes and recommendations, they spread out accountability, even though that is the opposite of their intention. The manager is home free if the investigative authority makes the least mistake in these areas, as everyone attacks the definition of cause and the quality of the recommendations.

    There is a better way, and it follows Dr. Vernon Grose's methodology, which I was able to apply so well in Transport Canada for six years. There we developed Dr. Grose's methodology and investigated a number of systems which had not experienced a loss. We even investigated some which had.

    As Steve (Corrie) said, the focus of prevention needs to switch to investigating the system within which the loss occurred (or, ideally, has not yet occurred) (my words) or the investigative organization must focus on why the system did not prevent a loss not on the cause of the event (Steve's words). Either statement works better for me than the traditional methods we follow now.

    So, my position is to have investigators forget about recommendations and statements of cause (we can all figure those out if we need to) and concentrate on the quality of their gathering of the facts and their analysis of those facts. As we discussed, few investigators are taught or given the tools to conduct proper analysis of what they have found. And once the bureaucrats and recommendation drafters get involved, as they now seem to do with increasing and frightening regularity, the investigator is left behind in terms of influencing the final product. I also believe we obviously rely too much on event investigations and not enough on system investigations.

    I would be interested to know what your participants think of these relatively radical views.

    My only caveat is that I spent 12 years on the receiving end of all safety recommendations from the TSB and had to try to get Transport Canada to act on them as required. So, I have some experience in the area of discussion.

    Jim Stewart


    Rules for comments you want to see posted

    Please stick to these rules for your comments:

    1. State whether or not it is OK to post your comments

    2. Don't submit any Ad Hominem arguments

    3. Criticism should be constructive

    4. Support your points with authoritative sources or objective observations from your personal experiences

    5. Be prepared to have your criticism criticized.

    6. To help Administrator, number or headline each of your points or arguments

    The Administrator reserves the right to select and edit submissions to condense them or make them more readable, but will try to avoid changing the substance of any posting.