Forensic Engineers & Investigators
Professional Critiques
From time to time we choose to comment on high-profile cases or issues when we have concerns about the quality of technical information publicly reported by investigation agencies or other entities. Sometimes the invitation to comment, or the comment itself, comes from another entity, but it is based on information uncovered by our investigative work and active associates.
In 2017 Andrew McGregor presented a paper at the International Society of Air Safety Investigators (ISASI) conference in San Diego. The paper was co-authored with Dr Barry Hughes and Captain Simon Tapp. Although the paper was reviewed by a team of ISASI reviewers prior to the conference, ISASI decided against making it available to delegates after the presentation was heard. The reasons why ISASI changed its mind and chose not to make the paper available are not fully understood.
Paper Abstract:
We all agree that lessons should be learned from accidents and these should be implemented in order to prevent future accidents. We consider four significant cases in which, we argue, important lessons were not learned.
We argue that although the thinking patterns commonly used in investigations may be grounded in experience, buttressed by high confidence levels and endorsed by others, they can be wrong. In explaining why, we introduce the concept of cognitive or thinking illusions and liken these to optical illusions, illusions to which not only pilots but also investigators may be vulnerable.
We link four major incidents and highlight examples of lost learning opportunities. We do so with reference to the Erebus crash [Air New Zealand; Nov 1979], the Perpignan crash [Air New Zealand; Nov 2008], the Bilbao incident [Lufthansa; Nov 2014] and the crash of AF447 [Air France; June 2009].
We begin by reassessing the Erebus disaster. We apply Dekker’s ‘failure drift’ model to show how one line of thinking, a preoccupation with the last stages of the accident timeline, masked important contributing causes of the crash. We discuss why this may have been so. We also assess the various contributing causes in terms of their cognitive origins.
We extend this analysis to the investigation reports of the Perpignan crash and the Bilbao incident, similar incidents that occurred six years apart. We highlight the thinking illusions to which the Perpignan investigators may have succumbed, and without which the outcome of AF447 may have been different. All of these cases highlight lost opportunities of the original investigations, opportunities which, if captured, could have prevented subsequent incidents.
We conclude by showing how greater sensitivity to thinking illusions could help construct investigative processes that are more robust and effective, and that could really make a difference.
Fox Glacier Investigation bungle
According to the official finding published by New Zealand’s Transport Accident Investigation Commission (TAIC), the 2010 Fox Glacier air crash that killed nine people was caused by improper loading of the skydive plane.
In addition to the field investigation with Tom McCready that TV3 filmed, Andrew McGregor provided a technical critique of the official TAIC report. Read the full story below.
Read Andrew McGregor’s technical critique: Fox Glacier Crash Investigation - Technical Critique.
Right to remain silent
This article comments on the attempt by the NZ CAA to charge Paul Jones with “Unnecessary Danger” in 2008. Pacific Wings highlights the problem of using information freely given during a safety investigation as evidence to support a criminal charge. The article is based on the investigative work undertaken during the Devon crash investigation, which may be seen at this link.