Cockpit Flight Recording LO24881

From: Phillip Capper (phillip.capper@webresearch.co.nz)
Date: 06/15/00


Replying to LO24751 --

Rick wrote:
> >[Host's Note: And... What can we draw from this for organizations
> >and teams other than airline flight crews? In the US, the NASA
> >Safety Reporting System is a feedback of incident data to support
> >learning. ..Rick]

to which AM de Lange replied
> Rick, it tells me that when two parties A and B want to establish
> the facts (and not the truth which is more than merely facts), they
> will not be able to do so when the two parties suspect each other
> of foul play. In other words, facts play no role in establishing the
> truth when the mind is negatively tuned. Third parties C may
> suggest solutions to overcome this dilemma, but if parties A and B
> persist with a negative mental tuning, such offers will lead to
> nothing. In a LO itself the Systems Thinking has to be such that it
> allows for parties A and B as well as third parties C to come
> together and establish the truth.

The problem here is that the 'truth' of what happens when an aeroplane
crashes depends on the mental models that are used in assessing the
'facts'.

Consider the following:

On 28 November 1979, an Air New Zealand DC-10 flew into Mount Erebus,
Antarctica, killing all on board. In the report of the subsequent
Commission of Inquiry (1981), Justice Mahon concluded:

        "In my opinion, therefore, the single dominant and effective cause
of the disaster was the mistake by those airline officials who programmed
the aircraft to fly directly at Mount Erebus and omitted to tell the
flight crew."

The Ministry of Transport's Aircraft Accident Report (1980), conducted in
accordance with internationally agreed procedures, produced a linear
rather than systemic analysis and concluded, in contradiction of Justice
Mahon, that:

        " The probable cause of this accident was the decision of the
captain to continue the flight at low level toward an area of poor surface
and horizon definition..."

These two views have remained essentially irreconcilable ever since - yet
both were drawn from the same evidence.

While Mahon was a pioneer, it was in Canada that the catalyst for
significant change occurred. In 1989 an Air Ontario airliner crashed at
Dryden. The immediate cause of this accident was that the plane
accumulated snow and ice on its wings during a delay in obtaining takeoff
clearance, an event which tipped a preceding series of errors into
catastrophe.

The Dryden accident investigation (1992) went even further than Mahon. It
identified a range of contributors to an accident in which all involved -
flight crew, ground personnel and company equipment - performed in a
highly dysfunctional manner. These factors included:

job instability following a recent company merger
high employee turnover
low morale
poor company support for operational personnel.

These immediate factors were, according to the Commission, triggered by 17
inadequate corporate processes, which included such items as:

disparate allocation of resources to production and safety activities
inadequate safety management
inadequate change management
deficiencies in operations and maintenance
deficient monitoring and auditing
deficient handling of information
deficient inspection and control
inadequate purchasing of spares
low motivation
inadequate policy making
inadequate goal setting
deficient checking.

The Erebus and Dryden commissions of inquiry have had a profound effect on
thinking about safety issues in high-risk operational environments (as did
Three Mile Island). But the effect on organisational practice has been
much patchier. The implication of the systemic interpretation of the
Erebus and Dryden facts is that accident prevention requires systems
thinking and management practices consistent with theories of
organisational learning. But the interpretation of the formal New Zealand
accident inquiry leads in a different direction - towards individual blame
rather than systemic accountability, towards behavioural approaches to
training and personnel licensing, and towards standard operating
procedures rather than organisational learning. The answer to Rick's
question is that if one model prevails, critical incident reporting becomes
part of the system. If the other prevails, then concealment reigns.

That there remain deep divisions on these matters is demonstrated by my
earlier message, in which I reported the forthcoming manslaughter charges
against a New Zealand airline pilot. The establishment in British law of
the crime of corporate manslaughter shows that the other side of the
debate is alive and well.

At present there is no 'truth' about any aviation accident. There are only
different truths which arise from whichever mental model you bring to the
analysis of the event.

Phillip Capper
WEB Research
Wellington
New Zealand
