Normal Accidents LO12058 - Joe's Jottings #68

pcapper@actrix.gen.nz
Sun, 19 Jan 1997 10:37:22 +1300 (NZDT)

Replying to LO12048 --

Joe Podolsky wrote:

"This process sounds pretty reasonable to my quality ears, but Gladwell
offers a different view. "Over the past few years," he writes, "a group
of scholars has begun making the unsettling argument that the rituals that
follow things like plane crashes or the Three Mile Island crisis are as
much exercises in self-deception as they are genuine opportunities for
reassurance. For these revisionists, high-technology accidents may not
have clear causes at all. They may be inherent in the complexity of the
technological systems we have created."

"Remember, this was written six months before TWA flight 800 went down.
All the king's people are still trying to put that one together again.

"The scholars that Gladwell mentions are sociologists. They are finding
evidence that "the potential for catastrophe is ... found in the normal
functioning of complex systems..." and that "... accidents are not easily
preventable." Yale University sociologist Charles Perrow calls these,
"normal accidents." "
-- end of quote --

He then went on to talk about the inherent capacity of complex systems to
fail in unpredicted ways.

Yes - all correct. But in my view Gladwell's 'New Yorker' piece
misunderstood the lesson and misunderstood the scholarship he cited,
leading to the 'fatalistic' view that Joe was tempted by.

It is true that there is an irreducible minimum level of human error. The
minimum can be calculated for a range of situations (rote repetition of a
simple operation, pattern recognition, a novel situation, etc.). But error
does not necessarily equal catastrophe. The appropriate organisational
response is to have defensive procedures that enable interventions which
prevent an error from developing into a catastrophe. These procedures are
often cultural in nature. For example - making it culturally OK for an
experienced theatre nurse to comment when the young surgeon makes an
error. Making it OK for the first officer to comment when the captain
makes an error. Or - as would have helped at Three Mile Island - creating
a culture where experienced engineers are prepared to evaluate the
analysis of an apprentice. In fact, defensive procedures are usually
precisely those indicated by organisational learning theory.
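A rough numerical sketch of that point, in Python. The probabilities are
invented purely for illustration (they are not from the research), and the
assumption that the defences fail independently is a simplification:

# Invented numbers - the point is structural, not the values. The error
# rate stays fixed; each independent defensive procedure that can catch
# the error cuts the chance of the error becoming a catastrophe.

def p_catastrophe(p_error, p_each_defence_fails):
    """Error occurs AND every defence fails to catch it, assuming
    (as a simplification) that the defences fail independently."""
    p = p_error
    for p_fail in p_each_defence_fails:
        p *= p_fail
    return p

base_error = 1e-3  # assumed irreducible error rate per operation

print(p_catastrophe(base_error, []))          # no defences:          1e-3
print(p_catastrophe(base_error, [0.1]))       # nurse free to speak:  1e-4
print(p_catastrophe(base_error, [0.1, 0.2]))  # plus a second check:  2e-5

The underlying error rate never moves; only the defences do.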

The INCORRECT, but most common, organisational response is to regard
human error as a matter for discipline or training, underpinned by the
belief that if the discipline is firm enough, and the training good
enough, error can be eliminated. This fallacy leads to neglect of the
defensive procedures which actually have the greatest potential to
minimise catastrophe. (It remains true that good discipline and good
training are necessary parts of any error-minimising strategy.)

The next organisational response is to ensure that human errors that do
happen are in fact occurring at the irreducible minimum rate. We now know
that error frequency multiplies according to a range of environmental
factors. Some of them are personal to the error-committing individual,
e.g. a marriage breaking up. Some of them are personal to the individual
but are susceptible to good management practice, e.g. drug problems, or
tiredness through work pressures or shift arrangements. Some of them are
ergonomic or are to do with fundamental human perception, e.g. the
position of instruments in a cockpit layout. Some of them are cultural,
e.g. "you keep your mouth shut round here until you've been on the job
for 5 years." Some of them are almost entirely organisational, e.g.
unclear lines of authority, or contradictory organisational goals ("we
must not compromise on safety, and we have strict cost control in all
areas of operation").

Here is the counter intuitive finding from research. THE BIGGEST ERROR
MULTIPLIERS ARE MOSTLY THE ONES TO DO WITH CULTURE AND ORGANISATIONAL
PROCESSES AND STRUCTURES.
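Again, a rough Python sketch. The multiplier values are invented to show
the shape of that finding, not measured figures:

# Invented multipliers - only the relative sizes matter, and they are
# chosen to reflect the finding that cultural and organisational factors
# are the biggest error multipliers.

base_rate = 1e-4  # assumed irreducible error rate for the task type

multipliers = [
    ("personal (e.g. marriage breaking up)", 2.0),
    ("manageable personal (fatigue, shift arrangements)", 2.0),
    ("ergonomic (instrument position in the cockpit)", 1.5),
    ("cultural ('keep your mouth shut for 5 years')", 6.0),
    ("organisational (unclear authority, conflicting goals)", 8.0),
]

rate = base_rate
for factor, m in multipliers:
    rate *= m
    print(f"after {factor}: rate = {rate:.1e}")

On these invented numbers the cultural and organisational factors account
for a 48-fold increase on their own, against a 6-fold increase from all
the personal and ergonomic factors combined - which is the shape of the
finding in capitals above.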

In these circumstances an organisation (if it thinks about these things at
all) is often faced with a trade-off between safety and error-promoting
processes which are difficult to change because of other considerations.
In such circumstances the appropriate response is to strengthen the
related defensive procedures.

The paradoxical part of all this is the one closest to the hearts of
members of this list. VIOLATIONS are ambiguous. People commit violations
for all sorts of reasons, many of them to do with exactly the same factors
mentioned above in respect of error. But: (1) violations of existing
procedures can also be INNOVATIONS, and (2) in many high-risk situations,
such as an airliner flight deck, when a novel situation arises survival
depends on the capacity of the experienced operator to improvise, innovate
and break out of standard operating procedures. This is the hard
part...

One final point - some of you may have reservations about the idea of
paying attention to inexperienced or less highly trained operators in a
particular setting. The research evidence is clear and unequivocal -
'experts' tend to outperform 'novices' in standard, repetitive and
predicted situations. 'Novices' tend to outperform 'experts' in novel,
unexpected, or unpredicted situations. Dewey was the first to note this.
It has been confirmed many times since in a wide range of work settings.
The organisational trick is to create an operational culture in which
'experts' feel OK about making full use of ALL the resources available
when a problem arises. One of my most powerful reasons for subscribing to
the principles of organisational learning is that an OL setting encourages
precisely these sorts of relationships.

Phillip Capper
Centre for Research on Work, Education and Business
Wellington
New Zealand

-- 

pcapper@actrix.gen.nz

Learning-org -- An Internet Dialog on Learning Organizations For info: <rkarash@karash.com> -or- <http://world.std.com/~lo/>