Normal Accidents LO12109

Myers, Kent (myers@carsoninc.com)
Tue, 21 Jan 1997 17:06:58 -0500

Replying to LO12086 --

As an aside, I'd like to give some background on the phrase "normal
accidents". Sam knows this context, but others might be in the dark.
Please continue discussing culture.

Charles Perrow examines the systemic characteristics of large-scale
technologies whose failure can be devastating. "Linear" technologies,
such as dams and ship navigation, are potentially controllable. Ways have
been found to organize the control of dams, and dams rarely fail for
either human or technological reasons. Ship navigation is generally
the same type of technology, but there the organizations do fail.
Perrow tells
some fascinating stories about headstrong captains who know for hours that
they are on a collision course, yet collide anyway. He argues that ship
navigation is correctable with better organization.

Other kinds of technology are "tightly coupled" and interactively
complex. These include chemical process plants and nuclear reactors. He
finds that interactions in these systems can never be fully understood,
and that organizations lack the capacity to stop runaway problems: a
small failure can propagate faster than operators can diagnose it.
Accidents are "normal" in such systems. This wouldn't be much of an
issue if tightly coupled systems were small scale, but modern society
keeps building big ones. One might think that chemical processing has
been with us for a long time, so it must be safe, but Perrow gives
several cases of chemical plants that ran without trouble for many
years and then exploded for no obvious reason.

Perrow isn't entirely fatalistic. Big linear technologies are fine with
him as long as they have good organization, and it is possible to develop
good organization. But some big technologies are dangerous by virtue
of their systemic characteristics. Organization can mitigate the risks
in these systems, but significant risk remains. Failures in these
systems are almost always
blamed on human operators, which he finds foolish. Regardless of whether
an operator makes a mistake, it is the design of the system that locks out
corrections and creates the devastating consequences. An organization can
never be so perfect as to keep up with this type of technology.

As I recall, Perrow discusses some of the ways people have tried to
improve the designs. For example, there are new designs for "inherently
safe" reactors. (Ironically, these reactors can't be built because of
reluctance to deal with the unknown in this kind of technology.) But he
finds that some processes simply can't be linearized, and he recommends
a social choice to limit those technologies.

--

Kent Myers myersk@us.net

Learning-org -- An Internet Dialog on Learning Organizations
For info: <rkarash@karash.com> -or- <http://world.std.com/~lo/>