Why systems fail LO10038

Rol Fessenden (76234.3636@CompuServe.COM)
18 Sep 96 07:33:48 EDT

A couple of good book reviews in Scientific American seemed relevant to a
number of discussions we have had about why systems fail. The reviews were
by John Adams from University College London. The books are "The Logic of
Failure" by Dorner, and "Why Things Bite Back" by Tenner. Much of what
follows is either a direct quote or a minor rewrite of Adams' material.

Dorner offers a 5-step problem-solving process, which Adams uses to make
his own points about how difficult the steps are to achieve.

The steps are:
. First, have clear goals. This is, of course, the first problem. Rarely can
individuals, let alone societies, define clearly what they want. We are
ambivalent. On this forum, for example, we saw a few months ago that
there were no self-evident truths.
. Second, have a model. Again, according to Adams, there are no complete
models explaining reasonably complex situations.
. Third, predict the future. As a consequence of inadequate models and
inadequate information, prediction routinely founders.
. Fourth, plan, decide, and act. Certainly at the societal level, and
frequently in smaller organizations, so much uncertainty has been introduced
by the first three steps that we are paralyzed.
. Finally we must be prepared to acknowledge that a solution is not
working.

As Adams goes on to say, how can anyone manage these tricks in a nonlinear
world characterized by conflicting values and by billions of people acting
on each other and their environment--and in the process constantly
changing each other and the world?

Tenner illustrates our inability to elucidate clear goals. Even in
sports, arguments rage over the use of new equipment that would improve
safety and performance.

He also points out that things tend to "bite back" whenever we interfere
with them in ignorance, which is most of the time. Insect pests refuse to
be controlled, the paperless information revolution has proliferated
paper, flood control actually increases the damage caused by floods,
helmets and protective gear make football more dangerous than rugby, roads
to relieve congestion are clogged with traffic, and clear, straight roads
have the highest fatality rates.

He goes on to say that the more we introduce conspicuous safety measures,
the greater the likelihood of a Titanic-style disaster in which the safety
of the ship becomes the greatest single hazard to the survival of its
passengers.

Apparently we cannot stamp out failure because "things" are densely
interconnected. Whenever we change one item, we set in motion innumerable
other unpredictable changes. This, it seems, is a major reason why
systems fail.
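As a rough illustration of that interconnectedness (my own sketch, not from
Adams or the books): in the toy simulation below, six quantities nudge one
another through small coupling weights, and "controlling" just one of them
displaces every other one from the path it would otherwise have followed.
The scenario and the coupling numbers are invented for illustration only.

import random

random.seed(1)

N = 6  # six interacting quantities (pests, crops, predators, ...)
# Dense random coupling: every variable nudges every other one a little.
coupling = [[random.uniform(-0.1, 0.1) for _ in range(N)] for _ in range(N)]

def step(state, intervention=None):
    """Advance the system one step; optionally clamp variable 0."""
    new = []
    for i in range(N):
        # Each variable is pushed by all the others via the coupling weights.
        delta = sum(coupling[i][j] * state[j] for j in range(N))
        new.append(state[i] + delta)
    if intervention is not None:
        new[0] = intervention  # "control" variable 0, ignoring side effects
    return new

baseline = [1.0] * N
controlled = [1.0] * N
for _ in range(30):
    baseline = step(baseline)
    controlled = step(controlled, intervention=0.2)  # suppress variable 0

# The intervention touches only variable 0, yet every other variable ends up
# off the trajectory it followed when the system was left alone.
for i in range(N):
    print(f"var {i}: left alone {baseline[i]:+.3f}   "
          f"with var 0 controlled {controlled[i]:+.3f}")

Nothing in the code is clever; the point is only that once everything feeds
back on everything else, even a simple, well-intentioned intervention has
consequences you did not plan for.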

-- 

Rol Fessenden LL Bean, Inc. 76234.3636@compuserve.com

Learning-org -- An Internet Dialog on Learning Organizations For info: <rkarash@karash.com> -or- <http://world.std.com/~lo/>