System Dynamics of Risk Perception LO48

Daniel J Yurman (djy@inel.gov)
Thu, 9 Feb 1995 08:00:39 -0700

THE SYSTEM DYNAMICS OF RISK PERCEPTION

Risk Perceptions, Mental Models, and Circuit Breakers
Or, What to Do about NIMBY (Not in My Backyard)

With regard to comments about the system dynamics of the failure
of engineered systems and public perceptions of risk, the
following talking points based on Senge's book "The Fifth
Discipline" and two classics about risk perceptions may be
helpful. This material was developed as the outcome of a
training exercise on system dynamics at the Idaho National
Engineering Laboratory, Idaho Falls, ID. Naturally, the standard
disclaimer applies.

Dan Yurman, Idaho National Engineering Laboratory, djy@inel.gov
PO Box 1625, Idaho Falls, ID 83415 208-526-7104 fax; 0876
---------------------------------------------------------------

There have been a number of articles printed in the news media
about risk and public acceptance of risks from various
technologies, e.g. nuclear waste, chemicals and hazardous waste,
Navy spent fuel, etc. In particular, it is public fear of the
failure of large engineered systems which drives risk
perceptions. I think it's worth reviewing some of the basics
about these risk perceptions, and especially the application of
system dynamics to them.

These talking points are based on the following references,
listed here for those who wish to develop their own conclusions.

"Perceptions of Risk," Slovic, Paul, _Science_, 4/17/87. Vol 236,
pp. 280-285.

"The Fifth Discipline," Senge, Peter, Doubleday, 1990.

"Technological Risk," Lewis, H.W, Norton, 1990.

----------------------------------------------------------------
I am responsible for any errors in the representation or
application of these ideas and welcome comments or suggestions to
improve the analysis.

OBJECTIVES

* Provide a basis for understanding and anticipating public
perceptions of hazards. [Note: risk perceptions are mental
models.]

* Improve communication of risk information among technical
experts, lay people, and decision makers.

BACKGROUND

* The development of chemical and nuclear technologies has
been accompanied by the potential to cause catastrophic and
long-lasting damage to the earth and to the life forms that
inhabit it.

* The mechanisms underlying these complex technologies are
unfamiliar and incomprehensible to most citizens. The most
harmful consequences of these technologies are such that
learning to mitigate or control them is not well suited to
management by trial-and-error. The public has developed
increasing levels of dread of the unknown consequences of
complex technologies.

* The public is well aware that economic and political
pressures during the design process in complex systems may
lead to systems being built and operated near the edge of
the safety envelope. [Senge - Eroding goals]

* Some systems, once built, represent such significant
investments that it is nearly impossible to walk away from
them regardless of risks. [Senge - Yesterday's solutions are
today's problems.]

* Those who are responsible for human health and safety need
to understand the ways people think about and respond to
risk. Perception and acceptance of risks have their roots
in social and cultural factors and not in science.

* The result is that some risk communication efforts may be
irrelevant for the publics for which they are intended
because these "publics" have hidden agendas. Also, the
public may be raising the issue of risk to human health and
the environment as a surrogate for other social, economic,
or political concerns.

-- For example, citizens in a well-to-do subdivision
object to the "risk" of electromagnetic radiation from
a 235 kV power line. In fact, their real objection is
the possible impact on property values from the
industrial aesthetics of high-tension power lines.

MAPPING RISK PERCEPTIONS

* Risk perceptions are mental maps composed of attitudes,
beliefs, assumptions, and judgements. Following is an
example of the "Not in my backyard," or NIMBY, mental map.
[Senge - reinforcing, vicious loops; a minimal simulation
sketch follows the note below.]

* Attitude: government science is not trustworthy

* Belief: government serves special interests, not the
public

* Assumption: you can't fight city hall

* Judgement: whatever it is the government is proposing to
do, it's not going to be in my backyard

Note: Analyses of "environmental racism" point out that
"dumps" of all kinds wind up next to or located in
communities which are least able, politically, to assert a
NIMBY defense.
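Note: The NIMBY map above behaves like a Senge-style
reinforcing ("vicious") loop: distrust feeds opposition to a
proposal, and each rebuffed proposal feeds distrust. The
Python sketch below is not part of the original talking points;
it is a minimal illustration under assumed values, and the gain
factor and starting level of distrust are invented purely to
show the loop's direction.

# Minimal sketch of a reinforcing (vicious) loop: distrust drives
# opposition, and opposition feeds back into distrust.  All values
# below are illustrative assumptions, not measurements.

def simulate_nimby_loop(initial_distrust=0.2, gain=0.3, steps=10):
    """Iterate distrust -> opposition -> more distrust, capped at 1.0."""
    distrust = initial_distrust
    history = [distrust]
    for _ in range(steps):
        opposition = distrust                 # opposition is driven by distrust
        distrust = min(1.0, distrust + gain * opposition)   # and feeds it back
        history.append(distrust)
    return history

if __name__ == "__main__":
    for step, level in enumerate(simulate_nimby_loop()):
        print(f"round {step}: distrust = {level:.2f}")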

* Disagreements about risk perceptions do not change as a
result of better data becoming available and being
disseminated to the public. People have a hard time
changing their opinions because of the strong influence
initial impressions, or pre-existing biases, have on the
interpretation of new information. Also, the method of
presenting the new data, e.g., as mortality or as survival
rates, can alter perceptions of risk.

* Generally, the gap between perceived and desired risk levels
suggests that people are not satisfied with the ways the
market, or regulatory agencies, have balanced risks and
benefits. Generally, people are more tolerant of risks from
activities seen as highly beneficial, but this is not a
systematic relationship.

* The key factor regarding acceptance of exposure to risk
appears to be the degree to which a person chooses that
exposure in return for a perceived level of benefits. The
relationships between perceived levels of benefits and
acceptance of risks are mediated by factors such as
familiarity, control, potential for catastrophic
consequences, and equity.

* In the case of nuclear power, spent fuel, and radioactive
waste, people's deep anxieties are linked to the history of
negative media coverage. Also, there is a strong
association between public attitudes about nuclear power and
anxieties about the proliferation of nuclear weapons.

Accidents as Signals

* The impact of accidents can extend far beyond direct harm.
An entire industry can be affected regardless of which firm
was responsible for the mishap.

* Some mishaps cannot be judged solely by damage to property,
injuries, or death. Some events, like Three Mile Island
(TMI), can have ripple effects on public perceptions of
risks, leading to a more hostile view of complex technologies
in general.

* The signal potential of an event like TMI, and thus its
social impact, appears to be related to how well risks
associated with the event are understood. The difference in
perceptions between a train wreck and a nuclear reactor
accident is that the wreck is seen as a discrete event in
time while the reactor problem is regarded as a harbinger of
further catastrophic mishaps. The systematic relationship
is between the degree of dread of the accident's unknown
consequences and the degree of subsequent irrational fear
of future catastrophes.

Risks & Benefits

* Firms conducting risk assessments within the framework of
cost-benefit analyses often fail to see the "ripple"
effects of worst case scenarios.

* For example, Ford Motor Co. failed to correct a design
problem with the gas tank of its Pinto compact car. A
cost-benefit analysis indicated that correction costs
greatly exceeded the expected benefits from increased safety.

* Had Ford looked at public risk perceptions of auto safety,
the analyses might have highlighted these defects
differently.

-- The public regarded the risk of fire in auto crashes
as a very high order problem involving considerable
dread.

-- Ford ignored potential higher order costs such as
damage claims from lawsuits, damaged public reputation,
lost future sales, and diminished "good will" from
regulatory agencies.
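Note: The Python sketch below is not drawn from Ford's actual
figures; every number in it is an invented assumption. It only
illustrates the point above: a narrow cost-benefit test can flip
once the "ripple" costs (lawsuits, reputation, lost future
sales, regulatory good will) are counted.

def narrow_test(fix_cost, expected_direct_savings):
    """Narrow cost-benefit screen: fix only if direct savings exceed cost."""
    return expected_direct_savings > fix_cost

def ripple_test(fix_cost, expected_direct_savings, ripple_costs_if_unfixed):
    """Same screen, but counting higher-order "ripple" costs of not fixing."""
    return (expected_direct_savings + ripple_costs_if_unfixed) > fix_cost

if __name__ == "__main__":
    fix_cost = 100e6          # assumed cost of the design change
    direct_savings = 40e6     # assumed avoided damage and injury claims
    ripple_costs = 300e6      # assumed lawsuits, lost sales, lost reputation
    print("fix justified (narrow test):", narrow_test(fix_cost, direct_savings))
    print("fix justified (ripple test):", ripple_test(fix_cost, direct_savings, ripple_costs))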

THE LOGIC OF RISK PERCEPTIONS

The logic of mental models with regard to risk perceptions is
illustrated by the following notes based on Senge's work:

1. Structure influences system performance

IF: structure influences system performance, and;

IF: mental models - attitudes, beliefs, assumptions,
judgements - are part of the structure;

THEN: Mental models influence system performance. Risk
perceptions are mental models because they are
based on social and cultural factors such as
attitudes, beliefs, assumptions, and judgements.

2. The easy way out usually leads back in.

IF: culture is the dominant collection of shared
mental models operating in society, and;

IF: risk perceptions, which are mental models, have
their roots in social and cultural factors, and
not in science;

THEN: some risk communication efforts based solely on
scientific data will fail since they do not
address mental models which are the basis for risk
perception.

3. The harder you push the harder the system pushes back.

IF: both our private and shared mental models are
always flawed and can get us into trouble when
they are taken for granted, and;

IF: levels of dread, in terms of perceived risk of
complex technology, are reinforced by irrational
fears caused by the unknown but potentially
catastrophic effects of new technologies;

THEN: inappropriate mental models about complex
technologies may be reinforced, rather than
mitigated, by additional "marketing" efforts to
promote new technologies.
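Note: Item 3 ("the harder you push the harder the system pushes
back") can be sketched as a balancing response to a promotion
effort. The Python fragment below is not from Senge or Slovic;
the parameter values and the simple linear rules are assumptions
invented only to show the qualitative behavior: sustained
"marketing" pressure that ignores the underlying mental models
eventually drives acceptance down rather than up.

def simulate_pushback(promotion_effort=1.0, pushback_gain=0.8, steps=8):
    """Promotion nudges acceptance up; unaddressed dread pushes back harder."""
    acceptance = 0.5    # assumed initial share of the public accepting the technology
    dread = 0.3         # assumed initial level of dread
    trace = []
    for _ in range(steps):
        acceptance += 0.1 * promotion_effort * (1.0 - acceptance)  # marketing push
        dread += 0.1 * pushback_gain * promotion_effort            # system pushes back
        acceptance -= 0.1 * dread
        acceptance = max(0.0, min(1.0, acceptance))
        trace.append((round(acceptance, 2), round(dread, 2)))
    return trace

if __name__ == "__main__":
    for step, (acc, drd) in enumerate(simulate_pushback(), start=1):
        print(f"step {step}: acceptance={acc}, dread={drd}")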

DEFINING VARIABLES FOR RISK TAKING

Variables are defined as elements in a system which may act or be
acted upon. A variable can move up or down in terms of
intensity, duration, absolute or relative values, etc., but its
movement is measurable.

Slovic points out there are four areas in which variables are
defined for the mental models at work in shaping risk perceptions.
Following each variable definition is a list of factors which
further define them; a minimal scoring sketch follows the full list.

* The degree of voluntary acceptance of the risk, e.g.
drinking coffee (caffeine) v. second hand smoke. (who makes
the decision for exposure to the risk)

* Controllable?
* Consequences not fatal for individuals or groups?
* Equity in choice, degree of exposure?
* Low risk to future generations?
* Risks easily reduced or mitigated by individual
choices?
* Risk decreases over time as more knowledge becomes
available?

* The level of dread the person has about the unknown
consequences of the risk, e.g. thermonuclear war v. car
accident. (obliteration of the collective v. individual survival)

* Totally uncontrollable; e.g. Pandora's box?
* Catastrophic results?
* Consequences fatal?
* No equity or choice, random exposures to risks?
* High risks to future generations?
* Risk increases over time regardless of what is known
about it?

* The amount of knowledge the person has about the risk and
especially its consequences, e.g. inhaling pesticide residue
v. drinking alcoholic beverages. (imprecise science v.
known, quantifiable data)

* Risks / consequences observable by trial and error,
experimentation, or measurement?

* Those exposed realize the dangers?

* Effects / consequences separated in time and space,
e.g., harm to future generations?

* Risks known to science, or do they exist in the realm
of "folklore"?

* The degree of control the person has to prevent the
consequences of system failure, e.g., riding on a snowmobile
v. working in a coal mine. (individual control v. collective
control)

* Consequences known, capable of quantification?
* Effects immediate?
* Risk well known and understood by the public and
science?
* Solutions to mitigate risks work?
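Note: One way to make the four dimensions above concrete is to
score a hazard on each of them. The Python sketch below is
illustrative only; the 0-to-1 scales, the equal weighting, and
the example scores are assumptions of this note, not Slovic's
published factor analysis.

from dataclasses import dataclass

@dataclass
class HazardProfile:
    name: str
    voluntariness: float  # 0 = imposed on the person, 1 = freely chosen
    dread: float          # 0 = no dread, 1 = catastrophic and uncontrollable
    knowledge: float      # 0 = unknown to science, 1 = well quantified
    control: float        # 0 = no personal control, 1 = full personal control

    def perceived_risk_index(self) -> float:
        """Higher values suggest higher perceived (not actuarial) risk."""
        return round(
            self.dread
            + (1.0 - self.voluntariness)
            + (1.0 - self.knowledge)
            + (1.0 - self.control),
            2,
        )

if __name__ == "__main__":
    for hazard in (
        HazardProfile("drinking coffee", 0.9, 0.1, 0.8, 0.9),
        HazardProfile("nuclear waste siting", 0.1, 0.9, 0.4, 0.1),
    ):
        print(hazard.name, hazard.perceived_risk_index())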

REAL RISKS - The Latent Failure Syndrome

* Numerous functions and services in large, complex systems
may be dependent on seemingly unrelated events. Large,
technologically complex systems have "latent" failures
within them. These are failures which become apparent only
under a specific set of often obscure triggering conditions.
Examples include:

Nuclear       Three Mile Island, Chernobyl
Space         Challenger shuttle explosion
Industry      Bhopal
Environment   Exxon Valdez oil spill

* While these disasters all have apparent triggers, in fact,
these failures are virtually never the result of a single
fault. Root cause analyses of each accident support the
hypothesis of disaster caused by the "latent" failure
phenomenon.

* The risks of large system failures, with accompanying
catastrophic consequences, accrue to the system as a whole
rather than to individual components.

* Pressures during the design phase [ Senge - eroding goals ]
may lead to systems being built to operate near the edge of
the safety envelope.

* Logical redundancy is compromised by a lack of physical
redundancy. For example, separate communication channels
are carried in the same cable bundle. When a truck knocks
over the telephone pole, all circuits go down.
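Note: The cable-bundle example above can be put in rough
numbers. The Python sketch below uses invented probabilities;
it only shows why a shared physical route (a single latent
condition) dominates the chance that both "redundant" circuits
fail at once.

def joint_failure_probability(p_circuit, p_common, shared):
    """P(both circuits down) for two circuits, with or without a shared route."""
    independent_part = p_circuit * p_circuit
    if shared:
        # Either the common cause (the truck and the pole) takes out both,
        # or, absent the common cause, both circuits fail independently.
        return p_common + (1.0 - p_common) * independent_part
    return independent_part

if __name__ == "__main__":
    p_circuit = 0.01   # assumed chance a single circuit fails on its own
    p_common = 0.001   # assumed chance the shared cable bundle is cut
    print("separate routes  :", joint_failure_probability(p_circuit, p_common, False))
    print("same cable bundle:", joint_failure_probability(p_circuit, p_common, True))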

Application of the "Latent Failure" Syndrome

Nuclear/chemical waste cleanup, reviewed using Senge's models,
reveals the following:

1. Today's problems come from yesterday's solutions

IF: public anxieties [mental models] about nuclear
technology are linked to dread of thermonuclear
war, and;

IF: existing nuclear wastes are the by-products of
weapons' production processes;

THEN: the public will irrationally but logically extend
its original perceptions [mental models] to
cover processes involving the management of the
wastes even though the cleanup is designed to
neutralize them.

2. The cure can be worse than the disease

IF: the public has an intuitive grasp of the "latent
failure syndrome" with regard to complex
technologies, e.g., nuclear weapons production,
and;

IF: the public's mental map includes a paradigm that
"things blow up,"

THEN: the public will assume that the perceived risks of
cleaning up waste from nuclear weapons production
are no different than those for the activities which
created the bombs in the first place.

The next step would be to construct a set of Senge's models to
find a way out.

Comments are welcome. See file header for contact information.