> How do *you* help a group bring out their mental models?

Margaret McIntyre <MargMcI@aol.com> expanded my question nicely:

> ...how to reveal people's mental models in a way that
> people engage rather than defend or attack?

Robert Levi <email@example.com> provided some of the background
when he introduced the "Ladder of Inference" (nice graphic!):

> ...my belief system takes
> "observable data" and begins to move up the ladder of inference
>7 /---/ I take ACTIONS based on my beliefs
>6 /---/ I adopt BELIEFS about the world based on my conclusions
>5 /---/ I draw CONCLUSIONS based on my assumptions
>4 /---/ I make ASSUMPTIONS based on the meanings I added
>3 /---/ I add MEANINGS, both cultural and personal
>2 /---/ I select DATA from what I observe
>1 /---/ OBSERVABLE DATA (as a camera might capture it)
(I've added numbers to the steps in Robert's diagram so I can refer to them.)
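To make the rungs concrete, here is a toy trace in Python of one trip up the ladder. The step names follow Robert's diagram, but the meeting scenario and every line of its content are my own invention for illustration:

```python
# One hypothetical trip up the Ladder of Inference.
# The "Pat" scenario is invented purely to illustrate the rungs.

LADDER = [
    "1 OBSERVABLE DATA",
    "2 select DATA",
    "3 add MEANINGS",
    "4 make ASSUMPTIONS",
    "5 draw CONCLUSIONS",
    "6 adopt BELIEFS",
    "7 take ACTIONS",
]

trip = [
    "Pat looked at the clock twice during my presentation.",  # 1
    "I noticed the clock-watching, not Pat's note-taking.",   # 2
    "Clock-watching means boredom.",                          # 3
    "Pat must think this project is a waste of time.",        # 4
    "Pat will not support the project.",                      # 5
    "Pat is not a team player.",                              # 6
    "I stop inviting Pat to planning meetings.",              # 7
]

for rung, thought in zip(LADDER, trip):
    print(f"{rung}: {thought}")
```

Notice that only rung 1 is something a camera could have captured; everything above it was added by the observer.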
I'd like to provide a skeleton outline to which other people can add.
Please understand, I'm not going to do the definitive exposition here, but
hopefully this will help the discussion move forward. Here are the main
points I usually make in introducing the topic of mental models:
1) Mental Models are our mental pictures, conscious or unconscious, about
what *is* and how the world works. In Robert's chart, these appear at
steps 4-6, and perhaps at step 3. So, our task in bringing out mental models
is to bring out the thinking that leads to the actions being taken.
2) In step #2, "Select Data," there is a *lot* of filtering, most of it
unconscious. Studies show that we actually take in relatively little of
the data available to us from the world. This is part of our great
efficiency as human beings -- if we actually took in all the data coming
at us, we'd be drowning in it! And, it's a potential weakness.
3) In steps #3 - #6, we add a lot from ourselves to the data. And, our
mental models are adjusted by the processed data. All this happens very
quickly and automatically. We are often not conscious of the process.

4) This is neither good nor bad -- it's just the way our brain works. In
many ways, it's very, very efficient and helps us be effective in the world.
Now, an important point:

5) We treat our conclusions and attributions as fact without being
conscious of doing so. By confusing conclusion and attribution with fact,
we expose ourselves to dangerous possibilities.
6) One of the dangers: Mental models are the primary determinant in the
filtering at step #2 -- we actually take in only a small part of the data
available, and we tend to take in the data that is consistent with our
mental models. Unless we are interacting with people who have different
views, this process is a reinforcing loop, a dangerous spiral which can
take one line of thinking to excess.
The most obvious examples of mental models are biases -- consider, for
example, biases about race or gender. Growing up in Ohio, I met some people
I considered racist. But I'm sure that, to them, their conclusions appeared
to be based on real data. Biases are most dangerous when we are not aware of
the bias, not aware of the cycle: mental model --> selective perception
--> strengthened mental model.
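That cycle can be sketched as a toy simulation. Everything here -- the `belief` number, the acceptance probabilities, the update rule -- is my own illustrative assumption, not anything from Argyris or the Fieldbook:

```python
import random

def filter_and_update(belief, evidence, openness=0.1):
    """One pass around the loop: the current mental model (belief,
    a number in 0..1) decides which data gets 'selected'; selected
    data then adjusts the model. 'openness' is the chance of taking
    in a datum that contradicts the current model."""
    consistent = (evidence >= 0.5) == (belief >= 0.5)
    p_accept = 0.9 if consistent else openness  # step #2: selective perception
    if random.random() < p_accept:
        belief += 0.1 * (evidence - belief)     # steps #3-#6: model adjusts
    return belief

random.seed(1)
belief = 0.55                                   # a mild initial leaning
data = [random.random() for _ in range(500)]    # evenly mixed evidence
for e in data:
    belief = filter_and_update(belief, e)
print(round(belief, 2))
```

With these made-up numbers, the belief drifts upward and settles well above where it started, even though the incoming data is evenly mixed -- the filter at step #2 does all the work. Raising `openness` (interacting with people who hold different views) weakens the spiral.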
The facilitator's task of bringing out mental models is demanding: we are
trying to help people become aware of their thinking.

Here are two levels of attack on mental models that I use frequently:

Presumption of Rationality

When a group is complaining about the actions of another group...
This might be sales talking about the customer service people, internal
consultants talking about how the management just doesn't get it, or John
Q. Citizen talking about the effectiveness of Congress.
I ask, "OK -- assume for a moment that they are acting reasonably and=20
rationally from their point of view. What line of thinking would take=20
them to the (conclusions, actions, policies, etc.) that they have chosen?"
This is simply asking people to consider the thinking behind the actions,
usually just the conscious thinking.
The presumption of rationality gets things started, but often you need to
go further. My primary guide in going further is the set of exercises
developed by Chris Argyris, Don Schon, and others. See references below.
The Left Hand Column exercise helps people become more aware of their
own thought process leading to conclusions and action.
Several of the people who have studied with Argyris have developed
protocols which seem very helpful.
This is an experiment. I'm trying here to lay out the skeleton which
has guided us (me, my IA colleagues, and our friends in the field). I'm
doing this in the hope that others will chime in with additional meat --
more specifics, additional techniques, refinements coming from either
theory or practice. I also hope that, by sharing my mental model at this
level, others can show completely different avenues with merit, ones that
might be blocked for us by our affection for the model we know.
So, the question is --> How do *you* bring out mental models in a way
that people engage, rather than defend or attack? And, what is the theory
for how your methods are effective?
References:

_The Fifth Discipline Fieldbook_, pp. 243-252, on the Left Hand Column
exercise and the Ladder of Inference.

_Overcoming Organizational Defenses_, Chris Argyris (1990, Needham, MA:
Allyn and Bacon).

_Action Science_, Chris Argyris, Robert Putnam, and Diana McLain Smith
(1985, San Francisco: Jossey-Bass).
Richard Karash ("Rick") | (o) 508-879-8301 | Mac * Flying
Innovation Associates, Inc. | (fax) 508-626-2205 | Systems Thinking
3 Speen St, Framingham MA 01701 | firstname.lastname@example.org | Std. Disclaimer