>Robert Axelrod has done a wonderfully accessible work on the
>development of trust, "The Evolution of Cooperation."
>He has advanced that work remarkably by developing approaches with
>genetic algorithms looking at competing strategies.
While I have not yet read Axelrod (I will add him to my list), I have
studied cooperation in the biological, cellular-automata, and game-theory
domains. I think there is a distinction between cooperation and trust.
Whereas cooperation is a powerful strategy (in evolution, business, and
game theory), it does not require trust. For example, a Tit-for-Tat
strategy is cooperative (and very successful) but not trusting.
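To make the point concrete, here is a minimal sketch (my own illustration, not taken from Axelrod) of Tit-for-Tat in an iterated prisoner's dilemma with the standard payoff matrix. Note that it keeps no expectation about its partner at all: it simply opens cooperatively and then mirrors the last move, which is why it can be cooperative without being trusting.

```python
def tit_for_tat(opponent_history):
    """Cooperate first; thereafter copy the opponent's previous move."""
    if not opponent_history:
        return "C"               # open cooperatively
    return opponent_history[-1]  # mirror: retaliate or forgive immediately

def always_defect(opponent_history):
    """A purely exploitative strategy, for contrast."""
    return "D"

def play(strategy_a, strategy_b, rounds):
    """Iterated prisoner's dilemma; returns the two cumulative scores."""
    payoff = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
              ("D", "C"): (5, 0), ("D", "D"): (1, 1)}
    hist_a, hist_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(hist_b)
        move_b = strategy_b(hist_a)
        pa, pb = payoff[(move_a, move_b)]
        score_a += pa
        score_b += pb
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b

# Against a defector, Tit-for-Tat is exploited exactly once, then holds even:
print(play(tit_for_tat, always_defect, 10))  # (9, 14)
# Against itself, it sustains mutual cooperation indefinitely:
print(play(tit_for_tat, tit_for_tat, 10))    # (30, 30)
```

The strategy never builds or consults a history-based expectation of its partner, which is precisely the distinction being drawn: cooperation as a move, without trust.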
Trust is grounded in a history of conversations; it needs time (a history)
to emerge, whereas cooperation can be the first move/alternative in a
strategy for coupling. The constitutive elements of trust are, inter
alia, consistency, predictability, and recurrence.
In organizational terms, trust emerges from the recurrent, predictable
couplings between people. Thus it emerges in language. Do all
recurrently successful conversations for action eventually engender trust,
or is something else, having to do with a congruence in the
disclosive spaces (mental models) of the participants, necessary? Or does
this congruence merely accelerate the emergence of trust?
Is trust linked only to conversations for action, or can it emerge out of
other types of coupling such as conversations for speculation?
What is the link between trust and authenticity? Can we have one without
the other?
Learning-org -- An Internet Dialog on Learning Organizations For info: <email@example.com> -or- <http://world.std.com/~lo/>