Book Recommendation LO11333

JOE_PODOLSKY@HP-PaloAlto-om4.om.hp.com
Mon, 9 Dec 96 14:20:22 -0800

Replying to LO11310 --


I had the opportunity to attend a seminar led by Nass and Reeves, and I
wrote this afterward. It gives you a pretty good idea of the work they
describe in their book.

Joe

=======================================================
Joe's Jottings #52 (4/2/96)

Beware! This Device Is Exceptionally Friendly.
----------------------------------------------

The City of Palo Alto Utilities Department gave me an "A". It says so right
on the monthly bill I got a few days ago. I earned my grade by paying my
bills on time. Now, I know full well that my "A" comes directly from a
computer program, untouched and unseen by any human being, but I still
feel a twinge of pleasure at this positive recognition of my
compulsiveness.

A few weeks ago, I went to a one-day seminar sponsored by HP Labs and HP
Quality entitled, "Social Responses to Computers and Other Electronic User
Products." The seminar was led by two Stanford University communications
and sociology professors, Clifford Nass and Byron Reeves. Their research
approach goes like this: 1) locate historically interesting sociology and
psychology experiments, and 2) duplicate the experiments substituting
computers and televisions for one of the participants.

For example, they found an experiment about how people tend to be polite
to each other. If one person delivers a presentation and then asks for
feedback, the comments to the presenter tend to be gracious. If, however,
a third person asks about the presentation, the comments tend to be
significantly less favorable.

Nass and Reeves had a computer give a presentation to some people. Then,
that same computer displayed a screen on which the audience could enter
feedback scores. The experiment was then duplicated, except this time the
audience was directed to a second computer that asked for feedback about
the presentation given by the first computer. Amazingly, the audience was
significantly more critical when giving answers to this second computer.
People were more polite to the presenter, even when the presenter was a
pile of plastic, sand, and software.

Needless to say, people denied that they made this distinction. It is a
completely unconscious reaction.

Even technical sophistication doesn't matter. The reaction was
statistically identical for audiences that were computer-illiterate and
for those who were computer experts.

Nass and Reeves have performed these experiments on over 20 different
categories of social behavior. The results are uniformly the same:
people treat media devices as though they were people. For example,
computers (and their programs) that were specifically assigned and
dedicated to a work team were considered by their human teammates to be
"more competent" than identical machines available in a pool and more
competent than identical computers assigned to other teams. The effect
was even more pronounced when the human teams were identified with colored
armbands and their computers were color-banded to match.

As shown by my grade from the Utilities company, it works for flattery
also. Sociological theory says that people like to hear good things about
themselves, and Nass and Reeves prove that they like to hear it from
computers as well as from people. Nass and Reeves suggest, for example,
that spell checking software might be improved if, besides finding
misspelled words, the software also praised us when we correctly spelled
tough words; e.g., "Congratulations! You got 'misspell' right. I have to
correct that word a lot!"
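Their suggestion is easy to picture in code. Here is a minimal sketch, in Python, of a spell checker that praises correct spellings of tough words; the word lists and messages are invented for illustration and are not from any actual product:

```python
# Illustrative sketch of Nass and Reeves's suggestion: a spell checker
# that praises the user for getting hard words right.
# DICTIONARY and TOUGH_WORDS are hypothetical stand-ins for real data.
DICTIONARY = {"the", "word", "is", "misspell", "occurrence", "separate"}
TOUGH_WORDS = {"misspell", "occurrence", "separate"}  # words it "corrects a lot"

def check(text):
    """Return a list of messages: corrections plus occasional praise."""
    messages = []
    for word in text.lower().split():
        if word not in DICTIONARY:
            messages.append(f"'{word}' looks misspelled.")
        elif word in TOUGH_WORDS:
            # The flattery that, per Nass and Reeves, people respond to.
            messages.append(
                f"Congratulations! You got '{word}' right. "
                "I have to correct that word a lot!")
    return messages
```

The point of the sketch is only that the praise path costs a few lines; whether users would tire of it is an open question (as I note below about my own reaction).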

Nass and Reeves are, of course, writing a book on all this. It's titled
_The Media Equation: How People Treat Computers, Television, and New
Media Like Real People and Places_, and it will be in our neighborhood
book stores by September 1996.

HP is applying this to some of our instruments. Janice Bradford of HP
Labs, along with Stanford people and engineers from HP's Electronic
Measurements Division, studied the effects of changing the messages on one
of HP's mainstream oscilloscopes, the HP 54600 family. Using experienced
users as subjects, they tested the effects of different messages using
disguised but functionally identical units. For example, the old messages
were terse: "No active cursor," or "Delay at limit." The theory was that
engineers liked the short and direct communication. The experimenters
offered, as alternatives, complete sentences like these: "There is no
active cursor right now; activate one and try again." "You can't increase
the delay; it is now at the limit. Sorry."

Tests showed, as expected, that the conversational messages were viewed as
being "friendlier." But, totally unexpectedly, the user engineers viewed
the instruments with the wordy messages as being more "competent" than
those that gave only the terse responses. They thought that the polite
instruments were, in fact, more advanced in their technology as compared
to those with limited vocabulary. (Well, maybe they were, but it was in
technology of sociology, not in electrical engineering.)

I really know very little about all this, and it's easy to generalize
hastily from limited knowledge. But, right now, I have three reactions
to this.

First, let's do the obvious. We should write or acquire software that
delivers messages directly to users, whether on CRTs or on reports, in
conversational, polite, and friendly sentences. No more byte-saving,
incomprehensible "Error 63, process aborted" gibberish.
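In practice, this can be as simple as a lookup table mapping terse codes to conversational sentences, with a polite fallback. A minimal sketch follows; the error codes and wordings are invented for illustration, not taken from any HP product:

```python
# Hypothetical mapping from terse error codes to conversational,
# polite messages, in the spirit of the HP 54600 experiment.
FRIENDLY_MESSAGES = {
    63: "Sorry, that process had to stop. Please check your input and try again.",
    12: "There is no active cursor right now; activate one and try again.",
}

def report_error(code):
    """Return a friendly message for a code, never a bare number."""
    # Even unknown codes get a complete, courteous sentence.
    return FRIENDLY_MESSAGES.get(
        code,
        f"Something went wrong (code {code}); please try again or ask for help.")
```

The design choice worth noting is the fallback: the user never sees "Error 63, process aborted" even when the program has nothing specific to say.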

Second, we should write or acquire software that compliments our users. I
think I'd soon get tired of a spell-checker that complimented me, but I
feel good about Intuit's Turbo-Tax program that congratulates me as I
complete each section of their interactive interview. I'm not sure how I
feel about the grades from the Palo Alto Utilities computer. One part of
me wants to keep getting the "A," but I admit that I'm tempted to start
delaying my payments occasionally just to see what reactions are
programmed into that software. But, it's probably more trouble than it's
worth.

Lastly, the Nass and Reeves research reminds me of the value of the advice
given in Dale Carnegie's classic, _How to Win Friends and Influence
People_. Carnegie suggests simple, direct behaviors such as smiling at
people, seeking out something they have done well and complimenting them
about it, praising slight improvements, and so on. Carnegie's book has
been in continuous print since the 1930's for one good reason: the advice
works. Ideally, and Carnegie emphasizes this, we should all internalize
these behaviors and do them sincerely. Authentic behavior is far better
than insincere manipulation. But, Nass and Reeves's work shows us that,
unconsciously, we respond even to obviously programmed inanimate devices,
and that is totally consistent with the fact that we respond also to the
scripts that Carnegie offers. Carnegie's advice is powerful stuff.

At a very deep level, Nass and Reeves's studies of technology tell us yet
again of our common and persistent humanity.

How do you feel about "behavior management" tools like this? Should our
new oscilloscopes have to carry a warning label, "Beware! This device has
been programmed to be exceptionally friendly."?

How do you feel about all this, either between machines and people, or
between people? I'd love to hear of examples of this type of work in
information technology. What was the reaction of users? What positive or
negative lessons have we learned from systems like this?

Best regards,

Joe Podolsky
joe_podolsky@hp.com

-- 


Learning-org -- An Internet Dialog on Learning Organizations For info: <rkarash@karash.com> -or- <http://world.std.com/~lo/>