

Social Psychology

Published by Tasya Hamidah, 2022-04-05 15:41:51

Description: Myers' scientific articles have appeared in some three dozen scientific books and periodicals, including Science, the American Scientist, Psychological Science, and the American Psychologist. In addition to his scholarly writing and his textbooks, he communicates psychological science to the general public. His writings have appeared in three dozen magazines, from Today's Education to Scientific American. He also has published general audience books, including The Pursuit of Happiness and Intuition: Its Powers and Perils.

David Myers has chaired his city's Human Relations Commission, helped found a thriving assistance center for families in poverty, and spoken to hundreds of college and community groups. Drawing on his own experience, he also has written articles and a book (A Quiet World) about hearing loss, and he is advocating a revolution in American hearing-assistance technology (hearingloop.org).



Part One: Social Thinking

Summing Up: Self-Presentation

• Self-presentation refers to our wanting to present a favorable image both to an external audience (other people) and to an internal audience (ourselves). With regard to an external audience, those who score high on a scale of self-monitoring adjust their behavior to each situation, whereas those low in self-monitoring may do so little social adjusting that they seem insensitive.

• As social animals, we adjust our words and actions to suit our audiences. To varying degrees, we note our performance and adjust it to create the impressions we desire.

• Such tactics explain examples of false modesty, in which people put themselves down, extol future competitors, or publicly credit others while privately crediting themselves.

• Sometimes people will even self-handicap with self-defeating behaviors that protect self-esteem by providing excuses for failure.

POSTSCRIPT: Twin Truths—The Perils of Pride, the Powers of Positive Thinking

This chapter offered two memorable truths—the truth of self-efficacy and the truth of self-serving bias. The truth concerning self-efficacy encourages us not to resign ourselves to bad situations. We need to persist despite initial failures and to exert effort without being overly distracted by self-doubts. Secure self-esteem is likewise adaptive. When we believe in our positive possibilities, we are less vulnerable to depression and we feel less insecure. Thus, it's important to think positively and try hard, but not to be so self-confident that your goals are illusory or you alienate others with your narcissism.

Taking self-efficacy too far leads to blaming the victim: If positive thinking can accomplish anything, then we have only ourselves to blame if we are unhappily married, poor, or depressed. For shame! If only we had tried harder, been more disciplined, less stupid. This viewpoint fails to acknowledge that bad things can happen to good people.
Life's greatest achievements, but also its greatest disappointments, are born of the highest expectations. These twin truths—self-efficacy and self-serving bias—remind me of what Pascal taught 300 years ago: No single truth is ever sufficient, because the world is complex. Any truth, separated from its complementary truth, is a half-truth.

Making the Social Connection

This chapter's discussion of self and culture explored the idea of self-concept and suggested that the view of the self can be interdependent and/or independent; we also read Hazel Markus's thoughts on the self and culture. Go to the Online Learning Center at www.mhhe.com/myers10e to watch videos on these topics.



CHAPTER 3
Social Beliefs and Judgments

Perceiving our social worlds
Judging our social worlds
Explaining our social worlds
Expectations of our social worlds
Conclusions
Postscript: Reflecting on illusory thinking

As U.S. senators, Republican John McCain and Democrat Barack Obama each adopted positions of apparent conscience. In 2001, McCain voted against President Bush's proposed tax cut, saying "I cannot in good conscience support a tax cut in which so many of the benefits go to the most fortunate." In 2008, when McCain was campaigning for the Republican nomination and then for president, he supported and favored extending the cuts that he earlier had opposed. Barack Obama in 2007 declared himself a "longtime advocate" of public financing of presidential elections and pledged to accept public financing should he win the Democratic nomination for president. But when he won the nomination, supported by unprecedented campaign contributions, he rejected public financing of his own campaign.

For Democrats, McCain's reversal displayed not moral courage and an openness to changing one's mind in the light of new information, but rather expedience and hypocrisy as McCain sought to pick up contributions and votes from wealthy conservatives. For Republicans, Obama's reversal likewise displayed not a temporary strategy en route to reforming election financing, but rather hypocrisy and the same old do-what-you-can-to-get-elected politics.

As the candidates debated, most McCain partisans were impressed by the reasonableness and power of his straight-talk arguments, while being underwhelmed by the force and cogency of Obama's performance. Most Obama partisans experienced a mirror-image reaction, feeling cheered by what they perceived as their candidate's superior charisma, intelligence, and vision.

These differing reactions, which have been replicated in political perceptions across the world, illustrate the extent to which we construct social perceptions and beliefs as we

• perceive and recall events through the filters of our own assumptions;
• judge events, informed by our intuition, by implicit rules that guide our snap judgments, and by our moods;
• explain events by sometimes attributing them to the situation, sometimes to the person; and
• expect certain events, which sometimes helps bring them about.

This chapter therefore explores how we perceive, judge, and explain our social worlds, and how—and to what extent—our expectations matter.

Perceiving Our Social Worlds

priming: Activating particular associations in memory.

Striking research reveals the extent to which our assumptions and prejudgments guide our perceptions, interpretations, and recall. Chapter 1 noted a significant fact about the human mind: our preconceptions guide how we perceive and interpret information. We construe the world through belief-tinted glasses. "Sure, preconceptions matter," people will agree; yet they fail to realize how great the effect is.

Let's consider some provocative experiments. The first group of experiments examines how predispositions and prejudgments affect how we perceive and interpret information. The second group plants a judgment in people's minds after they have been given information to see how after-the-fact ideas bias recall. The overarching point: We respond not to reality as it is but to reality as we construe it.
Priming

Unattended stimuli can subtly influence how we interpret and recall events. Imagine yourself, during an experiment, wearing earphones and concentrating on ambiguous spoken sentences such as "We stood by the bank." When a pertinent word (river or money) is simultaneously sent to your other ear, you don't consciously hear it. Yet the word "primes" your interpretation of the sentence (Baars & McGovern, 1994).

Our memory system is a web of associations, and priming is the awakening or activating of certain associations. Experiments show that priming one thought, even without awareness, can influence another thought, or even an action. John Bargh and his colleagues (1996) asked people to complete a sentence containing words such as "old," "wise," and "retired." Shortly afterward, they observed these people walking more slowly to the elevator than did those not primed with aging-related words. Moreover, the slow walkers had no awareness of their walking speed or of having just viewed words that primed aging.

[Cartoon: Posting the second sign may prime customers to be dissatisfied with the handling of their complaints at the first window. www.CartoonStock.com]

Often our thinking and acting are subtly primed by unnoticed events. Rob Holland and his colleagues (2005) observed that Dutch students exposed to the scent of an all-purpose cleaner were quicker to identify cleaning-related words. In follow-up experiments, other students exposed to a cleaning scent recalled more cleaning-related activities when describing their day's activities and even kept their desk cleaner while eating a crumbly cookie. Moreover, all these effects occurred without the participants' conscious awareness of the scent and its influence.

Priming experiments (Bargh, 2006) have their counterparts in everyday life:

• Watching a scary movie alone at home can activate emotions that, without our realizing it, cause us to interpret furnace noises as a possible intruder.
• Depressed moods, as this chapter explains later, prime negative associations. Put people in a good mood and suddenly their past seems more wonderful, their future brighter.
• Watching violence primes people to interpret ambiguous actions (a shove) and words ("punch") as aggressive.
• For many psychology students, reading about psychological disorders primes how they interpret their own anxieties and gloomy moods. Reading about disease symptoms similarly primes medical students to worry about their congestion, fever, or headache.

In a host of studies, priming effects surface even when the stimuli are presented subliminally—too briefly to be perceived consciously. What's out of sight may not be completely out of mind.
An electric shock that is too slight to be felt may increase the perceived intensity of a later shock. An imperceptibly flashed word, "bread," may prime people to detect a related word such as "butter" more quickly than they detect an unrelated word such as "bottle" or "bubble." A subliminal color name facilitates speedier identification when the color appears on the computer screen, whereas an unseen wrong name delays color identification (Epley & others, 1999; Merikle & others, 2001). In each case, an invisible image or word primes a response to a later task.

Studies of how implanted ideas and images can prime our interpretations and recall illustrate one of this book's take-home lessons from twenty-first-century social psychology: Much of our social information processing is automatic. It is unintentional, out of sight, and happens without our conscious awareness.

Perceiving and Interpreting Events

Despite some startling and oft-confirmed biases and logical flaws in how we perceive and understand one another, we're mostly accurate (Jussim, 2005). Our first impressions of one another are more often right than wrong. Moreover, the better we know people, the more accurately we can read their minds and feelings.

[FIGURE 3.1 — Perception of media bias. Pro-Israeli and pro-Arab students who viewed network news descriptions of the "Beirut massacre" believed the coverage was biased against their point of view; on a scale from 1 (anti-Israel) through 5 (neutral) to 9 (pro-Israel), members of each side perceived bias against their view. Source: Data from Vallone, Ross, & Lepper, 1985.]

"Once you have a belief, it influences how you perceive all other relevant information. Once you see a country as hostile, you are likely to interpret ambiguous actions on their part as signifying their hostility." —Political scientist Robert Jervis (1985)

But on occasion our prejudgments err. The effects of prejudgments and expectations are standard fare for psychology's introductory course. Recall the Dalmatian photo in Chapter 1. Or consider this phrase:

A
BIRD
IN THE
THE HAND

Did you notice anything wrong with it? There is more to perception than meets the eye. The same is true of social perception. Because social perceptions are very much in the eye of the beholder, even a simple stimulus may strike two people quite differently. Saying Britain's Gordon Brown is "an okay prime minister" may sound like a put-down to one of his ardent admirers and like praise to someone who regards him with contempt. When social information is subject to multiple interpretations, preconceptions matter (Hilton & von Hippel, 1990).

An experiment by Robert Vallone, Lee Ross, and Mark Lepper (1985) reveals just how powerful preconceptions can be. They showed pro-Israeli and pro-Arab students six network news segments describing the 1982 killing of civilian refugees at two camps in Lebanon. As Figure 3.1 illustrates, each group perceived the networks as hostile to its side. The phenomenon is commonplace: Sports fans perceive referees as partial to the other side.
Political candidates and their supporters nearly always view the news media as unsympathetic to their cause (Richardson & others, 2008). In the 2008 U.S. presidential race, supporters of Hillary Clinton, Barack Obama, and John McCain all noted instances when the media seemed biased against their candidate, sometimes because of seeming prejudice related to gender, race, or age. But it's not just fans and politicians. People everywhere perceive mediators and media as biased against their position. "There is no subject about which people are less objective than objectivity," noted one media commentator (Poniewozik, 2003). Indeed, people's perceptions of bias can be used to assess their attitudes (Saucier & Miller, 2003). Tell me where you see bias, and you will signal your attitudes.

Our assumptions about the world can even make contradictory evidence seem supportive. For example, Ross and Lepper assisted Charles Lord (1979) in asking two groups of students to evaluate the results of two supposedly new research studies. Half the students favored capital punishment and half opposed it. Of the studies they evaluated, one confirmed and the other disconfirmed the students' beliefs about the deterrent effect of the death penalty. The results: Both proponents and opponents of capital punishment readily accepted evidence that confirmed their belief but were sharply critical of disconfirming evidence. Showing the two sides an identical body of mixed evidence had not lessened their disagreement but increased it.

Is that why, in politics, religion, and science, ambiguous information often fuels conflict? Presidential debates in the United States have mostly reinforced predebate opinions. By nearly a 10-to-1 margin, those who already favored one candidate or the other perceived their candidate as having won (Kinder & Sears, 1985). Thus, report Geoffrey Munro and his colleagues (1997), people on both sides may become even more supportive of their respective candidates after viewing a presidential debate. Moreover, at the end of the Republican presidency of Ronald Reagan (during which inflation fell), only 8 percent of Democrats perceived that inflation had fallen. Republicans—47 percent of whom correctly perceived that it had—were similarly inaccurate and negative in their perceptions at the end of the Democratic Clinton presidency (Brooks, 2004). Partisanship predisposes perceptions.

[Cartoon: Some circumstances make it difficult to be unbiased. © The New Yorker Collection, 2003, Alex Gregory, from cartoonbank.com. All Rights Reserved.]

In addition to these studies of people's preexisting social and political attitudes, researchers have manipulated people's preconceptions—with astonishing effects upon their interpretations and recollections. Myron Rothbart and Pamela Birrell (1977) had University of Oregon students assess the facial expression of a man (Figure 3.2). Those told he was a Gestapo leader responsible for barbaric medical experiments on concentration camp inmates intuitively judged his expression as cruel. (Can you see that barely suppressed sneer?) Those told he was a leader in the anti-Nazi underground movement whose courage saved thousands of Jewish lives judged his facial expression as warm and kind. (Just look at those caring eyes and that almost smiling mouth.)

"The error of our eye directs our mind: What error leads must err." —Shakespeare, Troilus and Cressida, 1601–1602

[FIGURE 3.2 — Judge for yourself: Is this person's expression cruel or kind? If told he was a Nazi, would your reading of his face differ?]

Filmmakers control people's perceptions of emotion by manipulating the setting in which they see a face. They call this the "Kulechov effect," after a Russian film director who would skillfully guide viewers' inferences by manipulating their assumptions. Kulechov demonstrated the phenomenon by creating three short films that presented identical footage of the face of an actor with a neutral expression after viewers had first been shown one of three different scenes: a dead woman, a dish of soup, or a girl playing. As a result, in the first film the actor seemed sad, in the second thoughtful, and in the third happy.

Construal processes also color others' perceptions of us. When we say something good or bad about another, people spontaneously tend to associate that trait with us, report Lynda Mae, Donal Carlston, and John Skowronski (1999; Carlston & Skowronski, 2005)—a phenomenon they call spontaneous trait transference. If we go around talking about others being gossipy, people may then unconsciously associate "gossip" with us. Call someone a jerk and folks may later construe you as one. Describe someone as sensitive, loving, and compassionate, and you may seem more so. There is, it appears, intuitive wisdom in the childhood taunt, "I'm rubber, you're glue; what you say bounces off me and sticks to you."

The bottom line: We view our social worlds through the spectacles of our beliefs, attitudes, and values. That is one reason our beliefs are so important; they shape our interpretation of everything else.

[Photo: Supporters of a particular candidate or cause tend to see the media as favoring the other side.]

Belief Perseverance

Imagine a grandparent who decides, during an evening with a crying infant, that bottle feeding produces colicky babies: "Come to think of it, cow's milk obviously suits calves better than babies." If the infant turns out to be suffering a high fever, will the sitter nevertheless persist in believing that bottle feeding causes colic (Ross & Anderson, 1982)?

To find out, Lee Ross, Craig Anderson, and their colleagues planted a falsehood in people's minds and then tried to discredit it. Their research reveals that it is surprisingly difficult to demolish a falsehood, once the person conjures up a rationale for it. Each experiment first implanted a belief, either by proclaiming it to be true or by showing the participants some anecdotal evidence. Then the participants were asked to explain why it is true.
Finally, the researchers totally discredited the initial information by telling the participants the truth: The information was manufactured for the experiment, and half the participants in the experiment had received opposite information. Nevertheless, the new belief survived about 75 percent intact, presumably because the participants still retained their invented explanations for the belief. This phenomenon, called belief perseverance, shows that beliefs can grow their own legs and survive the discrediting of the evidence that inspired them.

belief perseverance: Persistence of one's initial conceptions, as when the basis for one's belief is discredited but an explanation of why the belief might be true survives.

An example: Anderson, Lepper, and Ross (1980) asked participants to decide whether individuals who take risks make good or bad firefighters. One group considered a risk-prone person who was a successful firefighter and a cautious person who was unsuccessful. The other group considered cases suggesting the opposite conclusion. After forming their theory that risk-prone people make better or worse firefighters, the participants wrote explanations for it—for example, that risk-prone people are brave or that cautious people have fewer accidents. Once each explanation was formed, it could exist independently of the information that initially created the belief. When that information was discredited, the participants still held their self-generated explanations and therefore continued to believe that risk-prone people really do make better or worse firefighters.

These experiments suggest that the more we examine our theories and explain how they might be true, the more closed we become to information that challenges our beliefs. Once we consider why an accused person might be guilty, why an offending stranger acts that way, or why a favored stock might rise in value, our explanations may survive challenges (Davies, 1997; Jelalian & Miller, 1984).

"We hear and apprehend only what we already half know." —Henry David Thoreau, 1817–1862

The evidence is compelling: Our beliefs and expectations powerfully affect how we mentally construct events. Usually, we benefit from our preconceptions, just as scientists benefit from creating theories that guide them in noticing and interpreting events. But the benefits sometimes entail a cost: We become prisoners of our own thought patterns. Thus, the supposed Martian "canals" that twentieth-century astronomers delighted in spotting turned out to be the product of intelligent life—an intelligence on Earth's side of the telescope. As another example, Germans, who widely believed that the introduction of the Euro currency led to increased prices, overestimated such price increases when comparing actual restaurant menus—the prior menu with German Mark prices and a new one with Euro prices (Traut-Mattausch & others, 2004). As an old Chinese proverb says, "Two-thirds of what we see is behind our eyes."

"No one denies that new evidence can change people's beliefs. Children do eventually renounce their belief in Santa Claus. Our contention is simply that such changes generally occur slowly, and that more compelling evidence is often required to alter a belief than to create it." —Lee Ross & Mark Lepper (1980)

Belief perseverance may have important consequences, as Stephan Lewandowsky and his international collaborators (2005) discovered when they explored implanted and discredited information about the Iraq war that began in 2003. As the war unfolded, the Western media reported and repeated several claims—for example, that Iraqi forces executed coalition prisoners of war—that later were shown to be false and were retracted. Alas, having accepted the information, which fit their preexisting assumptions, Americans tended to retain the belief (unlike most Germans and Australians, who had questioned the war's rationale).

Is there a remedy for belief perseverance? There is: Explain the opposite. Charles Lord, Mark Lepper, and Elizabeth Preston (1984) repeated the capital punishment study described earlier and added two variations. First, they asked some of their participants, when evaluating the evidence, to be "as objective and unbiased as possible." That instruction accomplished nothing; whether for or against capital punishment, those who had received the plea made evaluations as biased as those who had not. The researchers asked a third group to consider the opposite—to ask themselves "whether you would have made the same high or low evaluations had exactly the same study produced results on the other side of the issue." After imagining an opposite finding, these people were much less biased in their evaluations of the evidence for and against their views. In his experiments, Craig Anderson (1982; Anderson & Sechler, 1986) consistently found that explaining why an opposite theory might be true—why a cautious rather than a risk-taking person might be a better firefighter—reduces or eliminates belief perseverance. Indeed, explaining any alternative outcome, not just the opposite, drives people to ponder various possibilities (Hirt & Markman, 1995).

Constructing Memories of Ourselves and Our Worlds

"Memory isn't like reading a book: It's more like writing a book from fragmentary notes." —John F. Kihlstrom, 1994

Do you agree or disagree with this statement?

Memory can be likened to a storage chest in the brain into which we deposit material and from which we can withdraw it later if needed. Occasionally, something is lost from the "chest," and then we say we have forgotten.

About 85 percent of college students said they agreed (Lamal, 1979). As one magazine ad put it, "Science has proven the accumulated experience of a lifetime is preserved perfectly in your mind."

Actually, psychological research has proved the opposite. Our memories are not exact copies of experiences that remain on deposit in a memory bank. Rather, we construct memories at the time of withdrawal. Like a paleontologist inferring the appearance of a dinosaur from bone fragments, we reconstruct our distant past by using our current feelings and expectations to combine information fragments. Thus, we can easily (though unconsciously) revise our memories to suit our current knowledge. When one of my sons complained, "The June issue of Cricket never came," and was then shown where it was, he delightedly responded, "Oh good, I knew I'd gotten it."

misinformation effect: Incorporating "misinformation" into one's memory of the event, after witnessing an event and receiving misleading information about it.

When an experimenter or a therapist manipulates people's presumptions about their past, a sizable percentage of people will construct false memories. Asked to imagine vividly a made-up childhood experience in which they ran, tripped, fell, and stuck their hand through a window, or knocked over a punch bowl at a wedding, about one-fourth will later recall the fictitious event as something that actually happened (Loftus & Bernstein, 2005). In its search for truth, the mind sometimes constructs a falsehood.

"A man should never be ashamed to own that he has been in the wrong, which is but saying in other words, that he is wiser today than he was yesterday." —Jonathan Swift, Thoughts on Various Subjects, 1711

In experiments involving more than 20,000 people, Elizabeth Loftus (2003, 2007) and her collaborators have explored our mind's tendency to construct memories. In the typical experiment, people witness an event, receive misleading information about it (or not), and then take a memory test. The repeated finding is the misinformation effect. People incorporate the misinformation into their memories: They recall a yield sign as a stop sign, hammers as screwdrivers, Vogue magazine as Mademoiselle, Dr. Henderson as "Dr. Davidson," breakfast cereal as eggs, and a clean-shaven man as a fellow with a mustache. Suggested misinformation may even produce false memories of supposed child sexual abuse, argues Loftus.
This process affects our recall of social as well as physical events. Jack Croxton and his colleagues (1984) had students spend 15 minutes talking with someone. Those who were later informed that this person liked them recalled the person's behavior as relaxed, comfortable, and happy. Those informed that the person disliked them recalled the person as nervous, uncomfortable, and not so happy.

RECONSTRUCTING OUR PAST ATTITUDES

Five years ago, how did you feel about nuclear power? About your country's president or prime minister? About your parents? If your attitudes have changed, what do you think is the extent of the change?

Experimenters have explored such questions, and the results have been unnerving. People whose attitudes have changed often insist that they have always felt much as they now feel. Daryl Bem and Keith McConnell (1970) conducted a survey among Carnegie-Mellon University students. Buried in it was a question concerning student control over the university curriculum. A week later the students agreed to write an essay opposing student control. After doing so, their attitudes shifted toward greater opposition to student control. When asked to recall how they had answered the question before writing the essay, the students "remembered" holding the opinion that they now held and denied that the experiment had affected them.

After observing Clark University students similarly denying their former attitudes, researchers D. R. Wixon and James Laird (1976) commented, "The speed, magnitude, and certainty" with which the students revised their own histories "was striking." As George Vaillant (1977) noted after following adults through time, "It is all too common for caterpillars to become butterflies and then to maintain that in their youth they had been little butterflies. Maturation makes liars of us all."

The construction of positive memories brightens our recollections.
Terence Mitchell, Leigh Thompson, and their colleagues (1994, 1997) report that people often exhibit rosy retrospection—they recall mildly pleasant events more favorably than they experienced them. College students on a three-week bike trip, older adults on a guided tour of Austria, and undergraduates on vacation all reported enjoying their experiences as they were having them. But they later recalled such experiences even more fondly, minimizing the unpleasant or boring aspects and remembering the high points. Thus, the pleasant times during which I have sojourned in Scotland I now (back in my office facing deadlines and interruptions) romanticize as pure bliss. The mist and the midges are but dim memories. The spectacular scenery and the fresh sea air and the favorite tea rooms are still with me. With any positive experience, some of our pleasure resides in the anticipation, some in the actual experience, and some in the rosy retrospection.

"Travel is glamorous only in retrospect." —Paul Theroux, in the Observer

"Vanity plays lurid tricks with our memory." —Novelist Joseph Conrad, 1857–1924

Cathy McFarland and Michael Ross (1985) found that as our relationships change, we also revise our recollections of other people. They had university students rate their steady dating partners. Two months later, they rated them again. Students who were more in love than ever had a tendency to recall love at first sight. Those who had broken up were more likely to recall having recognized the partner as somewhat selfish and bad-tempered.

Diane Holmberg and John Holmes (1994) discovered the phenomenon also operating among 373 newlywed couples, most of whom reported being very happy. When resurveyed two years later, those whose marriages had soured recalled that things had always been bad. The results are "frightening," say Holmberg and Holmes: "Such biases can lead to a dangerous downward spiral. The worse your current view of your partner is, the worse your memories are, which only further confirms your negative attitudes."

It's not that we are totally unaware of how we used to feel, just that when memories are hazy, current feelings guide our recall. When widows and widowers try to recall the grief they felt on their spouse's death five years earlier, their current emotional state colors their memories (Safer & others, 2001). When patients recall their previous day's headache pain, their current feelings sway their recollections (Eich & others, 1985).
Parents of every generation bemoan the values of the next generation, partly because they misrecall their youthful values as being closer to their current values. And teens of every generation recall their parents as—depending on their current mood—wonderful or woeful (Bornstein & others, 1991).

RECONSTRUCTING OUR PAST BEHAVIOR

Memory construction enables us to revise our own histories. The hindsight bias, described in Chapter 1, involves memory revision. Hartmut Blank and his colleagues (2003) showed this when inviting University of Leipzig students, after a surprising German election outcome, to recall their voting predictions from two months previous. The students misrecalled their predictions as closer to the actual results.

Our memories reconstruct other sorts of past behaviors as well. Michael Ross, Cathy McFarland, and Garth Fletcher (1981) exposed some University of Waterloo students to a message convincing them of the desirability of toothbrushing. Later, in a supposedly different experiment, these students recalled brushing their teeth more often during the preceding two weeks than did students who had not heard the message. Likewise, people who are surveyed report smoking many fewer cigarettes than are actually sold (Hall, 1985). And they recall casting more votes than were actually recorded (Census Bureau, 1993).

Social psychologist Anthony Greenwald (1980) noted the similarity of such findings to happenings in George Orwell's novel 1984—in which it was "necessary to remember that events happened in the desired manner." Indeed, argued Greenwald, we all have "totalitarian egos" that revise the past to suit our present views. Thus, we underreport bad behavior and overreport good behavior.

Sometimes our present view is that we've improved—in which case we may misrecall our past as more unlike the present than it actually was.
This tendency resolves a puzzling pair of consistent findings: Those who participate in psycho- therapy and self-improvement programs for weight control, antismoking, and exercise show only modest improvement on average. Yet they often claim con- siderable benefit (Myers, 2010). Michael Conway and Michael Ross (1986) explain why: Having expended so much time, effort, and money on self-improvement, people may think, “I may not be perfect now, but I was worse before; this did me a lot of good.”
88 Part One Social Thinking

In Chapter 14 we will see that psychiatrists and clinical psychologists are not immune to these human tendencies. We all selectively notice, interpret, and recall events in ways that sustain our ideas. Our social judgments are a mix of observation and expectation, reason and passion.

Summing Up: Perceiving Our Social Worlds

• Our preconceptions strongly influence how we interpret and remember events. In a phenomenon called priming, people’s prejudgments have striking effects on how they perceive and interpret information.

• Other experiments have planted judgments or false ideas in people’s minds after they have been given information. These experiments reveal that as before-the-fact judgments bias our perceptions and interpretations, so after-the-fact judgments bias our recall.

• Belief perseverance is the phenomenon in which people cling to their initial beliefs and the reasons why a belief might be true, even when the basis for the belief is discredited.

• Far from being a repository for facts about the past, our memories are actually formed when we retrieve them, and subject to strong influence by the attitudes and feelings we hold at the time of retrieval.

Judging Our Social Worlds

As we have already noted, our cognitive mechanisms are efficient and adaptive, yet occasionally error-prone. Usually they serve us well. But sometimes clinicians misjudge patients, employers misjudge employees, people of one race misjudge people of another, and spouses misjudge their mates. The results can be misdiagnoses, labor strife, prejudices, and divorces. So, how—and how well—do we make intuitive social judgments?

When historians describe social psychology’s first century, they will surely record 1980–2010 as the era of social cognition. By drawing on advances in cognitive psychology—in how people perceive, represent, and remember events—social psychologists have shed welcome light on how we form judgments.
Let’s look at what that research reveals of the marvels and mistakes of our social intuition.

Intuitive Judgments

What are our powers of intuition—of immediately knowing something without reasoning or analysis? Advocates of “intuitive management” believe we should tune into our hunches. When judging others, they say, we should plug into the nonlogical smarts of our “right brain.” When hiring, firing, and investing, we should listen to our premonitions. In making judgments, we should follow the example of Star Wars’ Luke Skywalker by switching off our computer guidance systems and trusting the force within.

Are the intuitionists right that important information is immediately available apart from our conscious analysis? Or are the skeptics correct in saying that intuition is “our knowing we are right, whether we are or not”?

Priming research suggests that the unconscious indeed controls much of our behavior. As John Bargh and Tanya Chartrand (1999) explain, “Most of a person’s everyday life is determined not by their conscious intentions and deliberate choices but by mental processes that are put into motion by features of the environment and that operate outside of conscious awareness and guidance.” When the light turns red, we react and hit the brake before consciously deciding to do so. Indeed, reflect Neil Macrae and Lucy Johnston (1998), “to be able to do just about anything at all (e.g., driving, dating, dancing), action initiation needs to be decoupled from the
inefficient (i.e., slow, serial, resource-consuming) workings of the conscious mind, otherwise inaction inevitably would prevail.”

controlled processing: “Explicit” thinking that is deliberate, reflective, and conscious.

automatic processing: “Implicit” thinking that is effortless, habitual, and without awareness; roughly corresponds to “intuition.”

THE POWERS OF INTUITION

“The heart has its reasons which reason does not know,” observed seventeenth-century philosopher-mathematician Blaise Pascal. Three centuries later, scientists have proved Pascal correct. We know more than we know we know. Studies of our unconscious information processing confirm our limited access to what’s going on in our minds (Bargh & Ferguson, 2000; Greenwald & Banaji, 1995; Strack & Deutsch, 2004). Our thinking is partly controlled (reflective, deliberate, and conscious) and—more than psychologists once supposed—partly automatic (impulsive, effortless, and without our awareness). Automatic, intuitive thinking occurs not “on-screen” but off-screen, out of sight, where reason does not go. Consider these examples of automatic thinking:

• Schemas are mental concepts or templates that intuitively guide our perceptions and interpretations. Whether we hear someone speaking of religious sects or sex depends not only on the word spoken but also on how we automatically interpret the sound.

• Emotional reactions are often nearly instantaneous, happening before there is time for deliberate thinking. One neural shortcut takes information from the eye or the ear to the brain’s sensory switchboard (the thalamus) and out to its emotional control center (the amygdala) before the thinking cortex has had any chance to intervene (LeDoux, 2002). Our ancestors who intuitively feared a sound in the bushes were usually fearing nothing.
But when the sound was made by a dangerous predator, they became more likely to survive to pass their genes down to us than their more deliberative cousins.

• Given sufficient expertise, people may intuitively know the answer to a problem. Master chess players intuitively recognize meaningful patterns that novices miss and often make their next move with only a glance at the board, as the situation cues information stored in their memory. Similarly, without knowing quite how, we recognize a friend’s voice after the first spoken word of a phone conversation.

• Faced with a decision but lacking the expertise to make an informed snap judgment, our unconscious thinking may guide us toward a satisfying choice. That’s what University of Amsterdam psychologist Ap Dijksterhuis and his co-workers (2006a, 2006b) discovered after showing people, for example, a dozen pieces of information about each of four potential apartments. Compared to people who made instant decisions or were given time to analyze the information, the most satisfying decisions were made by those who were distracted and unable to focus consciously on the problem. Although these findings are controversial (González-Vallejo & others, 2008; Newell & others, 2008), this much seems true: When facing a tough decision it often pays to take our time—even to sleep on it—and to await the intuitive result of our out-of-sight information processing.

Some things—facts, names, and past experiences—we remember explicitly (consciously). But other things—skills and conditioned dispositions—we remember implicitly, without consciously knowing or declaring that we know. It’s true of us all but most strikingly evident in people with brain damage who cannot form new explicit memories. One such person never could learn to recognize her physician, who would need to reintroduce himself with a handshake each day. One day the physician affixed a tack to his hand, causing the patient to jump with pain.
When the physician next returned, he was still unrecognized (explicitly). But the patient, retaining an implicit memory, would not shake his hand. Equally dramatic are the cases of blindsight. Having lost a portion of the visual cortex to surgery or stroke, people may be functionally blind in part of their field of
vision. Shown a series of sticks in the blind field, they report seeing nothing. After correctly guessing whether the sticks are vertical or horizontal, the patients are astounded when told, “You got them all right.” Like the patient who “remembered” the painful handshake, these people know more than they know they know.

Consider your own taken-for-granted capacity to recognize a face. As you look at it, your brain breaks the visual information into subdimensions such as color, depth, movement, and form and works on each aspect simultaneously before reassembling the components. Finally, using automatic processing, your brain compares the perceived image with previously stored images. Voilà! Instantly and effortlessly, you recognize your grandmother. If intuition is immediately knowing something without reasoned analysis, then perceiving is intuition par excellence.

Subliminal stimuli may, as we have already noted, prime our thinking and reacting. Shown certain geometric figures for less than 0.01 second each, people may deny having seen anything more than a flash of light yet express a preference for the forms they saw.

So, many routine cognitive functions occur automatically, unintentionally, without awareness. We might remember how automatic processing helps us get through life by picturing our minds as functioning like big corporations. Our CEO—our controlled consciousness—attends to many of the most important, complex, and novel issues, while subordinates deal with routine affairs and matters requiring instant action. This delegation of resources enables us to react to many situations quickly and efficiently. The bottom line: Our brain knows much more than it tells us.

THE LIMITS OF INTUITION

We have seen how automatic, intuitive thinking can “make us smart” (Gigerenzer, 2007). Elizabeth Loftus and Mark Klinger (1992) nevertheless speak for other cognitive scientists in having doubts about the brilliance of intuition.
They report “a general consensus that the unconscious may not be as smart as previously believed.” For example, although subliminal stimuli can trigger a weak, fleeting response—enough to evoke a feeling if not conscious awareness—there is no evidence that commercial subliminal tapes can “reprogram your unconscious mind” for success. In fact, a significant body of evidence indicates that they can’t (Greenwald, 1992).

Social psychologists have explored not only our error-prone hindsight judgments but also our capacity for illusion—for perceptual misinterpretations, fantasies, and constructed beliefs. Michael Gazzaniga (1992, 1998, 2008) reports that patients whose brain hemispheres have been surgically separated will instantly fabricate—and believe—explanations of their own puzzling behaviors. If the patient gets up and takes a few steps after the experimenter flashes the instruction “walk” to the patient’s nonverbal right hemisphere, the verbal left hemisphere will instantly provide the patient with a plausible explanation (“I felt like getting a drink”).

Illusory thinking also appears in the vast new literature on how we take in, store, and retrieve social information. As perception researchers study visual illusions for what they reveal about our normal perceptual mechanisms, social psychologists study illusory thinking for what it reveals about normal information processing. These researchers want to give us a map of everyday social thinking, with the hazards clearly marked.

As we examine some of these efficient thinking patterns, remember this: Demonstrations of how people create counterfeit beliefs do not prove that all beliefs are counterfeit (though, to recognize counterfeiting, it helps to know how it’s done).

Overconfidence

So far we have seen that our cognitive systems process a vast amount of information efficiently and automatically.
But our efficiency has a trade-off; as we inter- pret our experiences and construct memories, our automatic intuitions sometimes err. Usually, we are unaware of our flaws. The “intellectual conceit” evident in
judgments of past knowledge (“I knew it all along”) extends to estimates of current knowledge and predictions of future behavior. We know we’ve messed up in the past. But we have more positive expectations for our future performance in meeting deadlines, managing relationships, following an exercise routine, and so forth (Ross & Newby-Clark, 1998).

[Doonesbury cartoon © 2000 G. B. Trudeau. Reprinted with permission of Universal Press Syndicate. All rights reserved.]

overconfidence phenomenon: The tendency to be more confident than correct—to overestimate the accuracy of one’s beliefs.

To explore this overconfidence phenomenon, Daniel Kahneman and Amos Tversky (1979) gave people factual statements and asked them to fill in the blanks, as in the following sentence: “I feel 98 percent certain that the air distance between New Delhi and Beijing is more than ______ miles but less than ______ miles.” Most individuals were overconfident: About 30 percent of the time, the correct answers lay outside the range they felt 98 percent confident about. (The air distance between New Delhi and Beijing is 2,500 miles.)

To find out whether overconfidence extends to social judgments, David Dunning and his associates (1990) created a little game show. They asked Stanford University students to guess a stranger’s answers to a series of questions, such as “Would you prepare for a difficult exam alone or with others?” and “Would you rate your lecture notes as neat or messy?” Knowing the type of question but not the actual questions, the participants first interviewed their target person about background, hobbies, academic interests, aspirations, astrological sign—anything they thought might be helpful. Then, while the targets privately answered 20 of the two-choice questions, the interviewers predicted their target’s answers and rated their own confidence in the predictions.
The interviewers guessed right 63 percent of the time, beating chance by 13 percent. But, on average, they felt 75 percent sure of their predictions. When guessing their own roommates’ responses, they were 68 percent correct and 78 percent confident. Moreover, the most confident people were most likely to be overconfident. People also are markedly overconfident when judging whether someone is telling the truth or when estimating things such as the sexual history of their dating partner or the activity preferences of their roommates (DePaulo & others, 1997; Swann & Gill, 1997).

Ironically, incompetence feeds overconfidence. It takes competence to recognize what competence is, note Justin Kruger and David Dunning (1999). Students who score at the bottom on tests of grammar, humor, and logic are most prone to overestimating their gifts at such. Those who don’t know what good logic or grammar is are often unaware that they lack it. If you make a list of all the words you can form out of the letters in “psychology,” you may feel brilliant—but then stupid when
a friend starts naming the ones you missed. Deanna Caputo and Dunning (2005) recreated this phenomenon in experiments, confirming that our ignorance of our ignorance sustains our self-confidence. Follow-up studies indicate that this “ignorance of one’s incompetence” occurs mostly on relatively easy-seeming tasks, such as forming words out of “psychology.” On really hard tasks, poor performers more often appreciate their lack of skill (Burson & others, 2006).

“The wise know too well their weakness to assume infallibility; and he who knows most, knows best how little he knows.” —THOMAS JEFFERSON, WRITINGS

Regarding the atomic bomb: “That is the biggest fool thing we have ever done. The bomb will never go off, and I speak as an expert in explosives.” —ADMIRAL WILLIAM LEAHY TO PRESIDENT TRUMAN, 1945

Ignorance of one’s incompetence helps explain David Dunning’s (2005) startling conclusion from employee assessment studies that “what others see in us . . . tends to be more highly correlated with objective outcomes than what we see in ourselves.” In one study, participants watched someone walk into a room, sit, read a weather report, and walk out (Borkenau & Liebler, 1993). Based on nothing more than that, their estimate of the person’s intelligence correlated with the person’s intelligence score about as well as did the person’s own self-estimate (.30 vs. .32)! If ignorance can beget false confidence, then—yikes!—where, we may ask, are you and I unknowingly deficient?

In Chapter 2 we noted that people overestimate their long-term emotional responses to good and bad happenings. Are people better at predicting their own behavior? To find out, Robert Vallone and his colleagues (1990) had college students predict in September whether they would drop a course, declare a major, elect to live off campus next year, and so forth.
Although the students felt, on average, 84 percent sure of those self-predictions, they were wrong nearly twice as often as they expected to be. Even when feeling 100 percent sure of their predictions, they erred 15 percent of the time.

In estimating their chances for success on a task, such as a major exam, people’s confidence runs highest when the moment of truth is off in the future. By exam day, the possibility of failure looms larger and confidence typically drops (Gilovich & others, 1993; Shepperd & others, 2005). Roger Buehler and his colleagues (1994, 2002, 2003, 2005) report that most students also confidently underestimate how long it will take them to complete papers and other major assignments. They are not alone:

• The “planning fallacy.” How much free time do you have today? How much free time do you expect you will have a month from today? Most of us overestimate how much we’ll be getting done, and therefore how much free time we will have (Zauberman & Lynch, 2005). Professional planners, too, routinely underestimate the time and expense of projects. In 1969, Montreal Mayor Jean Drapeau proudly announced that a $120 million stadium with a retractable roof would be built for the 1976 Olympics. The roof was completed in 1989 and cost $120 million by itself. In 1985, officials estimated that Boston’s “Big Dig” highway project would cost $2.6 billion and take until 1998. The cost ballooned to $14.6 billion and the project took until 2006.

• Stockbroker overconfidence. Investment experts market their services with the confident presumption that they can beat the stock market average, forgetting that for every stockbroker or buyer saying “Sell!” at a given price, there is another saying “Buy!” A stock’s price is the balance point between those mutually confident judgments.
Thus, incredible as it may seem, economist Burton Malkiel (2007) reports that mutual fund portfolios selected by investment analysts have not outperformed randomly selected stocks.

• Political overconfidence. Overconfident decision makers can wreak havoc. It was a confident Adolf Hitler who from 1939 to 1945 waged war against the rest of Europe. It was a confident Lyndon Johnson who in the 1960s invested U.S. weapons and soldiers in the effort to salvage democracy in South Vietnam. It was a confident Saddam Hussein who in 1990 marched his army into Kuwait and in 2003 promised to defeat invading armies. It was a confident George W. Bush who proclaimed that peaceful democracy would soon prevail in a liberated and thriving Iraq, with its alleged weapons of mass destruction newly destroyed.
What produces overconfidence? Why doesn’t experience lead us to a more realistic self-appraisal? For one thing, people tend to recall their mistaken judgments as times when they were almost right. Philip Tetlock (1998, 1999, 2005) observed this after inviting various academic and government experts to project—from their viewpoint in the late 1980s—the future governance of the Soviet Union, South Africa, and Canada. Five years later communism had collapsed, South Africa had become a multiracial democracy, and Canada’s French-speaking minority had not seceded. Experts who had felt more than 80 percent confident were right in predicting these turns of events less than 40 percent of the time. Yet, reflecting on their judgments, those who erred believed they were still basically right. I was “almost right,” said many. “The hardliners almost succeeded in their coup attempt against Gorbachev.” “The Quebecois separatists almost won the secessionist referendum.” “But for the coincidence of de Klerk and Mandela, there would have been a much bloodier transition to black majority rule in South Africa.” The Iraq war was a good idea, just badly executed, excused many of those who had supported it. Among political experts—and also stock market forecasters, mental health workers, and sports prognosticators—overconfidence is hard to dislodge.

[Photo: President George W. Bush after the American invasion of Iraq. Overconfidence, as exhibited by presidents and prime ministers who have committed troops to failed wars, underlies many blunders.]

“When you know a thing, to hold that you know it; and when you do not know a thing, to allow that you do not know it; this is knowledge.” —CONFUCIUS, ANALECTS

CONFIRMATION BIAS

confirmation bias: A tendency to search for information that confirms one’s preconceptions.

People also tend not to seek information that might disprove what they believe. P. C. Wason (1960) demonstrated this, as you can, by giving participants a sequence of three numbers—2, 4, 6—that conformed to a rule he had in mind. (The rule was simply any three ascending numbers.) To enable the participants to discover the rule, Wason invited each person to generate additional sets of three numbers. Each time, Wason told the person whether or not the set conformed to his rule. As soon as participants were sure they had discovered the rule, they were to stop and announce it.

The result? Seldom right but never in doubt: 23 of the 29 participants convinced themselves of a wrong rule. They typically formed some erroneous belief about the rule (for example, counting by twos) and then searched for confirming evidence (for example, by testing 8, 10, 12) rather than attempting to disconfirm their hunches. We are eager to verify our beliefs but less inclined to seek evidence that might disprove them, a phenomenon called the confirmation bias.

Confirmation bias helps explain why our self-images are so remarkably stable. In experiments at the University of Texas at Austin, William Swann and Stephen Read (1981; Swann & others, 1992a, 1992b, 2007) discovered that students seek, elicit, and recall feedback that confirms their beliefs about themselves. People seek as friends and spouses those who bolster their own self-views—even if they think poorly of themselves (Swann & others, 1991, 2003).

Swann and Read (1981) liken this self-verification to how someone with a domineering self-image might behave at a party. Upon arriving, the person seeks those guests whom she knows will acknowledge her dominance. In conversation she then
presents her views in ways that elicit the respect she expects. After the party, she has trouble recalling conversations in which her influence was minimal and more easily recalls her persuasiveness in the conversations that she dominated. Thus, her experience at the party confirms her self-image.

heuristic: A thinking strategy that enables quick, efficient judgments.

REMEDIES FOR OVERCONFIDENCE

What lessons can we draw from research on overconfidence? One lesson is to be wary of other people’s dogmatic statements. Even when people are sure they are right, they may be wrong. Confidence and competence need not coincide.

Three techniques have successfully reduced the overconfidence bias. One is prompt feedback (Lichtenstein & Fischhoff, 1980). In everyday life, weather forecasters and those who set the odds in horse racing both receive clear, daily feedback. And experts in both groups do quite well at estimating their probable accuracy (Fischhoff, 1982).

To reduce “planning fallacy” overconfidence, people can be asked to unpack a task—to break it down into its subcomponents—and estimate the time required for each. Justin Kruger and Matt Evans (2004) report that doing so leads to more realistic estimates of completion time.

When people think about why an idea might be true, it begins to seem true (Koehler, 1991). Thus, a third way to reduce overconfidence is to get people to think of one good reason why their judgments might be wrong; that is, force them to consider disconfirming information (Koriat & others, 1980). Managers might foster more realistic judgments by insisting that all proposals and recommendations include reasons why they might not work.

Still, we should be careful not to undermine people’s reasonable self-confidence or to destroy their decisiveness. In times when their wisdom is needed, those lacking self-confidence may shrink from speaking up or making tough decisions.
Overconfidence can cost us, but realistic self-confidence is adaptive.

Heuristics: Mental Shortcuts

With precious little time to process so much information, our cognitive system is fast and frugal. It specializes in mental shortcuts. With remarkable ease, we form impressions, make judgments, and invent explanations. We do so by using heuristics—simple, efficient thinking strategies. Heuristics enable us to live and make routine decisions with minimal effort (Shah & Oppenheimer, 2008). In most situations, our snap generalizations—“That’s dangerous!”—are adaptive. The speed of these intuitive guides promotes our survival. The biological purpose of thinking is less to make us right than to keep us alive. In some situations, however, haste makes error.

THE REPRESENTATIVENESS HEURISTIC

University of Oregon students were told that a panel of psychologists interviewed a sample of 30 engineers and 70 lawyers and summarized their impressions in thumbnail descriptions. The following description, they were told, was drawn at random from the sample of 30 engineers and 70 lawyers:

Twice divorced, Frank spends most of his free time hanging around the country club. His clubhouse bar conversations often center around his regrets at having tried to follow his esteemed father’s footsteps. The long hours he had spent at academic drudgery would have been better invested in learning how to be less quarrelsome in his relations with other people.

Question: What is the probability that Frank is a lawyer rather than an engineer?

Asked to guess Frank’s occupation, more than 80 percent of the students surmised he was one of the lawyers (Fischhoff & Bar-Hillel, 1984). Fair enough. But how do
you suppose those estimates changed when the sample description was given to another group of students, modified to say that 70 percent were engineers? Not in the slightest. The students took no account of the base rate of engineers and lawyers; in their minds Frank was more representative of lawyers, and that was all that seemed to matter.

representativeness heuristic: The tendency to presume, sometimes despite contrary odds, that someone or something belongs to a particular group if resembling (representing) a typical member.

availability heuristic: A cognitive rule that judges the likelihood of things in terms of their availability in memory. If instances of something come readily to mind, we presume it to be commonplace.

To judge something by intuitively comparing it to our mental representation of a category is to use the representativeness heuristic. Representativeness (typicalness) usually is a reasonable guide to reality. But, as we saw with “Frank” above, it doesn’t always work. Consider Linda, who is 31, single, outspoken, and very bright. She majored in philosophy in college. As a student she was deeply concerned with discrimination and other social issues, and she participated in antinuclear demonstrations. Based on that description, would you say it is more likely that

a. Linda is a bank teller.

b. Linda is a bank teller and active in the feminist movement.

Most people think b is more likely, partly because Linda better represents their image of feminists (Mellers & others, 2001). But ask yourself: Is there a better chance that Linda is both a bank teller and a feminist than that she’s a bank teller (whether feminist or not)? As Amos Tversky and Daniel Kahneman (1983) reminded us, the conjunction of two events cannot be more likely than either one of the events alone.

THE AVAILABILITY HEURISTIC

Consider the following: Do more people live in Iraq or in Tanzania? (See page 96.)
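Returning for a moment to the Linda problem: why the popular answer must be wrong can be shown in one line of elementary probability. (The 0.95 figure below is an illustrative assumption, not a research finding.)

```latex
% Conjunction rule: a feminist bank teller is, in particular, a bank teller,
% so the event (teller AND feminist) is contained in the event (teller).
\[
  P(\text{teller} \wedge \text{feminist})
    = P(\text{teller}) \, P(\text{feminist} \mid \text{teller})
    \le P(\text{teller}),
\]
% because no conditional probability can exceed 1. Even if we assume that
% Linda, were she a teller, is almost surely also a feminist, say
% P(feminist | teller) = 0.95, the conjunction is still only
% 0.95 * P(teller), which is less than P(teller) itself.
```

However representative of feminists Linda may seem, option b can at best tie, and can never beat, option a.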
You probably answered according to how readily Iraqis and Tanzanians come to mind. If examples are readily available in our memory—as Iraqis tend to be—then we presume that other such examples are commonplace. Usually this is true, so we are often well served by this cognitive rule, called the availability heuristic (Table 3.1). Said simply, the more easily we recall something, the more likely it seems.

But sometimes the rule deludes us. If people hear a list of famous people of one sex (Jennifer Lopez, Venus Williams, Hillary Clinton) intermixed with an equal-size list of unfamous people of the other sex (Donald Scarr, William Wood, Mel Jasper), the famous names will later be more cognitively available. Most people will subsequently recall having heard more (in this instance) women’s names (McKelvie, 1995, 1997; Tversky & Kahneman, 1973). Vivid, easy-to-imagine events, such as shark attacks or diseases with easy-to-picture symptoms, may likewise seem more likely to occur than harder-to-picture events (MacLeod & Campbell, 1992; Sherman & others, 1985).

TABLE 3.1 Fast and Frugal Heuristics

Representativeness
  Definition: Snap judgments of whether someone or something fits a category
  Example: Deciding that Carlos is a librarian rather than a trucker because he better represents one’s image of librarians
  But may lead to: Discounting other important information

Availability
  Definition: Quick judgments of the likelihood of events (how available in memory)
  Example: Estimating teen violence after school shootings
  But may lead to: Overweighting vivid instances and thus, for example, fearing the wrong things
“Most people reason dramatically, not quantitatively.” —JURIST OLIVER WENDELL HOLMES, JR., 1841–1935

Answer to the question on page 95: Tanzania’s 40 million people greatly outnumber Iraq’s 28 million. Most people, having more vivid images of Iraqis, guess wrong.

Even fictional happenings in novels, television, and movies leave images that later penetrate our judgments (Gerrig & Prentice, 1991; Green & others, 2002; Mar & Oatley, 2008). The more absorbed and “transported” the reader (“I could easily picture the events”), the more the story affects the reader’s later beliefs (Diekman & others, 2000). Readers who are captivated by romance novels, for example, may gain readily available sexual scripts that influence their own sexual attitudes and behaviors.

Our use of the availability heuristic highlights a basic principle of social thinking: People are slow to deduce particular instances from a general truth, but they are remarkably quick to infer general truth from a vivid instance. No wonder that after hearing and reading stories of rapes, robberies, and beatings, 9 out of 10 Canadians overestimated—usually by a considerable margin—the percentage of crimes that involved violence (Doob & Roberts, 1988). And no wonder that South Africans, after a series of headline-grabbing gangland robberies and slayings, estimated that violent crime had almost doubled between 1998 and 2004, when actually it had decreased substantially (Wines, 2005).
We fear terrorism, but are indifferent to global climate change—"Armageddon in slow motion." In short, we worry about remote possibilities while ignoring higher probabilities, a phenomenon that Cass Sunstein (2007b) calls our "probability neglect."

Because news footage of airplane crashes is a readily available memory for most of us—especially since September 11, 2001—we often suppose we are more at risk traveling in commercial airplanes than in cars. Actually, from 2003 to 2005, U.S. travelers were 230 times more likely to die in a car crash than on a commercial flight covering the same distance (National Safety Council, 2008). In 2006, reports the Flight Safety Foundation, there was one airliner accident for every 4.2 million flights by Western-built commercial jets (Wald, 2008). For most air travelers, the most dangerous part of the journey is the drive to the airport.

Shortly after 9/11, as many people abandoned air travel and took to the roads, I estimated that if Americans flew 20 percent less and instead drove those unflown miles, we could expect an additional 800 traffic deaths in the ensuing year (Myers, 2001). It took a curious German researcher (why didn't I think of this?) to check that prediction against accident data, which confirmed an excess of some 350 deaths in the last three months of 2001 compared with the three-month average in the preceding five years (Gigerenzer, 2004). The 9/11 terrorists appear to have killed more people unnoticed—on America's roads—than they did with the 266 fatalities on those four planes.

By now it is clear that our naive statistical intuitions, and our resulting fears, are driven not by calculation and reason but by emotions attuned to the availability heuristic. After this book is published, there likely will be another dramatic natural or terrorist event, which will again propel our fears, vigilance, and resources in a new direction.
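The arithmetic behind such a back-of-envelope estimate is simple to sketch. Every input number below is an illustrative assumption chosen only to show the shape of the calculation; these are not the figures Myers (2001) or Gigerenzer (2004) used:

```python
# Back-of-envelope: what happens when travel miles shift from flying to driving?
# ALL numbers here are illustrative assumptions for demonstration only.

annual_air_miles = 50e9        # assumed passenger-miles flown per year
fraction_shifted = 0.20        # assume 20% of flying is replaced by driving
deaths_per_car_mile = 1.5e-8   # illustrative road fatality rate (per mile)
relative_risk = 230            # the chapter's car-vs-plane per-mile risk ratio
deaths_per_air_mile = deaths_per_car_mile / relative_risk

shifted_miles = annual_air_miles * fraction_shifted
extra_road_deaths = shifted_miles * deaths_per_car_mile
avoided_air_deaths = shifted_miles * deaths_per_air_mile

# Because per-mile road risk dwarfs per-mile air risk, nearly every
# shifted mile adds net risk.
print(round(extra_road_deaths - avoided_air_deaths))  # prints 149
```

With these made-up inputs the shift costs roughly 150 lives a year. The point is not the particular number but that the net effect is dominated by the road-risk term, which is why substituting driving for flying reliably raises total deaths.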
Terrorists, aided by the media, may again achieve their objective of capturing our attention, draining our resources, and distracting us from the mundane, undramatic, insidious risks that, over time, devastate lives, such as the rotavirus that each day claims the equivalent of four 747s filled with children (Parashar & others, 2006). But then again, dramatic events can also serve to awaken us to real risks. That, say some scientists, is what happened when hurricanes Katrina and Rita in 2005 began to raise concern that global warming, by raising sea levels and spawning extreme weather, is destined to become nature's own weapon of mass destruction.

Social Beliefs and Judgments Chapter 3 97

[Caption: Vivid, memorable—and therefore cognitively available—events influence our perception of the social world. The resulting "probability neglect" often leads people to fear the wrong things, such as fearing flying or terrorism more than smoking, driving, or climate change. If four jumbo jets filled with children crashed every day—approximating the number of childhood diarrhea deaths resulting from the rotavirus—something would have been done about it. Illustration by Dave Bohn.]

Counterfactual Thinking

"Testimonials may be more compelling than mountains of facts and figures (as mountains of facts and figures in social psychology so compellingly demonstrate)." —MARK SNYDER (1988)

counterfactual thinking: Imagining alternative scenarios and outcomes that might have happened, but didn't.

Easily imagined (cognitively available) events also influence our experiences of guilt, regret, frustration, and relief. If our team loses (or wins) a big game by one point, we can easily imagine how the game might have gone the other way, and thus we feel regret (or relief). Imagining worse alternatives helps us feel better. Imagining better alternatives, and pondering what we might do differently next time, helps us prepare to do better in the future (Epstude & Roese, 2008).

In Olympic competition, athletes' emotions after an event reflect mostly how they did relative to expectations, but also their counterfactual thinking—their mentally simulating what might have been (McGraw & others, 2005; Medvec & others, 1995). Bronze medalists (for whom an easily imagined alternative was finishing without a medal) exhibited more joy than silver medalists (who could more easily imagine having won the gold). On the medal stand, it has been said, happiness is as simple as 1-3-2. Similarly, the higher a student's score within a grade category (such as B+), the worse they feel (Medvec & Savitsky, 1997).
The B+ student who misses an A− by a point feels worse than the B+ student who actually did worse and just made a B+ by a point. Such counterfactual thinking occurs when we can easily picture an alternative outcome (Kahneman & Miller, 1986; Markman & McMullen, 2003):

• If we barely miss a plane or a bus, we imagine making it if only we had left at our usual time, taken our usual route, not paused to talk. If we miss our connection by a half hour or after taking our usual route, it's harder to simulate a different outcome, so we feel less frustration.

• If we change an exam answer, then get it wrong, we will inevitably think "If only . . ." and will vow next time to trust our immediate intuition—although, contrary to student lore, answer changes are more often from incorrect to correct (Kruger & others, 2005).

• The team or the political candidate that barely loses will simulate over and over how they could have won (Sanna & others, 2003).

Counterfactual thinking underlies our feelings of luck. When we have barely escaped a bad event—avoiding defeat with a last-minute goal or standing nearest a falling icicle—we easily imagine a negative counterfactual (losing, being hit) and therefore feel "good luck" (Teigen & others, 1999). "Bad luck" refers to bad events that did happen but easily might not have.

The more significant the event, the more intense the counterfactual thinking (Roese & Hur, 1997). Bereaved people who have lost a spouse or a child in a vehicle accident, or a child to sudden infant death syndrome, commonly report replaying and undoing the event (Davis & others, 1995, 1996). One friend of mine survived a head-on collision with a drunk driver that killed his wife, daughter, and mother. He recalled, "For months I turned the events of that day over and over in my mind. I kept reliving the day, changing the order of events so that the accident wouldn't occur" (Sittser, 1994).

Across Asian and Western cultures most people, however, live with less regret over things done than over things they failed to do, such as, "I wish I had been more serious in college" or "I should have told my father I loved him before he died" (Gilovich & Medvec, 1994; Rajagopal & others, 2006). In one survey of adults, the most common regret was not taking their education more seriously (Kinnier & Metha, 1989). Would we live with less regret if we dared more often to reach beyond our comfort zone—to venture out, risking failure, but at least having tried?

[Caption: Counterfactual thinking: When Deal or No Deal game show contestants deal too late (walking away with a lower amount than they were previously offered) or too early (foregoing their next choice, which would have led to more money), they likely experience counterfactual thinking—imagining what might have been.]

People are more often apologetic about actions than inactions (Zeelenberg & others, 1998).

Illusory Thinking
Another influence on everyday thinking is our search for order in random events, a tendency that can lead us down all sorts of wrong paths.

ILLUSORY CORRELATION

illusory correlation: Perception of a relationship where none exists, or perception of a stronger relationship than actually exists.

It's easy to see a correlation where none exists. When we expect to find significant relationships, we easily associate random events, perceiving an illusory correlation. William Ward and Herbert Jenkins (1965) showed people the results of a hypothetical 50-day cloud-seeding experiment. They told participants which of the 50 days the clouds had been seeded and which days it rained. That information was nothing more than a random mix of results: Sometimes it rained after seeding; sometimes it didn't. Participants nevertheless became convinced—in conformity with their ideas about the effects of cloud seeding—that they really had observed a relationship between cloud seeding and rain.

Other experiments confirm that people easily misperceive random events as confirming their beliefs (Crocker, 1981; Jennings & others, 1982; Trolier & Hamilton, 1986). If we believe a correlation exists, we are more likely to notice and recall confirming instances. If we believe that premonitions correlate with events, we notice and remember the joint occurrence of the premonition and the event's later occurrence. If we believe that overweight women are unhappier, we perceive that we have witnessed such a correlation even when we have not (Viken & others, 2005). We seldom notice or remember all the times unusual events do not coincide. If, after

we think about a friend, the friend calls us, we notice and remember that coincidence. We don't notice all the times we think of a friend without any ensuing call or receive a call from a friend about whom we've not been thinking.

ILLUSION OF CONTROL

illusion of control: Perception of uncontrollable events as subject to one's control or as more controllable than they are.

regression toward the average: The statistical tendency for extreme scores or extreme behavior to return toward one's average.

Our tendency to perceive random events as related feeds an illusion of control—the idea that chance events are subject to our influence. This keeps gamblers going and makes the rest of us do all sorts of unlikely things.

GAMBLING  Ellen Langer (1977) demonstrated the illusion of control with experiments on gambling. Compared with those given an assigned lottery number, people who chose their own number demanded four times as much money when asked if they would sell their ticket. When playing a game of chance against an awkward and nervous person, they bet significantly more than when playing against a dapper, confident opponent. Being the person who throws the dice or spins the wheel increases people's confidence (Wohl & Enzle, 2002). In these and other ways, more than 50 experiments have consistently found people acting as if they can predict or control chance events (Presson & Benassi, 1996; Thompson & others, 1998).

Observations of real-life gamblers confirm these experimental findings. Dice players may throw softly for low numbers and hard for high numbers (Henslin, 1967). The gambling industry thrives on gamblers' illusions. Gamblers attribute wins to their skill and foresight. Losses become "near misses" or "flukes," or for the sports gambler, a bad call by the referee or a freakish bounce of the ball (Gilovich & Douglas, 1986).
Stock traders also like the "feeling of empowerment" that comes from being able to choose and control their own stock trades, as if their being in control can enable them to outperform the market average. One ad declared that online investing "is about control." Alas, the illusion of control breeds overconfidence and frequent losses after stock market trading costs are subtracted (Barber & Odean, 2001).

REGRESSION TOWARD THE AVERAGE

Tversky and Kahneman (1974) noted another way by which an illusion of control may arise: We fail to recognize the statistical phenomenon of regression toward the average. Because exam scores fluctuate partly by chance, most students who get extremely high scores on an exam will get lower scores on the next exam. If their first score is at the ceiling, their second score is more likely to fall back ("regress") toward their own average than to push the ceiling even higher. That is why a student who does consistently good work, even if never the best, will sometimes end a course at the top of the class. Conversely, the lowest-scoring students on the first exam are likely to improve. If those who scored lowest go for tutoring after the first exam, the tutors are likely to feel effective when the student improves, even if the tutoring had no effect.

Indeed, when things reach a low point, we will try anything, and whatever we try—going to a psychotherapist, starting a new diet-exercise plan, reading a self-help book—is more likely to be followed by improvement than by further deterioration. Sometimes we recognize that events are not likely to continue at an unusually

good or bad extreme. Experience has taught us that when everything is going great, something will go wrong, and that when life is dealing us terrible blows, we can usually look forward to things getting better. Often, though, we fail to recognize this regression effect. We puzzle at why baseball's rookie of the year often has a more ordinary second year—did he become overconfident? Self-conscious? We forget that exceptional performance tends to regress toward normality.

[Caption: Regression to the average. When we are at an extremely low point, anything we try will often seem effective. "Maybe a yoga class will improve my life." Events seldom continue at an abnormal low.]

By simulating the consequences of using praise and punishment, Paul Schaffner (1985) showed how the illusion of control might infiltrate human relations. He invited Bowdoin College students to train an imaginary fourth-grade boy, "Harold," to come to school by 8:30 each morning. For each school day of a three-week period, a computer displayed Harold's arrival time, which was always between 8:20 and 8:40. The students would then select a response to Harold, ranging from strong praise to strong reprimand. As you might expect, they usually praised Harold when he arrived before 8:30 and reprimanded him when he arrived after 8:30. Because Schaffner had programmed the computer to display a random sequence of arrival times, Harold's arrival time tended to improve (to regress toward 8:30) after he was reprimanded. For example, if Harold arrived at 8:39, he was almost sure to be reprimanded, and his randomly selected next-day arrival time was likely to be earlier than 8:39. Thus, even though their reprimands were having no effect, most students ended the experiment believing that their reprimands had been effective.
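Schaffner's setup is easy to reproduce in a few lines of code. This is a minimal sketch, not his original program; only the arrival-time range (8:20 to 8:40) and the 8:30 reprimand threshold follow the description above, and the number of simulated days is inflated to make the averages stable:

```python
import random

random.seed(42)  # reproducible illustration

# Harold's arrival times are pure chance: uniform between 8:20 and 8:40,
# represented here as minutes after 8:00. Reprimands have no causal effect.
days = 10000
arrivals = [random.uniform(20, 40) for _ in range(days)]

# Arrival times on "reprimanded" days (after 8:30), and on the day after each.
reprimanded = [t for t in arrivals[:-1] if t > 30]
next_after_reprimand = [arrivals[i + 1]
                        for i in range(days - 1) if arrivals[i] > 30]

mean_reprimanded = sum(reprimanded) / len(reprimanded)                  # ~35
mean_next_day = sum(next_after_reprimand) / len(next_after_reprimand)   # ~30

print("Mean arrival (minutes after 8:00) on reprimanded days:", round(mean_reprimanded, 1))
print("Mean arrival the following day:", round(mean_next_day, 1))
```

Because each day's arrival is an independent draw, the day after a late (reprimanded) day averages about 8:30, roughly five minutes "better," even though the reprimand did nothing. The apparent improvement is regression toward the average.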
Schaffner's experiment demonstrates Tversky and Kahneman's provocative conclusion: Nature operates in such a way that we often feel punished for rewarding others and rewarded for punishing them. In actuality, as every student of psychology knows, positive reinforcement for doing things right is usually more effective and has fewer negative side effects.

Moods and Judgments

Social judgment involves efficient, though fallible, information processing. It also involves our feelings: Our moods infuse our judgments. We are not cool computing machines; we are emotional creatures. The extent to which feeling infuses cognition appears in new studies comparing happy and sad individuals (Myers, 1993, 2000b). Unhappy people—especially those bereaved or depressed—tend to be more self-focused and brooding. A depressed mood motivates intense thinking—a search for information that makes one's environment more understandable and controllable (Weary & Edwards, 1994).

Happy people, by contrast, are more trusting, more loving, more responsive. If people are made temporarily happy by receiving a small gift while mall-shopping, they will report, a few moments later on an unrelated survey, that their cars and TV sets are working beautifully—better, if you took their word for it, than those belonging to folks who replied after not receiving gifts.

Moods pervade our thinking. To West Germans enjoying their team's World Cup soccer victory (Schwarz & others, 1987) and to Australians emerging from

a heartwarming movie (Forgas & Moylan, 1987), people seem good-hearted, life seems wonderful. After (but not before) a 1990 football game between rivals Alabama and Auburn, victorious Alabama fans deemed war less likely and potentially devastating than did the gloomier Auburn fans (Schweitzer & others, 1992). When we are in a happy mood, the world seems friendlier, decisions are easier, good news more readily comes to mind (DeSteno & others, 2000; Isen & Means, 1983; Stone & Glass, 1986).

FIGURE :: 3.3 A temporary good or bad mood strongly influenced people's ratings of their videotaped behavior. Those in a bad mood detected far fewer positive behaviors. Source: Forgas & others, 1984.
[Bar graph: percent of positive and negative behaviors detected by people put in a good mood versus people put in a bad mood.]

Let a mood turn gloomy, however, and thoughts switch onto a different track. Off come the rose-colored glasses; on come the dark glasses. Now the bad mood primes our recollections of negative events (Bower, 1987; Johnson & Magaro, 1987). Our relationships seem to sour. Our self-images take a dive. Our hopes for the future dim. And other people's behavior seems more sinister (Brown & Taylor, 1986; Mayer & Salovey, 1987).

University of New South Wales social psychologist Joseph Forgas (1999, 2008) had often been struck by how moody people's "memories and judgments change with the color of their mood." To understand this "mood infusion" he began to experiment. Imagine yourself in one such study. Using hypnosis, Forgas and his colleagues (1984) put you in a good or a bad mood and then have you watch a videotape (made the day before) of yourself talking with someone. If made to feel happy, you feel pleased with what you see, and you are able to detect many instances of your poise, interest, and social skill.
If you've been put in a bad mood, viewing the same tape seems to reveal a quite different you—one who is stiff, nervous, and inarticulate (Figure 3.3). Given how your mood colors your judgments, you feel relieved at how things brighten when the experimenter switches you to a happy mood before leaving the experiment. Curiously, note Michael Ross and Garth Fletcher (1985), we don't attribute our changing perceptions to our mood shifts. Rather, the world really seems different.

Our moods color how we judge our worlds partly by bringing to mind past experiences associated with the mood. When we are in a bad mood, we have more depressing thoughts. Mood-related thoughts may distract us from complex thinking about something else. Thus, when emotionally aroused—when angry or even in a very good mood—we become more likely to make snap judgments and evaluate others based on stereotypes (Bodenhausen & others, 1994; Paulhus & Lim, 1994).

Summing Up: Judging Our Social World

• We have an enormous capacity for automatic, efficient, intuitive thinking. Our cognitive efficiency, though generally adaptive, comes at the price of occasional error. Since we are generally unaware of those errors entering our thinking, it is useful to identify ways in which we form and sustain false beliefs.

• First, we often overestimate our judgments. This overconfidence phenomenon stems partly from the much greater ease with which we can imagine why we might be right than why we might be wrong. Moreover, people are much more likely to search for information that can confirm their beliefs than for information that can disconfirm them.

• Second, when given compelling anecdotes or even useless information, we often ignore useful base-rate information. This is partly due to the later ease of recall of vivid information (the availability heuristic).

• Third, we are often swayed by illusions of correlation and personal control. It is tempting to perceive correlations where none exist (illusory correlation) and to think we can predict or control chance events (the illusion of control).

• Finally, moods infuse judgments. Good and bad moods trigger memories of experiences associated with those moods. Moods color our interpretations of current experiences. And by distracting us, moods can also influence how deeply or superficially we think when making judgments.

Explaining Our Social Worlds

People make it their business to explain other people, and social psychologists make it their business to explain people's explanations. So, how—and how accurately—do people explain others' behavior? Attribution theory suggests some answers.

Our judgments of people depend on how we explain their behavior. Depending on our explanation, we may judge killing as murder, manslaughter, self-defense, or heroism.
Depending on our explanation, we may view a homeless person as lacking initiative or as victimized by job and welfare cutbacks. Depending on our explanation, we may interpret someone's friendly behavior as genuine warmth or as ingratiation.

Attributing Causality: To the Person or the Situation

misattribution: Mistakenly attributing a behavior to the wrong source.

We endlessly analyze and discuss why things happen as they do, especially when we experience something negative or unexpected (Bohner & others, 1988; Weiner, 1985). If worker productivity declines, do we assume the workers are getting lazier? Or has their workplace become less efficient? Does a young boy who hits his classmates have a hostile personality? Or is he responding to relentless teasing?

Amy Holtzworth-Munroe and Neil Jacobson (1985, 1988) found that married people often analyze their partners' behaviors, especially their negative behaviors. Cold hostility, more than a warm hug, is likely to leave the partner wondering why? Spouses' answers correlate with marriage satisfaction. Unhappy couples usually offer distress-maintaining explanations for negative acts ("she was late because she doesn't care about me"). Happy couples more often externalize ("she was late because of heavy traffic"). With positive partner behavior, their explanations similarly work either to maintain distress ("he brought me flowers because he wants sex") or to enhance the relationship ("he brought me flowers to show he loves me") (Hewstone & Fincham, 1996; McNulty & others, 2008; Weiner, 1995).

Antonia Abbey (1987, 1991; Abbey & Ross, 1998) and her colleagues have repeatedly found that men are more likely than women to attribute a woman's friendliness to mild sexual interest. That misreading of warmth as a sexual come-on—an example of misattribution—can contribute to behavior that women regard as sexual harassment or even rape (Farris & others, 2008; Kolivas & Gross, 2007; Pryor &

others, 1997). Many men believe women are flattered by repeated requests for dates, which women more often view as harassment (Rotundo & others, 2001). Misattribution is particularly likely when men are in positions of power. A manager may misinterpret a subordinate woman's submissive or friendly behavior and, full of himself, see her in sexual terms (Bargh & Raymond, 1995).

Men more often than women think about sex (see Chapter 5). Men also are more likely than women to assume that others share their feelings (recall from Chapter 2 the "false consensus effect"). Thus, a man may greatly overestimate the sexual significance of a woman's courtesy smile (Levesque & others, 2006; Nelson & LeBoeuf, 2002).

[Caption: A misattribution? Date rape sometimes results from a man's misreading a woman's warmth as a sexual come-on.]

Such misattributions help explain the greater sexual assertiveness exhibited by men across the world and the greater tendency of men in various cultures, from Boston to Bombay, to justify rape by arguing that the victim consented or implied consent (Kanekar & Nazareth, 1988; Muehlenhard, 1988; Shotland, 1989). Women more often judge forcible sex as meriting conviction and a stiff sentence (Schutte & Hosch, 1997). Misattributions also help explain why the 23 percent of American women who say they have been forced into unwanted sexual behavior is eight times greater than the 3 percent of American men who say they have ever forced a woman into a sexual act (Laumann & others, 1994).

attribution theory: The theory of how people explain others' behavior—for example, by attributing it either to internal dispositions (enduring traits, motives, and attitudes) or to external situations.

Attribution theory analyzes how we explain people's behavior. The variations of attribution theory share some common assumptions. As Daniel Gilbert and Patrick Malone (1995) explain, each "construes the human skin as a special boundary that separates one set of 'causal forces' from another. On the sunny side of the epidermis are the external or situational forces that press inward upon the person, and on the meaty side are the internal or personal forces that exert pressure outward. Sometimes these forces press in conjunction, sometimes in opposition, and their dynamic interplay manifests itself as observable behavior."

[Caption: To what should we attribute a student's sleepiness? To lack of sleep? To boredom? Whether we make internal or external attributions depends on whether we notice her consistently sleeping in this and other classes, and on whether other students react as she does to this particular class.]

dispositional attribution: Attributing behavior to the person's disposition and traits.

situational attribution: Attributing behavior to the environment.

spontaneous trait inference: An effortless, automatic inference of a trait after exposure to someone's behavior.

Attribution theory pioneer Fritz Heider (1958) and others after him analyzed the "commonsense psychology" by which people explain everyday events. They concluded that when we observe someone acting intentionally, we sometimes attribute that person's behavior to internal causes (for example, the person's disposition) and sometimes to external causes (for example, something about the person's situation). A teacher may wonder whether a child's underachievement is due to lack of motivation and ability (a dispositional attribution) or to physical and social circumstances (a situational attribution). Also, some of us are more inclined to attribute behavior to stable personality; others tend more to attribute behavior to situations (Bastian & Haslam, 2006; Robins & others, 2004).

INFERRING TRAITS

Edward Jones and Keith Davis (1965) noted that we often infer that other people's actions are indicative of their intentions and dispositions. If I observe Rick making a sarcastic comment to Linda, I infer that Rick is a hostile person. Jones and Davis's "theory of correspondent inferences" specified the conditions under which people infer traits. For example, normal or expected behavior tells us less about the person than does unusual behavior.
If Samantha is sarcastic in a job interview, where a person would normally be pleasant, that tells us more about Samantha than if she is sarcastic with her siblings.

The ease with which we infer traits—a phenomenon called spontaneous trait inference—is remarkable. In experiments at New York University, James Uleman (1989; Uleman & others, 2008) gave students statements to remember, such as "The librarian carries the old woman's groceries across the street." The students would instantly, unintentionally, and unconsciously infer a trait. When later they were helped to recall the sentence, the most valuable clue word was not "books" (to cue librarian) or "bags" (to cue groceries) but "helpful"—the inferred trait that I suspect you, too, spontaneously attributed to the librarian. Given even just 1/10th of a second exposure to someone's face, people will spontaneously infer some personality traits (Willis & Todorov, 2006).

COMMONSENSE ATTRIBUTIONS

As the theory of correspondent inferences suggests, attributions often are rational. Pioneering attribution theorist Harold Kelley (1973) described how we explain behavior by using information about "consistency," "distinctiveness," and "consensus" (Figure 3.4).

FIGURE :: 3.4 Harold Kelley's Theory of Attributions. Three factors—consistency, distinctiveness, and consensus—influence whether we attribute someone's behavior to internal or external causes. Try creating your own examples, such as: If Mary and many others criticize Steve (with consensus), and if Mary isn't critical of others (high distinctiveness), then we make an external attribution (it's something about Steve). If Mary alone criticizes Steve, and if she criticizes many other people, too (low distinctiveness), then we are drawn to an internal attribution (it's something about Mary).
[Flowchart: Consistency: Does this person usually behave this way in this situation? (If yes, we seek an explanation.) Distinctiveness: Does this person behave differently in this situation than in others? Consensus: Do others behave similarly in this situation? High distinctiveness and high consensus point to an external attribution (to the person's situation); low distinctiveness and low consensus point to an internal attribution (to the person's disposition).]

Consistency: How consistent is the person's behavior in this situation?
Distinctiveness: How specific is the person's behavior to this particular situation?
Consensus: To what extent do others in this situation behave similarly?

When explaining why Edgar is having trouble with his computer, most people use information concerning consistency (Is Edgar usually unable to get his computer to work?), distinctiveness (Does Edgar have trouble with other computers, or only this one?), and consensus (Do other people have similar problems with this make of computer?). If we learn that Edgar alone consistently has trouble with this and other computers, we likely will attribute the troubles to Edgar, not to defects in this computer.

So our commonsense psychology often explains behavior logically.
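Kelley's covariation logic can be sketched as a small decision rule. This is a deliberate simplification of the three-factor scheme; real attributions blend the cues rather than following strict yes/no branches:

```python
def kelley_attribution(consistency: bool, distinctiveness: bool, consensus: bool) -> str:
    """A simplified covariation rule after Kelley (1973).

    consistency:     does the person usually behave this way in this situation?
    distinctiveness: does the person behave differently in other situations?
    consensus:       do other people behave similarly in this situation?
    """
    if not consistency:
        # One-off behavior is usually chalked up to passing circumstances.
        return "discount (a one-time fluke)"
    if distinctiveness and consensus:
        return "external attribution (something about the situation)"
    if not distinctiveness and not consensus:
        return "internal attribution (something about the person)"
    return "mixed (attribute to both person and situation)"

# Edgar alone (low consensus) consistently has trouble with this and
# other computers (low distinctiveness): we blame Edgar, not the machine.
print(kelley_attribution(consistency=True, distinctiveness=False, consensus=False))
```

Running the Edgar case prints the internal attribution; flipping both distinctiveness and consensus to True (everyone has trouble, and only with this computer) yields the external one, matching the pattern in Figure 3.4.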
But Kelley also found that people often discount a contributing cause of behavior if other plausible causes are already known. If we can specify one or two sufficient reasons a student might have done poorly on an exam, we often ignore or discount alternative possibilities (McClure, 1998). Or consider this: Would you guess that people would overestimate or underestimate the frequency of a very famous name, such as "Bush," in the American population? It surprised me—you, too?—to read Daniel Oppenheimer's (2004) discovery that people underestimate the frequency of hyperfamous names such as Bush relative to an equally common name such as Stevenson. They do so because their familiarity with the name can be attributed to President Bush, which leads them to discount other reasons for their familiarity with "Bush."

The Fundamental Attribution Error

Social psychology's most important lesson concerns the influence of our social environment. At any moment, our internal state, and therefore what we say and do, depends on the situation as well as on what we bring to the situation. In experiments, a slight difference between two situations sometimes greatly affects how people respond. As a professor, I have seen this when teaching the same subject at both 8:30 a.m. and 7:00 p.m. Silent stares would greet me at 8:30; at 7:00 I had to break up a party. In each situation some individuals were more talkative than others, but the difference between the two situations exceeded the individual differences.

Attribution researchers have found a common problem with our attributions. When explaining someone's behavior, we often underestimate the impact of the situation and overestimate the extent to which it reflects the individual's traits and attitudes. Thus, even knowing the effect of the time of day on classroom

conversation, I found it terribly tempting to assume that the people in the 7:00 p.m. class were more extraverted than the "silent types" who came at 8:30 a.m. Likewise, we may infer that people fall because they're clumsy, rather than because they were tripped; that people smile because they're happy rather than faking friendliness; that people speed past us on the highway because they're aggressive rather than late for an important meeting.

fundamental attribution error: The tendency for observers to underestimate situational influences and overestimate dispositional influences upon others' behavior. (Also called correspondence bias, because we so often see behavior as corresponding to a disposition.)

This discounting of the situation, dubbed by Lee Ross (1977) the fundamental attribution error, appears in many experiments. In the first such study, Edward Jones and Victor Harris (1967) had Duke University students read debaters' speeches supporting or attacking Cuba's leader, Fidel Castro. When told that the debater chose which position to take, the students logically enough assumed it reflected the person's own attitude. But what happened when the students were told that the debate coach had assigned the position? People who are merely feigning a position write more forceful statements than you'd expect (Allison & others, 1993; Miller & others, 1990).

FIGURE :: 3.5 The Fundamental Attribution Error. When people read a debate speech supporting or attacking Fidel Castro, they attributed corresponding attitudes to the speechwriter, even when the debate coach assigned the writer's position. Source: Data from Jones & Harris, 1967.
[Graph: attitude attributed to the speechwriter, from anti-Castro to pro-Castro, for pro-Castro and anti-Castro speeches, when the debater chose versus was assigned to give a Castro speech.]
Thus, even knowing that the debater had been told to take a pro- or anti-Castro position did not prevent students from inferring that the debater in fact had the assigned leanings (Figure 3.5). People seemed to think, "Yeah, I know he was assigned that position, but, you know, I think he really believes it."

The error is so irresistible that even when people know they are causing someone else's behavior, they still underestimate external influences. If individuals dictate an opinion that someone else must then express, they still tend to see the person as actually holding that opinion (Gilbert & Jones, 1986). If people are asked to be either self-enhancing or self-deprecating during an interview, they are very aware of why they are acting so. But they are unaware of their effect on another person. If Juan acts modestly, his naive partner Bob is likely to exhibit modesty as well. Juan will easily understand his own behavior, but he will think that poor Bob suffers from low self-esteem (Baumeister & others, 1988).

In short, we tend to presume that others are the way they act. Observing Cinderella cowering in her oppressive home, people (ignoring the situation) infer that she is meek; dancing with her at the ball, the prince sees a suave and glamorous person.

The discounting of social constraints was evident in a thought-provoking experiment by Lee Ross and his collaborators (Ross & others, 1977). The experiment re-created Ross's firsthand experience of moving from graduate student to professor. His doctoral oral exam had proved a humbling experience as his apparently brilliant professors quizzed him on topics they specialized in. Six months later,

Social Beliefs and Judgments Chapter 3 107

[Photo caption: When viewing a movie actor playing a "good-guy" or a "bad-guy" role, we find it difficult to escape the illusion that the scripted behavior reflects an inner disposition. Perhaps that is why Leonard Nimoy, who played Mr. Spock in the original "Star Trek" series, titled one of his books I Am Not Spock.]

Dr. Ross was himself an examiner, now able to ask penetrating questions on his favorite topics. Ross's hapless student later confessed to feeling exactly as Ross had a half-year before—dissatisfied with his ignorance and impressed with the apparent brilliance of the examiners.

In the experiment, with Teresa Amabile and Julia Steinmetz, Ross set up a simulated quiz game. He randomly assigned some Stanford University students to play the role of questioner, some to play the role of contestant, and others to observe. The researchers invited the questioners to make up difficult questions that would demonstrate their wealth of knowledge. Any one of us can imagine such questions using one's own domain of competence: "Where is Bainbridge Island?" "How did Mary, Queen of Scots, die?" "Which has the longer coastline, Europe or Africa?" If even those few questions have you feeling a little uninformed, then you will appreciate the results of this experiment.*

Everyone had to know that the questioner would have the advantage. Yet both contestants and observers (but not the questioners) came to the erroneous conclusion that the questioners really were more knowledgeable than the contestants (Figure 3.6). Follow-up research shows that these misimpressions are hardly a reflection of low social intelligence. If anything, intelligent and socially competent people are more likely to make the attribution error (Block & Funder, 1986).

In real life, those with social power usually initiate and control conversations, which often leads underlings to overestimate their knowledge and intelligence.
Medical doctors, for example, are often presumed to be experts on all sorts of questions unrelated to medicine. Similarly, students often overestimate the brilliance of their teachers. (As in the experiment, teachers are questioners on subjects of their special expertise.) When some of these students later become teachers, they are usually amazed to discover that teachers are not so brilliant after all.

To illustrate the fundamental attribution error, most of us need look no further than our own experiences. Determined to make some new friends, Bev plasters a smile on her face and anxiously plunges into a party. Everyone else seems quite relaxed and happy as they laugh and talk with one another. Bev wonders to herself, "Why is everyone always so at ease in groups like this while I'm feeling shy and

* Bainbridge Island is across Puget Sound from Seattle. Mary was ordered beheaded by her cousin Queen Elizabeth I. Although the African continent is more than double the area of Europe, Europe's coastline is longer. (It is more convoluted, with lots of harbors and inlets, a geographical fact that contributed to its role in the history of maritime trade.)

[FIGURE 3.6 Rating of general knowledge. Both contestants and observers of a simulated quiz game assumed that a person who had been randomly assigned the role of questioner was far more knowledgeable than the contestant. Actually, the assigned roles of questioner and contestant simply made the questioner seem more knowledgeable. The failure to appreciate this illustrates the fundamental attribution error. The graph shows questioners, contestants, and observers rating the general knowledge of the questioner and the contestant relative to the average student. Source: Data from Ross, Amabile, & Steinmetz, 1977.]

tense?" Actually, everyone else is feeling nervous, too, and making the same attribution error in assuming that Bev and the others are as they appear—confidently convivial.

WHY DO WE MAKE THE ATTRIBUTION ERROR?

So far we have seen a bias in the way we explain other people's behavior: We often ignore powerful situational determinants. Why do we tend to underestimate the situational determinants of others' behavior but not of our own?

[Photo caption: People often attribute keen intelligence to those, such as teachers and quiz show hosts, who test others' knowledge.]

PERSPECTIVE AND SITUATIONAL AWARENESS

Actor versus Observer Perspectives. Attribution theorists pointed out that we observe others from a different perspective than we observe ourselves (Jones, 1976; Jones & Nisbett, 1971). When we act, the environment commands our attention. When we watch another person act, that person occupies the center of our attention and the environment becomes relatively invisible.
Auschwitz commandant Rudolph Höss (1959), while acting as a good SS officer "who could not show the slightest trace of emotion," professed inner anguish over his genocidal actions, saying he felt "pity so great that I longed to vanish from the scene." Yet he inferred that his similarly stoic Jewish inmates were uncaring—a "racial characteristic," he presumed—as they led fellow inmates to the gas chambers.

From his analysis of 173 studies, Bertram Malle (2006) concluded that the actor-observer difference is minimal. When our action feels intentional and admirable, we attribute it to our own good reasons, not to the situation. It's only when we behave badly that we're more likely to attribute our behavior to the situation, while someone observing us may spontaneously infer a trait.

[Photo caption: The fundamental attribution error: observers underestimating the situation. Driving into a gas station, we may think the person parked at the second pump (blocking access to the first) is inconsiderate. That person, having arrived when the first pump was in use, attributes her behavior to the situation.]

The Camera Perspective Bias. In some experiments, people have viewed a videotape of a suspect confessing during a police interview. If they viewed the confession through a camera focused on the suspect, they perceived the confession as genuine. If they viewed it through a camera focused on the detective, they perceived it as more coerced (Lassiter & others, 1986, 2005, 2007). The camera perspective influenced people's guilt judgments even when the judge instructed them not to allow this to happen (Lassiter & others, 2002).

In courtrooms, most confession videotapes focus on the confessor. As we might expect, noted Daniel Lassiter and Kimberly Dudley (1991), such tapes yield a nearly 100 percent conviction rate when played by prosecutors. Aware of this research, reports Lassiter, New Zealand has made it a national policy that police interrogations be filmed with equal focus on the officer and the suspect, such as by filming them with side profiles of both.

"And in imagination he began to recall the best moments of his pleasant life. . . . But the child who had experienced that happiness existed no longer, it was like a reminiscence of somebody else."
—LEO TOLSTOY, THE DEATH OF IVAN ILYICH, 1886

Perspectives Change with Time. As the once-visible person recedes in their memory, observers often give more and more credit to the situation. As we saw above in the groundbreaking attribution error experiment by Edward Jones and Victor Harris (1967), immediately after hearing someone argue an assigned position, people assume that's how the person really felt. Jerry Burger and M. L.
Palmer (1991) found that a week later they are much more ready to credit the situational constraints. The day after a presidential election, Burger and Julie Pavelich (1994) asked voters why the election turned out as it did. Most attributed the outcome to the candidates' personal traits and positions (the winner from the incumbent party was likable). When they asked other voters the same question a year later, only a third attributed the verdict to the candidates. More people now credited circumstances, such as the country's good mood and the robust economy.

Let's make this personal: Are you generally quiet, talkative, or does it depend on the situation? "Depends on the situation" is a common answer. But when asked to describe a friend—or to describe what they were like five years ago—people more often ascribe trait descriptions. When recalling our past, we become like observers of someone else, note researchers Emily Pronin and Lee Ross (2006). For most of us, the "old you" is someone other than today's "real you." We regard our distant past selves (and our distant future selves) almost as if they were other people occupying our body.

self-awareness
A self-conscious state in which attention focuses on oneself. It makes people more sensitive to their own attitudes and dispositions.

[Photo caption: Focusing on the person. Would you infer that your professor for this course, or the professor shown here, is naturally outgoing?]

Self-Awareness. Circumstances can also shift our perspective on ourselves. Seeing ourselves on television redirects our attention to ourselves. Seeing ourselves in a mirror, hearing our tape-recorded voices, having our pictures taken, or filling out biographical questionnaires similarly focuses our attention inward, making us self-conscious instead of situation-conscious. Looking back on ill-fated relationships that once seemed like the unsinkable Titanic, people can more easily see the icebergs (Berscheid, 1999).

Robert Wicklund, Shelley Duval, and their collaborators have explored the effects of self-awareness (Duval & Wicklund, 1972; Silvia & Duval, 2001). When our attention focuses upon ourselves, we often attribute responsibility to ourselves. Allan Fenigstein and Charles Carver (1978) demonstrated this by having students imagine themselves in hypothetical situations. Some students were made self-aware by thinking they were hearing their own heartbeats while pondering the situation. Compared with those who thought they were just hearing extraneous noises, the self-aware students saw themselves as more responsible for the imagined outcome.

Some people are typically quite self-conscious. In experiments, people who report themselves as privately self-conscious (who agree with statements such as "I'm generally attentive to my inner feelings") behave similarly to people whose attention has been self-focused with a mirror (Carver & Scheier, 1978).
Thus, people whose attention focuses on themselves—either briefly during an experiment or because they are self-conscious persons—view themselves more as observers typically do; they attribute their behavior more to internal factors and less to the situation.

All these experiments point to a reason for the attribution error: We find causes where we look for them. To see this in your own experience, consider: Would you say your social psychology instructor is a quiet or a talkative person? My guess is you inferred that he or she is fairly outgoing. But consider: Your attention focuses on your instructor while he or she behaves in a public context that demands speaking. The instructor also observes his or her own behavior in many different situations—in the classroom, in meetings, at home. "Me talkative?" your instructor might say. "Well, it all depends on the situation. When I'm in class or with good friends, I'm rather outgoing. But at conventions and in unfamiliar situations I feel and act rather shy." Because we are acutely aware of how our behavior varies with the situation, we see ourselves as more variable than other people (Baxter & Goldberg, 1987; Kammer, 1982; Sande & others, 1988). "Nigel is uptight, Fiona is relaxed. With me it varies."

CULTURAL DIFFERENCES

Cultures also influence the attribution error (Ickes, 1980; Watson, 1982). A Western worldview predisposes people to assume that people, not situations, cause events. Internal explanations are more socially approved (Jellison & Green, 1981). "You can do it!" we are assured by the pop psychology of positive-thinking Western culture. You get what you deserve and deserve what you get.

As children grow up in Western culture, they learn to explain behavior in terms of the other's personal characteristics (Rholes & others, 1990; Ross, 1981). As a first-grader, one of my sons brought home an example. He

unscrambled the words "gate the sleeve caught Tom on his" into "The gate caught Tom on his sleeve." His teacher, applying the Western cultural assumptions of the curriculum materials, marked that wrong. The "right" answer located the cause within Tom: "Tom caught his sleeve on the gate."

The fundamental attribution error occurs across varied cultures (Krull & others, 1999). Yet people in Eastern Asian cultures are somewhat more sensitive to the importance of situations. Thus, when aware of the social context, they are less inclined to assume that others' behavior corresponds to their traits (Choi & others, 1999; Farwell & Weiner, 2000; Masuda & Kitayama, 2004).

Some languages promote external attributions. Instead of "I was late," Spanish idiom allows one to say, "The clock caused me to be late." In collectivist cultures, people less often perceive others in terms of personal dispositions (Lee & others, 1996; Zebrowitz-McArthur, 1988). They are less likely to spontaneously interpret a behavior as reflecting an inner trait (Newman, 1993). When told of someone's actions, Hindus in India are less likely than Americans to offer dispositional explanations ("She is kind") and more likely to offer situational explanations ("Her friends were with her") (Miller, 1984).

The fundamental attribution error is fundamental because it colors our explanations in basic and important ways. Researchers in Britain, India, Australia, and the United States have found that people's attributions predict their attitudes toward the poor and the unemployed (Furnham, 1982; Pandey & others, 1982; Skitka, 1999; Wagstaff, 1983; Zucker & Weiner, 1993). Those who attribute poverty and unemployment to personal dispositions ("They're just lazy and undeserving") tend to adopt political positions unsympathetic to such people (Figure 3.7). This dispositional attribution ascribes behavior to the person's disposition and traits.
Those who make situational attributions ("If you or I were to live with the same overcrowding, poor education, and discrimination, would we be any better off?") tend to adopt political positions that offer more direct support to the poor.

Can we benefit from being aware of the attribution error? I once assisted with some interviews for a faculty position. One candidate was interviewed by six of us at once; each of us had the opportunity to ask two or three questions. I came away thinking, "What a stiff, awkward person he is." The second candidate I met privately over coffee, and we immediately discovered we had a close, mutual friend. As we talked, I became increasingly impressed by what a "warm, engaging, stimulating person she is." Only later did I remember the fundamental attribution error

[FIGURE 3.7 Attributions and Reactions. How we explain someone's negative behavior determines how we feel about it. Negative behavior (A man is rude to his colleague.) can prompt either a dispositional attribution (The man is a hostile person.) leading to an unfavorable reaction (I don't like this man.), or a situational attribution (The man was unfairly evaluated.) leading to a sympathetic reaction (I can understand.)]

and reassess my analysis. I had attributed his stiffness and her warmth to their dispositions; in fact, I later realized, such behavior resulted partly from the difference in their interview situations.

"Most poor people are not lazy. . . . They catch the early bus. They raise other people's children. They clean the streets. No, no, they're not lazy."
—THE REVEREND JESSE JACKSON, ADDRESS TO THE DEMOCRATIC NATIONAL CONVENTION, JULY 1988

WHY WE STUDY ATTRIBUTION ERRORS

This chapter, like the one before it, explains some foibles and fallacies in our social thinking. Reading about these may make it seem, as one of my students put it, that "social psychologists get their kicks out of playing tricks on people." Actually, the experiments are not designed to demonstrate "what fools these mortals be" (although some of the experiments are rather amusing). Rather, their purpose is to reveal how we think about ourselves and others.

If our capacity for illusion and self-deception is shocking, remember that our modes of thought are generally adaptive. Illusory thinking is often a by-product of our mind's strategies for simplifying complex information. It parallels our perceptual mechanisms, which generally give us useful images of the world but sometimes lead us astray.

A second reason for focusing on thinking biases such as the fundamental attribution error is humanitarian. One of social psychology's "great humanizing messages," note Thomas Gilovich and Richard Eibach (2001), is that people should not always be blamed for their problems. "More often than people are willing to acknowledge," they conclude, "failure, disability, and misfortune are . . . the product of real environmental causes."

A third reason for focusing on biases is that we are mostly unaware of them and can benefit from greater awareness.
As with other biases, such as the self-serving bias (Chapter 2), people see themselves as less susceptible than others to attribution errors (Pronin, 2008). My hunch is that you will find more surprises, more challenges, and more benefit in an analysis of errors and biases than you would in a string of testimonies to the human capacity for logic and intellectual achievement. That is also why world literature so often portrays pride and other human failings. Social psychology aims to expose us to fallacies in our thinking in the hope that we will become more rational, more in touch with reality. The hope is not in vain: Psychology students explain behavior less simplistically than similarly intelligent natural science students (Fletcher & others, 1986).

Summing Up: Explaining Our Social World

• Attribution theory involves how we explain people's behavior. Misattribution—attributing a behavior to the wrong source—is a major factor in sexual harassment, as a person in power (typically male) interprets friendliness as a sexual come-on.

• Although we usually make reasonable attributions, we often commit the fundamental attribution error (also called correspondence bias) when explaining other people's behavior. We attribute their behavior so much to their inner traits and attitudes that we discount situational constraints, even when those are obvious. We make this attribution error partly because when we watch someone act, that person is the focus of our attention and the situation is relatively invisible. When we act, our attention is usually on what we are reacting to—the situation is more visible.

Expectations of Our Social Worlds

Having considered how we explain and judge others—efficiently, adaptively, but sometimes erroneously—we conclude this chapter by pondering the effects of our social judgments. Do our social beliefs matter? Do they change reality?

Our social beliefs and judgments do matter.
They influence how we feel and act, and by so doing may help generate their own reality. When our ideas lead us to act

Social Beliefs and Judgments Chapter 3 113

Focus On: The Self-Fulfilling Psychology of the Stock Market

On the evening of January 6, 1981, Joseph Granville, a popular Florida investment adviser, wired his clients: "Stock prices will nosedive; sell tomorrow." Word of Granville's advice soon spread, and January 7 became the heaviest day of trading in the previous history of the New York Stock Exchange. All told, stock values lost $40 billion.

Nearly a half-century ago, John Maynard Keynes likened such stock market psychology to the popular beauty contests then conducted by London newspapers. To win, one had to pick the six faces out of a hundred that were, in turn, chosen most frequently by the other newspaper contestants. Thus, as Keynes wrote, "Each competitor has to pick not those faces which he himself finds prettiest, but those which he thinks likeliest to catch the fancy of the other competitors."

Investors likewise try to pick not the stocks that touch their fancy but the stocks that other investors will favor. The name of the game is predicting others' behavior. As one Wall Street fund manager explained, "You may or may not agree with Granville's view—but that's usually beside the point." If you think his advice will cause others to sell, then you want to sell quickly, before prices drop more. If you expect others to buy, you buy now to beat the rush.

The self-fulfilling psychology of the stock market worked to an extreme on Monday, October 19, 1987, when the Dow Jones Industrial Average lost 20 percent. Part of what happens during such crashes is that the media and the rumor mill focus on whatever bad news is available to explain them. Once reported, the explanatory news stories further diminish people's expectations, causing declining prices to fall still lower. The process also works in reverse by amplifying good news when stock prices are rising.

In April of 2000, the volatile technology market again demonstrated a self-fulfilling psychology, now called "momentum investing." After two years of eagerly buying stocks (because prices were rising), people started frantically selling them (because prices were falling).
Such wild market swings—"irrational exuberance" followed by a crash—are mainly self-generated, noted economist Robert Shiller (2000). In 2008 and 2009, the market psychology headed south again as another bubble burst.

in ways that produce their apparent confirmation, they have become what sociologist Robert Merton (1948) termed self-fulfilling prophecies—beliefs that lead to their own fulfillment. If, led to believe that their bank is about to crash, its customers race to withdraw their money, then their false perceptions may create reality, noted Merton. If people are led to believe that stocks are about to soar, they will indeed. (See "Focus On: The Self-Fulfilling Psychology of the Stock Market.")

self-fulfilling prophecy
A belief that leads to its own fulfillment.

[Margin note: Rosenthal (2008) recalls submitting a paper describing his early experiments on experimenter bias to a leading journal and to an American Association for the Advancement of Science prize competition. On the same day, some weeks later, he received a letter from the journal rejecting his paper, and from the association naming it the year's best social science research. In science, as in everyday life, some people appreciate what others do not, which is why it often pays to try and, when rebuffed, to try again.]

In his well-known studies of experimenter bias, Robert Rosenthal (1985, 2006) found that research participants sometimes live up to what they believe experimenters expect of them. In one study, experimenters asked individuals to judge the success of people in various photographs. The experimenters read the same instructions to all their participants and showed them the same photos. Nevertheless, experimenters who expected their participants to see the photographed people as successful obtained higher ratings than did those who expected their participants to see the people as failures.
Even more startling—and controversial—are reports that teachers' beliefs about their students similarly serve as self-fulfilling prophecies. If a teacher believes a student is good at math, will the student do well in the class? Let's examine this.

Teacher Expectations and Student Performance

Teachers do have higher expectations for some students than for others. Perhaps you have detected this after having a brother or sister precede you in school, or after receiving a label such as "gifted" or "learning disabled," or after being tracked with "high-ability" or "average-ability" students. Perhaps conversation in the teachers' lounge sent your reputation ahead of you. Or perhaps your new teacher scrutinized

your school file or discovered your family's social status. It's clear that teachers' evaluations correlate with student achievement: Teachers think well of students who do well. That's mostly because teachers accurately perceive their students' abilities and achievements (Jussim, 2005).

[Margin note: To judge a teacher or professor's overall warmth and enthusiasm also takes but a thin slice of behavior—mere seconds (Ambady & Rosenthal, 1992, 1993).]

But are teachers' evaluations ever a cause as well as a consequence of student performance? One correlational study of 4,300 British schoolchildren by William Crano and Phyllis Mellon (1978) suggested yes. Not only is high performance followed by higher teacher evaluations, but the reverse is true as well.

Could we test this "teacher-expectations effect" experimentally? Pretend we gave a teacher the impression that Dana, Sally, Todd, and Manuel—four randomly selected students—are unusually capable. Will the teacher give special treatment to these four and elicit superior performance from them? In a now-famous experiment, Rosenthal and Lenore Jacobson (1968) reported precisely that. Randomly selected children in a San Francisco elementary school who were said (on the basis of a fictitious test) to be on the verge of a dramatic intellectual spurt did then spurt ahead in IQ score.

That dramatic result seemed to suggest that the school problems of "disadvantaged" children might reflect their teachers' low expectations. The findings were soon publicized in the national media as well as in many college textbooks. However, further analysis—which was not as highly publicized—revealed the teacher-expectations effect to be not as powerful and reliable as this initial study had led many people to believe (Spitz, 1999). By Rosenthal's own count, in only about 4 in 10 of the nearly 500 published experiments did expectations significantly affect performance (Rosenthal, 1991, 2002).
Low expectations do not doom a capable child, nor do high expectations magically transform a slow learner into a valedictorian. Human nature is not so pliable.

High expectations do, however, seem to boost low achievers, for whom a teacher's positive attitude may be a hope-giving breath of fresh air (Madon & others, 1997). How are such expectations transmitted? Rosenthal and other investigators report that teachers look, smile, and nod more at "high-potential students." Teachers also may teach more to their "gifted" students, set higher goals for them, call on them more, and give them more time to answer (Cooper, 1983; Harris & Rosenthal, 1985, 1986; Jussim, 1986).

In one study, Elisha Babad, Frank Bernieri, and Rosenthal (1991) videotaped teachers talking to, or about, unseen students for whom they held high or low expectations. A random 10-second clip of either the teacher's voice or the teacher's face was enough to tell viewers—both children and adults—whether this was a good or a poor student and how much the teacher liked the student. (You read that right: 10 seconds.) Although teachers may think they can conceal their feelings and behave impartially toward the class, students are acutely sensitive to teachers' facial expressions and body movements (Figure 3.8).

Reading the experiments on teacher expectations makes me wonder about the effect of students' expectations upon their teachers. You no doubt begin many of your courses having heard "Professor Smith is interesting" and "Professor Jones is a bore." Robert Feldman and Thomas Prohaska (1979; Feldman & Theiss, 1982)

[FIGURE 3.8 Self-Fulfilling Prophecies. Teacher expectations can become self-fulfilling prophecies. Teacher's expectation ("Rena's older brother was brilliant. I bet she is, too.") leads to teacher's behavior (smiling more at Rena, teaching her more, calling on her more, giving more time to answer), which leads to student's behavior (Rena responds enthusiastically), confirming the teacher's expectation. But for the most part, teachers' expectations accurately reflect reality (Jussim & Harber, 2005).]

found that such expectations can affect both student and teacher. Students in a learning experiment who expected to be taught by an excellent teacher perceived their teacher (who was unaware of their expectations) as more competent and interesting than did students with low expectations. Furthermore, the students actually learned more. In a later experiment, women who were led to expect their male instructor to be sexist had a less positive experience with him, performed worse, and rated him as less competent than did women not given the sexist expectation (Adams & others, 2006).

Were these results due entirely to the students' perceptions, or also to a self-fulfilling prophecy that affected the teacher? In a follow-up experiment, Feldman and Prohaska videotaped teachers and had observers rate their performances. Teachers were judged most capable when assigned a student who nonverbally conveyed positive expectations.

To see whether such effects might also occur in actual classrooms, a research team led by David Jamieson (1987) experimented with four Ontario high school classes taught by a newly transferred teacher. During individual interviews, they told students in two of the classes that both other students and the research team rated the teacher very highly. Compared with the control classes, students who were given positive expectations paid better attention during class. At the end of the teaching unit, they also got better grades and rated the teacher as clearer in her teaching. The attitudes that a class has toward its teacher are as important, it seems, as the teacher's attitude toward the students.

Getting from Others What We Expect

So the expectations of experimenters and teachers, though usually reasonably accurate, occasionally act as self-fulfilling prophecies. How widespread are self-fulfilling prophecies? Do we get from others what we expect of them?
Studies show that self-fulfilling prophecies also operate in work settings (with managers who have high or low expectations), in courtrooms (as judges instruct juries), and in simulated police contexts (as interrogators with guilty or innocent expectations interrogate and pressure suspects) (Kassin & others, 2003; Rosenthal, 2003, 2006).

Do self-fulfilling prophecies color our personal relationships? There are times when negative expectations of someone lead us to be extra nice to that person, which induces him or her to be nice in return—thus disconfirming our expectations. But a more common finding in studies of social interaction is that, yes, we do to some extent get what we expect (Olson & others, 1996).

In laboratory games, hostility nearly always begets hostility: People who perceive their opponents as noncooperative will readily induce them to be noncooperative (Kelley & Stahelski, 1970). Each party's perception of the other as aggressive, resentful, and vindictive induces the other to display those behaviors in self-defense, thus creating a vicious self-perpetuating circle. In another experiment, people anticipated interacting with another person of a different race. When led to expect that the person disliked interacting with someone of their race, they felt more anger and displayed more hostility toward the person (Butz & Plant, 2006). Likewise, whether I expect my wife to be in a bad mood or in a loving mood may affect how I relate to her, thereby inducing her to confirm my belief.

So, do intimate relationships prosper when partners idealize each other? Are positive illusions of the other's virtues self-fulfilling? Or are they more often self-defeating, by creating high expectations that can't be met? Among University of Waterloo dating couples followed by Sandra Murray and her associates (1996a, 1996b, 2000), positive ideals of one's partner were good omens.
Idealization helped buffer conflict, bolster satisfaction, and turn self-perceived frogs into princes or princesses. When someone loves and admires us, it helps us become more the person he or she imagines us to be.

When dating couples deal with conflicts, hopeful optimists and their partners tend to perceive each other as engaging constructively. Compared to those with

Behavioral confirmation. When English soccer fans came to France for the 1998 World Cup, they were expected to live up to their reputation as aggressive "hooligans." Local French youth and police, expecting hooligan behavior, reportedly displayed hostility toward the English, who retaliated, thus confirming the expectation (Klein & Snyder, 2003).

behavioral confirmation: A type of self-fulfilling prophecy whereby people's social expectations lead them to behave in ways that cause others to confirm their expectations.

more pessimistic expectations, they then feel more supported and more satisfied with the outcome (Srivastava & others, 2006). Among married couples, too, those who worry that their partner doesn't love and accept them interpret slight hurts as rejections, which motivates them to devalue the partner and distance themselves. Those who presume their partner's love and acceptance respond less defensively, read less into stressful events, and treat the partner better (Murray & others, 2003). Love helps create its presumed reality.

Several experiments conducted by Mark Snyder (1984) at the University of Minnesota show how, once formed, erroneous beliefs about the social world can induce others to confirm those beliefs, a phenomenon called behavioral confirmation. In a classic study, Snyder, Elizabeth Tanke, and Ellen Berscheid (1977) had male students talk on the telephone with women they thought (from having been shown a picture) were either attractive or unattractive. Analysis of just the women's comments during the conversations revealed that the supposedly attractive women spoke more warmly than the supposedly unattractive women. The men's erroneous beliefs had become a self-fulfilling prophecy by leading them to act in a way that influenced the women to fulfill the men's stereotype that beautiful people are desirable people.

Behavioral confirmation also occurs as people interact with partners holding mistaken beliefs.
People who are believed lonely behave less sociably (Rotenberg & others, 2002). Men who are believed sexist behave less favorably toward women (Pinel, 2002). Job interviewees who are believed to be warm behave more warmly.

Imagine yourself as one of the 60 young men or 60 young women in an experiment by Robert Ridge and Jeffrey Reber (2002). Each man is to interview one of the women to assess her suitability for a teaching assistant position. Before doing so, he is told either that she feels attracted to him (based on his answers to a biographical questionnaire) or not attracted. (Imagine being told that someone you were about to meet reported considerable interest in getting to know you and in dating you, or none whatsoever.) The result was behavioral confirmation: Applicants believed to feel an attraction exhibited more flirtatiousness (and without being aware of doing so). Ridge and Reber believe that this process, like the misattribution phenomenon we discussed earlier, may be one of the roots of sexual harassment. If a woman's behavior seems to confirm a man's beliefs, he may then escalate his overtures until

they become sufficiently overt for the woman to recognize and interpret them as inappropriate or harassing.

"The more he treated her as though she were really very nice, the more Lotty expanded and became really very nice, and the more he, affected in his turn, became really very nice himself; so that they went round and round, not in a vicious but in a highly virtuous circle." —ELIZABETH VON ARNIM, THE ENCHANTED APRIL, 1922

Expectations influence children's behavior, too. After observing the amount of litter in three classrooms, Richard Miller and his colleagues (1975) had the teacher and others repeatedly tell one class that they should be neat and tidy. This persuasion increased the amount of litter placed in wastebaskets from 15 to 45 percent, but only temporarily. Another class, which also had been placing only 15 percent of its litter in wastebaskets, was repeatedly congratulated for being so neat and tidy. After eight days of hearing this, and still two weeks later, these children were fulfilling the expectation by putting more than 80 percent of their litter in wastebaskets. Tell children they are hardworking and kind (rather than lazy and mean), and they may live up to their labels.

These experiments help us understand how social beliefs, such as stereotypes about people with disabilities or about people of a particular race or sex, may be self-confirming. How others treat us reflects how we and others have treated them.

A note of caution: As with every social phenomenon, the tendency to confirm others' expectations has its limits. Expectations often predict behavior simply because they are accurate (Jussim, 2005).

Summing Up: Expectations of Our Social World

• Our beliefs sometimes take on lives of their own. Usually, our beliefs about others have a basis in reality. But studies of experimenter bias and teacher expectations show that an erroneous belief that certain people are unusually capable (or incapable) can lead teachers and researchers to give those people special treatment. This may elicit superior (or inferior) performance and, therefore, seem to confirm an assumption that is actually false.

• Similarly, in everyday life we often get behavioral confirmation of what we expect. Told that someone we are about to meet is intelligent and attractive, we may come away impressed with just how intelligent and attractive he or she is.

Conclusions

Social cognition studies reveal that our information-processing powers are impressive for their efficiency and adaptiveness ("in apprehension how like a god!" exclaimed Shakespeare's Hamlet). Yet we are also vulnerable to predictable errors and misjudgments ("headpiece filled with straw," said T. S. Eliot). What practical lessons, and what insights into human nature, can we take home from this research?

We have reviewed reasons why people sometimes form false beliefs. We cannot easily dismiss these experiments: Most of their participants were intelligent people, often students at leading universities. Moreover, people's intelligence scores are uncorrelated with their vulnerability to many different thinking biases (Stanovich & West, 2008). One can be very smart and exhibit seriously bad judgment.

Trying hard also doesn't eliminate thinking biases. These predictable distortions and biases occurred even when payment for right answers motivated people to think optimally. As one researcher concluded, the illusions "have a persistent quality not unlike that of perceptual illusions" (Slovic, 1972).

Research in cognitive social psychology thus mirrors the mixed review given humanity in literature, philosophy, and religion. Many research psychologists have spent lifetimes exploring the awesome capacities of the human mind. We are smart enough to have cracked our own genetic code, to have invented talking computers, to have sent people to the moon.
Three cheers for human reason.

"In creating these problems, we didn't set out to fool people. All our problems fooled us, too." —AMOS TVERSKY (1985)

Well, two cheers—because the mind's premium on efficient judgment makes our intuition more vulnerable to misjudgment than we suspect. With remarkable ease, we form and sustain false beliefs. Led by our preconceptions, feeling overconfident, persuaded by vivid anecdotes, perceiving correlations and control even where none may exist, we construct our social beliefs and then influence others to confirm them. "The naked intellect," observed novelist Madeleine L'Engle, "is an extraordinarily inaccurate instrument."

"The purposes in the human mind are like deep water, but the intelligent will draw them out." —PROVERBS 20:5

But have these experiments just been intellectual tricks played on hapless participants, thus making them look worse than they are? Richard Nisbett and Lee Ross (1980) contended that, if anything, laboratory procedures overestimate our intuitive powers. The experiments usually present people with clear evidence and warn them that their reasoning ability is being tested. Seldom does real life say to us: "Here is some evidence. Now put on your intellectual Sunday best and answer these questions."

"Cognitive errors . . . exist in the present because they led to survival and reproductive advantages for humans in the past." —EVOLUTIONARY PSYCHOLOGISTS MARTIE HASELTON AND DAVID BUSS (2000)

Often our everyday failings are inconsequential, but not always so. False impressions, interpretations, and beliefs can produce serious consequences. Even small biases can have profound social effects when we are making important social judgments: Why are so many people homeless? unhappy? homicidal? Does my friend love me or my money? Cognitive biases even creep into sophisticated scientific thinking. Human nature has hardly changed in the 3,000 years since the Old Testament psalmist noted that "no one can see his own errors."

Is this too cynical? Leonard Martin and Ralph Erber (2005) invite us to imagine that an intelligent being swooped down and begged for information that would help it understand the human species. When you hand it this social psychology text, the alien says "thank you" and zooms back off into space. After (I'd like to presume) resolving your remorse over giving up this book, how would you feel about having offered social psychology's analysis?

Joachim Krueger and David Funder (2003a, 2003b) wouldn't feel too good. Social psychology's preoccupation with human foibles needs balancing with "a more positive view of human nature," they argue. Fellow social psychologist Lee Jussim (2005) agrees, adding, "Despite the oft-demonstrated existence of a slew of logical flaws and systematic biases in lay judgment and social perception, such as the fundamental attribution error, false consensus, over-reliance on imperfect heuristics, self-serving biases, etc., people's perceptions of one another are surprisingly (though rarely perfectly) accurate."

The elegant analyses of the imperfections of our thinking are themselves a tribute to human wisdom. Were one to argue that all human thought is illusory, the assertion would be self-refuting, for it, too, would be but an illusion. It would be logically equivalent to contending "All generalizations are false, including this one."

As medical science assumes that any given body organ serves a function, so behavioral scientists find it useful to assume that our modes of thought and behavior are adaptive (Funder, 1987; Kruglanski & Ajzen, 1983; Swann, 1984). The rules of thought that produce false beliefs and striking deficiencies in our statistical intuition usually serve us well.
Frequently, the errors are a by-product of our mental shortcuts that simplify the complex information we receive.

Nobel laureate psychologist Herbert Simon (1957) was among the modern researchers who first described the bounds of human reason. Simon contends that to cope with reality, we simplify it. Consider the complexity of a chess game: The number of possible games is greater than the number of particles in the universe. How do we cope? We adopt some simplifying rules—heuristics. These heuristics sometimes lead us to defeat. But they do enable us to make efficient snap judgments.

Illusory thinking can likewise spring from useful heuristics that aid our survival. In many ways, heuristics "make us smart" (Gigerenzer, 2007). The belief in our power to control events helps maintain hope and effort. If things are sometimes subject to control and sometimes not, we maximize our outcomes by positive thinking.

Optimism pays dividends. We might even say that our beliefs are like scientific theories—sometimes in error yet useful as generalizations. As social psychologist Susan Fiske (1992) says, "Thinking is for doing."

"The spirit of liberty is the spirit which is not too sure that it is right; the spirit of liberty is the spirit which seeks to understand the minds of other men and women; the spirit of liberty is the spirit which weighs their interests alongside its own without bias." —LEARNED HAND, "THE SPIRIT OF LIBERTY," 1952

As we constantly seek to improve our theories, might we not also work to reduce errors in our social thinking? In school, math teachers teach, teach, teach until the mind is finally trained to process numerical information accurately and automatically. We assume that such ability does not come naturally; otherwise, why bother with the years of training? Research psychologist Robyn Dawes (1980)—who was dismayed that "study after study has shown [that] people have very limited abilities to process information on a conscious level, particularly social information"—suggested that we should also teach, teach, teach how to process social information.

Richard Nisbett and Lee Ross (1980) believe that education could indeed reduce our vulnerability to certain types of error. They offer the following recommendations:

• Train people to recognize likely sources of error in their own social intuition.

• Set up statistics courses geared to everyday problems of logic and social judgment. Given such training, people do in fact reason better about everyday events (Lehman & others, 1988; Nisbett & others, 1987).

• Make such teaching more effective by illustrating it richly with concrete, vivid anecdotes and examples from everyday life.
• Teach memorable and useful slogans, such as "It's an empirical question," "Which hat did you draw that sample out of?" or "You can lie with statistics, but a well-chosen example does the job better."

Summing Up: Conclusions

Research on social beliefs and judgments reveals how we form and sustain beliefs that usually serve us well but sometimes lead us astray. A balanced social psychology will therefore appreciate both the powers and the perils of social thinking.

P.S. POSTSCRIPT: Reflecting on Illusory Thinking

"Rob the average man of his life-illusion, and you rob him also of his happiness." —HENRIK IBSEN, THE WILD DUCK, 1884

Is research on pride and error too humbling? Surely we can acknowledge the hard truth of our human limits and still sympathize with the deeper message that people are more than machines. Our subjective experiences are the stuff of our humanity—our art and our music, our enjoyment of friendship and love, our mystical and religious experiences.

The cognitive and social psychologists who explore illusory thinking are not out to remake us into unfeeling logical machines. They know that emotions enrich human experience and that intuitions are an important source of creative ideas. They add, however, the humbling reminder that our susceptibility to error also makes clear the need for disciplined training of the mind. The American writer Norman Cousins (1978) called this "the biggest truth of all about learning: that its purpose is to unlock the human mind and to develop it into an organ capable of thought—conceptual thought, analytical thought, sequential thought."

Research on error and illusion in social judgment reminds us to "judge not"—to remember, with a dash of humility, our potential for misjudgment. It also encourages us not to feel intimidated by the arrogance of those who cannot see their own potential for bias and error. We humans are wonderfully intelligent yet fallible creatures. We have dignity but not deity.

Such humility and distrust of human authority is at the heart of both religion and science. No wonder many of the founders of modern science were religious people whose convictions predisposed them to be humble before nature and skeptical of human authority (Hooykaas, 1972; Merton, 1938). Science always involves an interplay between intuition and rigorous test, between creative hunch and skepticism. To sift reality from illusion requires both open-minded curiosity and hard-headed rigor. This perspective could prove to be a good attitude for approaching all of life: to be critical but not cynical; curious but not gullible; open but not exploitable.

Making the Social Connection

The Online Learning Center for this book includes a video on each of three important topics from this chapter. First is the manner in which context influenced public perceptions of the televised campaign speech given by presidential candidate Howard Dean after the Iowa Caucus in 2004. In the second video, leading memory researcher Elizabeth Loftus explores the misinformation effect and the way it distorts our memories. Finally, Lee Ross discusses the fundamental attribution error, a concept he formed based on his observations of how people perceive and interpret events. Keep these concepts in mind as you read future chapters, and notice the ways in which you tend to explain others' behavior.



C H A P T E R 4

Behavior and Attitudes

"The ancestor of every action is a thought." —Ralph Waldo Emerson, Essays, First Series, 1841

How well do our attitudes predict our behavior?
When does our behavior affect our attitudes?
Why does our behavior affect our attitudes?
Postscript: Changing ourselves through action

What is the relationship between what we are (on the inside) and what we do (on the outside)? Philosophers, theologians, and educators have long speculated about the connections between attitude and action, character and conduct, private word and public deed. Underlying most teaching, counseling, and child rearing is an assumption: Our private beliefs and feelings determine our public behavior, so if we wish to change behavior we must first change hearts and minds.

In the beginning, social psychologists agreed: To know people's attitudes is to predict their actions. As demonstrated by genocidal killers and by suicide bombers, extreme attitudes can produce extreme behavior. But in 1964 Leon Festinger concluded that the evidence showed that changing people's attitudes hardly affects their behavior. Festinger believed the attitude-behavior relation works the other way around. As Robert Abelson (1972) put it, we are "very well trained and very good at finding reasons for what we do, but not very good at doing what we find reasons for."

This chapter explores the interplay of attitudes and behavior. When social psychologists talk about someone's attitude, they refer to beliefs and feelings related to a person or an event and the resulting

behavior tendency. Taken together, favorable or unfavorable evaluative reactions toward something—often rooted in beliefs and exhibited in feelings and inclinations to act—define a person's attitude (Eagly & Chaiken, 2005). Thus, a person may have a negative attitude toward coffee, a neutral attitude toward the French, and a positive attitude toward the next-door neighbor. Attitudes provide an efficient way to size up the world. When we have to respond quickly to something, the way we feel about it can guide how we react. For example, a person who believes a particular ethnic group is lazy and aggressive may feel dislike for such people and therefore intend to act in a discriminatory manner.

attitude: A favorable or unfavorable evaluative reaction toward something or someone (often rooted in one's beliefs, and exhibited in one's feelings and intended behavior).

"All that we are is the result of what we have thought." —BUDDHA, 563 B.C.–483 B.C., DHAMMA-PADA

"Thought is the child of action." —BENJAMIN DISRAELI, VIVIAN GREY, 1826

The study of attitudes is close to the heart of social psychology and was one of its first concerns. For much of the last century, researchers wondered how much our attitudes affect our actions. You can remember these three dimensions as the ABCs of attitudes: affect (feelings), behavior tendency, and cognition (thoughts) (Figure 4.1).

How Well Do Our Attitudes Predict Our Behavior?

To what extent, and under what conditions, do the attitudes of the heart drive our outward actions? Why were social psychologists at first surprised by a seemingly small connection between attitudes and actions?

A blow to the supposed power of attitudes came when social psychologist Allan Wicker (1969) reviewed several dozen research studies covering a wide variety of people, attitudes, and behaviors. Wicker offered a shocking conclusion: People's expressed attitudes hardly predicted their varying behaviors.
• Student attitudes toward cheating bore little relation to the likelihood of their actually cheating.

• Attitudes toward the church were only modestly linked with church attendance on any given Sunday.

• Self-described racial attitudes provided little clue to behaviors in actual situations.

[FIGURE :: 4.1 The ABCs of Attitudes — Affect, Behavior, Cognition]

An example of the disjuncture between attitudes and actions is what Daniel Batson and his colleagues (1997, 2001, 2002; Valdesolo & DeSteno, 2007, 2008) call "moral hypocrisy" (appearing moral while avoiding the costs of being so). Their studies presented people with an appealing task (where the participant could earn raffle tickets toward a $30 prize) and a dull task with no rewards. The participants had to assign themselves to one of the tasks and a supposed second participant to the other. Only 1 in 20 believed that assigning the

Behavior and Attitudes Chapter 4 125 positive task to themselves was the more moral thing to do, yet 80 percent did “It may be desirable to so. In follow-up experiments on moral hypocrisy, participants were given coins abandon the attitude they could flip privately if they wished. Even if they chose to flip, 90 percent concept.” assigned themselves to the positive task! (Was that because they could specify the consequences of heads and tails after the coin toss?) In another experiment, —ALLAN WICKER (1971) Batson put a sticker on each side of the coin, indicating what the flip outcome would signify. Still, 24 of 28 people who made the toss assigned themselves to the positive task. When morality and greed were put on a collision course, greed won. If people don’t walk the same line that they talk, it’s little wonder that attempts to change behavior by changing attitudes often fail. Warnings about the dangers of smoking affect only minimally those who already smoke. Increasing public awareness of the desensitizing and brutalizing effects of television violence has stimulated many people to voice a desire for less violent programming—yet they still watch media murder as much as ever. Sex education programs have often influenced attitudes toward abstinence and condom use without affecting long-term abstinence and condom use behaviors. We are, it seems, a population of hypocrites. All in all, the developing picture of what controls behavior emphasized external social influences, such as others’ behavior and expectations, and played down inter- nal factors, such as attitudes and personality. Thus, the original thesis that attitudes determine actions was countered during the 1960s by the antithesis that attitudes determine virtually nothing. Thesis. Antithesis. Is there a synthesis? The surprising finding that what peo- ple say often differs from what they do sent social psychologists scurrying to find out why. 
Surely, we reasoned, convictions and feelings must sometimes make a difference. Indeed. In fact, what I am about to explain now seems so obvious that I wonder why most social psychologists (myself included) were not thinking this way before the early 1970s. I must remind myself, however, that truth never seems obvious until it is known. When Attitudes Predict Behavior The reason—now obvious—why our behavior and our expressed attitudes differ is that both are subject to other influences. Many other influences. One social psychol- ogist counted 40 factors that complicate their relationship (Triandis, 1982; see also Kraus, 1995). Our attitudes do predict our behavior when these other influences on what we say and do are minimal, when the attitude is specific to the behavior, and when the attitude is potent. WHEN SOCIAL INFLUENCES ON WHAT WE SAY ARE MINIMAL “I have opinions of my own, Unlike a physician measuring heart rate, social psychologists never get a direct strong opinions, but I don’t reading on attitudes. Rather, we measure expressed attitudes. Like other behaviors, expressions are subject to outside influences. Sometimes, for example, we say what always agree with them.” we think others want to hear. In late 2002, many U.S. legislators, sensing their coun- —PRESIDENT GEORGE H. W. try’s post-9/11 fear, anger, and patriotic fervor, publicly voted support for President BUSH Bush’s planned war against Iraq while privately having reservations (Nagourney, 2002). On the roll-call vote, strong social influence—fear of criticism—had distorted the true sentiments. Today’s social psychologists have some clever means at their disposal for mini- mizing social influences on people’s attitude reports. Some of these complement traditional self-report measures of explicit (conscious) attitudes with measures of implicit (unconscious) attitudes. One such test measures facial muscle responses to various statements (Cacioppo & Petty, 1981). Those measurements, the researchers

