
Perhaps you are thinking that the use of heuristics and the tendency to be influenced by salience and accessibility don’t seem that important—who really cares if we buy an iPod when the Zune is better, or if we think there are more words that begin with the letter R than there actually are? These aren’t big problems in the overall scheme of things. But it turns out that what may seem to be fairly small errors and biases on the surface can have profound consequences for people.

For one, if the errors occur for a lot of people, they can really add up. Why would so many people continue to buy lottery tickets or to gamble their money in casinos when the likelihood of their ever winning is so low? One possibility, of course, is the representativeness heuristic—people ignore the low base rates of winning and focus their attention on the salient possibility of winning a huge prize. And the belief in astrology, which all scientific evidence suggests is not accurate, is probably driven in part by the salience of the occasions when the predictions do come true—when a horoscope is correct (which it will of course be sometimes), the correct prediction is highly salient and may allow people to maintain the (overall false) belief.
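To see just how low the base rate of winning really is, it can help to work through a rough expected-value calculation. The sketch below is purely illustrative: the ticket price, jackpot size, and odds are assumptions chosen for the example, not figures from the chapter or from any particular lottery.

```python
# Illustrative only: ticket price, jackpot size, and odds are hypothetical
# assumptions for this example, not figures from the chapter.

ticket_price = 2.00              # assumed cost of one ticket, in dollars
jackpot = 100_000_000            # assumed jackpot, in dollars
p_win = 1 / 292_000_000          # assumed probability of winning the jackpot

expected_payout = p_win * jackpot            # average winnings per ticket
expected_loss = ticket_price - expected_payout

print(f"Expected payout per ticket: ${expected_payout:.2f}")   # about $0.34
print(f"Expected loss per ticket:   ${expected_loss:.2f}")     # about $1.66
```

Even with a nine-figure jackpot, the expected payout under these assumptions works out to only a few cents on the dollar, yet it is the vivid, salient image of winning, not this arithmetic, that tends to guide the decision to play.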

People may also take more care to prepare for unlikely events than for more likely ones because the unlikely ones are more salient or accessible. For instance, people may think that they are more likely to die from a terrorist attack or as the result of a homicide than from diabetes, stroke, or tuberculosis. But the odds of dying from these health problems are far greater than the odds of dying from terrorism or homicide. Because people don’t accurately calibrate their behaviors to match the true potential risks, the individual and societal costs are quite large (Slovic, 2000).

Salience and accessibility also color how we perceive our social worlds, which may have a big influence on our behavior. For instance, people who watch a lot of violent television shows also tend to view the world as more dangerous than do those who watch less violent TV (Doob & Macdonald, 1979). This follows from the idea that our judgments are based on the accessibility of relevant constructs. We also overestimate our contribution to joint projects (Ross & Sicoly, 1979), perhaps in part because our own contributions are so obvious and salient, whereas the contributions of others are much less so. And the use of cognitive heuristics can even affect our views about global warming. Joireman, Barnes, Truelove, and Duell (2010) found that people were more likely to believe in the existence of global warming when they were asked about it on hotter rather than colder days and when they had first been primed with words relating to heat. Thus the principles of salience and accessibility, because they are such an important part of our social judgments, can create a series of biases that have real consequences for how we think and act.

Research has found that even people who should know better—and who need to know better—are subject to cognitive biases. Economists, stock traders, managers, lawyers, and even doctors have been found to make the same kinds of mistakes in their professional activities that people make in their everyday lives (Byrne & McEleney, 2000; Gilovich, Griffin, & Kahneman, 2002; Hilton, 2001). And the use of cognitive heuristics is increased when people are under time pressure (Kruglanski & Freund, 1983) or when they feel threatened (Kassam, Koslov, & Mendes, 2009), exactly the situations that may occur when professionals are required to make their decisions.

Although biases are common, they are not impossible to control, and psychologists and other scientists are working to help people make better decisions. One possibility is to provide people with better feedback. Weather forecasters, for instance, are quite accurate in their decisions, in part because they are able to learn from the clear feedback that they get about the accuracy of their predictions. Other research has found that accessibility biases can be reduced by leading people to consider multiple alternatives rather than focusing only on the most obvious ones, and particularly by leading people to think about exactly the opposite possible outcomes than the ones they are expecting (Hirt, Kardes, & Markman, 2004). And people can also be trained to make better decisions. For instance, Lehman, Lempert, and Nisbett (1988) found that graduate students in medicine, law, and chemistry, but particularly those in psychology, all showed significant improvement in their ability to reason correctly over the course of their graduate training.

The Validity of Eyewitness Testimony

As we have seen in the story of Rickie Johnson that opens this chapter, one social situation in which the accuracy of our person-perception skills is vitally important is the area of eyewitness testimony (Charman & Wells, 2007; Toglia, Read, Ross, & Lindsay, 2007; Wells, Memon, & Penrod, 2006). Every year, thousands of individuals such as Rickie Johnson are charged with and often convicted of crimes based largely on eyewitness evidence. In fact, more than 100 people who were convicted prior to the existence of forensic DNA have now been exonerated by DNA tests, and more than 75% of these people were victims of mistaken eyewitness identification (Wells, Memon, & Penrod, 2006; Fisher, 2011).

The judgments of eyewitnesses are often incorrect, and there is only a small correlation between how accurate and how confident an eyewitness is. Witnesses are frequently overconfident, and one who claims to be absolutely certain about his or her identification is not much more likely to be accurate than one who appears much less sure, making it almost impossible to determine whether a particular witness is accurate or not (Wells & Olson, 2003).

To accurately remember a person or an event at a later time, we must be able to accurately see and store the information in the first place, keep it in memory over time, and then accurately retrieve it later. But the social situation can influence any of these processes, causing errors and biases.

In terms of initial encoding of the memory, crimes normally occur quickly, often in situations that are accompanied by a lot of stress, distraction, and arousal. Typically, the eyewitness gets only a brief glimpse of the person committing the crime, and this may be under poor lighting conditions and from far away. And the eyewitness may not always focus on the most important aspects of the scene. Weapons are highly salient, and if a weapon is present during the crime, the eyewitness may focus on the weapon, which would draw his or her attention away from the individual committing the crime (Steblay, 1997). In one relevant study, Loftus, Loftus, and Messo (1987) showed people slides of a customer walking up to a bank teller and pulling out either a pistol or a checkbook. By tracking eye movements, the researchers determined that people were more likely to look at the gun than at the checkbook and that this reduced their ability to accurately identify the criminal in a lineup that was given later.

People may be particularly inaccurate when they are asked to identify members of a race other than their own (Brigham, Bennett, Meissner, & Mitchell, 2007). In one field study, for example, Meissner and Brigham (2001) sent White, Black, and Hispanic students into convenience stores in El Paso, Texas. Each of the students made a purchase, and the researchers came in later to ask the clerks to identify photos of the shoppers. Results showed that the White, Black, and Mexican American clerks demonstrated the own-race bias: They were all more accurate at identifying customers belonging to their own racial or ethnic group than they were at identifying people from other groups. There seems to be some truth to the adage that “They all look alike”—at least if an individual is looking at someone who is not of his or her race.

Even if information gets encoded properly, memories may become distorted over time. For one thing, people might discuss what they saw with other people, or they might hear accounts from other bystanders or read about the event in the media. Such postevent information can distort the original memories such that the witnesses are no longer sure what the real information is and what was provided later. The problem is that the new, inaccurate information is highly cognitively accessible, whereas the older information is much less so. Even describing a face makes it more difficult to recognize the face later (Dodson, Johnson, & Schooler, 1997).

In an experiment by Loftus and Palmer (1974), participants viewed a film of a traffic accident and then, according to random assignment to experimental conditions, answered one of three questions:

  1. “About how fast were the cars going when they hit each other?”
  2. “About how fast were the cars going when they smashed each other?”
  3. “About how fast were the cars going when they contacted each other?”

As you can see in the following figure, although all the participants saw the same accident, their estimates of the speed of the cars varied by condition. People who had seen the “smashed” question estimated the highest average speed, and those who had seen the “contacted” question estimated the lowest.

Figure 2.6 Reconstructive Memory

Participants viewed a film of a traffic accident and then answered a question about the accident. According to random assignment, the question asked how fast the cars were going when they either “hit,” “smashed,” or “contacted” each other. The wording of the question influenced the participants’ memory of the accident. Data are from Loftus and Palmer (1974).

The situation is particularly problematic when the eyewitnesses are children, because research has found that children are more likely to make incorrect identifications than are adults (Pozzulo & Lindsay, 1998) and are also subject to the own-race identification bias (Pezdek, Blandon-Gitlin, & Moore, 2003). In many cases, when sex abuse charges have been filed against babysitters, teachers, religious officials, and family members, the children are the only source of evidence. The likelihood that children are not accurately remembering the events that have occurred to them creates substantial problems for the legal system.

Another setting in which eyewitnesses may be inaccurate is when they try to identify suspects from mug shots or lineups. A lineup generally includes the suspect and five to seven other innocent people (the fillers), and the eyewitness must pick out the true perpetrator. The problem is that eyewitnesses typically feel pressured to pick a suspect out of the lineup, which increases the likelihood that they will mistakenly pick someone (rather than no one) as the suspect.

Researchers have attempted to better understand how people remember, and potentially misremember, crime scenes and the people involved in them, and to improve how the legal system makes use of eyewitness testimony. In many states, efforts are being made to better inform judges, juries, and lawyers about how inaccurate eyewitness testimony can be. Guidelines have also been proposed to help ensure that child witnesses are questioned in a nonbiasing way (Poole & Lamb, 1998). Steps can also be taken to ensure that lineups yield more accurate eyewitness identifications. Lineups are fairer when the fillers resemble the suspect, when the interviewer makes it clear that the suspect might or might not be present (Steblay, Dysart, Fulero, & Lindsay, 2001), and when the eyewitness has not been shown the same pictures in a mug-shot book prior to the lineup decision. And several recent studies have found that witnesses who make accurate identifications from a lineup reach their decision faster than do witnesses who make mistaken identifications, suggesting that authorities must take into consideration not only the response but how fast it is given (Dunning & Perretta, 2002).

In addition to distorting our memories for events that have actually occurred, misinformation may lead us to falsely remember information that never occurred. Loftus and her colleagues asked parents to provide them with descriptions of events that did (e.g., moving to a new house) and did not (e.g., being lost in a shopping mall) happen to their children. Then (without telling the children which events were real or made-up) the researchers asked the children to imagine both types of events. The children were instructed to “think real hard” about whether the events had occurred (Ceci, Huffman, Smith, & Loftus, 1994). More than half of the children generated stories regarding at least one of the made-up events, and they remained insistent that the events did in fact occur even when told by the researcher that they could not possibly have occurred (Loftus & Pickrell, 1995). Even college students are susceptible to manipulations that make events that did not actually occur seem as if they did (Mazzoni, Loftus, & Kirsch, 2001).

The ease with which memories can be created or implanted is particularly problematic when the events to be recalled have important consequences. Therapists often argue that patients may repress memories of traumatic events they experienced as children, such as childhood sexual abuse, and then recover the events years later as the therapist leads them to recall the information—for instance, by using dream interpretation and hypnosis (Brown, Scheflin, & Hammond, 1998).

But other researchers argue that painful memories such as sexual abuse are usually very well remembered, that few memories are actually repressed, and that even if they are, it is virtually impossible for patients to accurately retrieve them years later (McNally, Bryant, & Ehlers, 2003; Pope, Poliakoff, Parker, Boynes, & Hudson, 2007). These researchers have argued that the procedures used by the therapists to “retrieve” the memories are more likely to actually implant false memories, leading the patients to erroneously recall events that did not actually occur. Because hundreds of people have been accused, and even imprisoned, on the basis of claims about “recovered memory” of child sexual abuse, the accuracy of these memories has important societal implications. Many psychologists now believe that most of these claims of recovered memories are due to implanted, rather than real, memories (Loftus & Ketcham, 1994).

Taken together, then, the problems of eyewitness testimony represent another example of how social cognition—the processes that we use to size up and remember other people—may be influenced, sometimes in a way that creates inaccurate perceptions, by the operation of salience, cognitive accessibility, and other information-processing biases.