
“There Must Be a Reason”: Osama, Saddam, and Inferred Justification

Monica Prasad, Northwestern University
Andrew J. Perrin, University of North Carolina, Chapel Hill
Kieran Bezila, Northwestern University
Steve G. Hoffman, University at Buffalo, State University of New York (SUNY)
Kate Kindleberger, Northwestern University
Kim Manturuk, University of North Carolina, Chapel Hill
Ashleigh Smith Powers, Millsaps College

One of the most curious aspects of the 2004 presidential election was the strength and resilience of the belief among many Americans that Saddam Hussein was linked to the terrorist attacks of September 11. Scholars have suggested that this belief was the result of a campaign of false information and innuendo from the Bush administration. We call this the information environment explanation. Using a technique of “challenge interviews” on a sample of voters who reported believing in a link between Saddam and 9/11, we propose instead a social psychological explanation for the belief in this link. We identify a number of social psychological mechanisms voters use to maintain false beliefs in the face of disconfirming information, and we show that for a subset of voters the main reason to believe in the link was that it made sense of the administration’s decision to go to war against Iraq. We call this inferred justification: for these voters, the fact of the war led to a search for a justification for it, which led them to infer the existence of ties between Iraq and 9/11.

***

Ronald Reagan once remarked that “the trouble with our liberal friends is not that they are ignorant, but that they know so much that isn’t so” (Reagan 1964). His comment goes to the heart of one of the most contentious issues in democratic theory: how should democracies handle mistaken beliefs? False beliefs present a potentially serious challenge to democratic theory and practice, as citizens with incorrect information cannot form appropriate preferences or evaluate the preferences of others. Kuklinski and colleagues (2002) have demonstrated that incorrect beliefs—as distinct from mere lack of information, a more thoroughly studied phenomenon (e.g., Delli Carpini and Keeter 1997)—are widespread and underlie substantial differences in policy preferences. One of the most curious “false beliefs” of the 2004 presidential election was the strength and resilience of the belief among many Americans that Saddam Hussein was linked to the terrorist attacks of September 11, 2001. Over the course of the campaign, several polls showed that majorities of respondents believed that Saddam Hussein was either partly or largely responsible for the 9/11 attacks (see Althaus and Largio 2004 for a summary of polling evidence, and Kull, Ramsay, and Lewis 2003 on closely related questions). This percentage declined slowly, dipping below 50 percent only in late 2003 (Everts and Isernia 2005). This misperception persisted despite mounting evidence and a broad official consensus that no such link existed.

Explanations for this have generally suggested that the misperception of a link resulted from a campaign of innuendo carried out by the Bush administration that explicitly and implicitly linked Saddam with Al Qaeda. For example, Gershkoff and Kushner (2005:525) argue that “the Bush administration successfully convinced [a majority of the public] that a link existed between Saddam Hussein and terrorism generally, and between Saddam Hussein and Al Qaeda specifically.” We characterize this explanation as being about the information environment: it implies that if voters had possessed the correct information, they would not have believed in the link. Underlying this explanation is a psychological model of information processing that scholars have labeled “Bayesian updating,” which envisions decision makers incrementally and rationally changing their opinions in accordance with new information (Gerber and Green 1999).
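To make the Bayesian updating model concrete, here is a minimal illustration in notation of our own (it is not drawn from the article): let H be the hypothesis that Saddam Hussein was linked to 9/11 and E a newly encountered piece of disconfirming evidence. Bayes’ rule prescribes

\[
  P(H \mid E) \;=\; \frac{P(E \mid H)\,P(H)}{P(E \mid H)\,P(H) + P(E \mid \neg H)\,P(\neg H)},
\]

so when the evidence is much more likely if no link exists, that is, when \(P(E \mid \neg H) \gg P(E \mid H)\), the updated belief \(P(H \mid E)\) must fall below the prior \(P(H)\). On the information environment account, a steady stream of such evidence should therefore have eroded the belief over time.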

In this article we present data that contest this explanation, and we develop a social psychological explanation for the belief in the link between Saddam and Al Qaeda. We argue that the primary causal agent for misperception is not the presence or absence of correct information but a respondent’s willingness to believe particular kinds of information. Our explanation draws on a psychological model of information processing that scholars have labeled motivated reasoning. This model envisions respondents as processing and responding to information defensively, accepting and seeking out confirming information, while ignoring, discrediting the source of, or arguing against the substance of contrary information (DiMaggio 1997; Kunda 1990; Lodge and Taber 2000). Motivated reasoning is a descendant of the social psychological theory of cognitive dissonance (Festinger and Carlsmith 1959; Kunda 1990), which posits an unconscious impulse to relieve cognitive tension when a respondent is presented with information that contradicts preexisting beliefs or preferences. Recent literature on motivated reasoning builds on cognitive dissonance theory to explain how citizens relieve cognitive dissonance: they avoid inconsistency, ignore challenging information altogether, discredit the information source, or argue substantively against the challenge (Jobe, Tourangeau, and Smith 1993; Lodge and Taber 2000; Westen et al. 2006). The process of substantive counterarguing is especially consequential, as the cognitive exercise of generating counterarguments often has the ironic effect of solidifying and strengthening the original opinion, leading to entrenched, polarized attitudes (Kunda 1990; Lodge and Taber 2000; Sunstein 2000). This confirmation bias means that people value evidence that confirms their previously held beliefs more highly than evidence that contradicts them, regardless of the source (DiMaggio 1997; Nickerson 1998; Wason 1968).

We also draw on social psychological theories that focus on the use of heuristics, decision-making shortcuts that allow respondents to avoid time- and resource-intensive processes of rational deliberation. Within political psychology, scholars have shown that heuristics such as party, ideology, endorsements, opinion polls, and candidate appearance allow voters to evaluate a candidate quickly by matching an easily available external marker to preferences held by the voter, without investing the time necessary to investigate the background, preferences, and positions of every individual candidate (Lau and Redlawsk 2001; see also Popkin 1994)....

Related Links

  • Newsweek story about this journal article
  • Read entire article at Sociological Inquiry (journal)