Saddam Hussein participated in 9/11. The U.S. was founded as a Christian nation. President Obama is a Muslim. How can we be so clueless?

Clueless, Part II – Political Ignorance

This week Morning Feature looks at cluelessness and its causes. Yesterday we considered common ignorance. Today we examine political ignorance. Saturday we conclude with how to open our own and others’ eyes.

A 2003 USA Today poll showed that 70% of Americans believed Saddam Hussein participated in the 9/11 attacks. You might dismiss that as war hysteria; indeed, a week later President Bush himself said there was no evidence of such a link. Yet a CBS News/New York Times poll revealed that one-third of Americans still believed it … six years later. A 2007 First Amendment Center poll found that 55% believe the Constitution established a Christian nation, though the First Amendment says the opposite: “Congress shall make no law respecting an establishment of religion.” And a 2010 Pew Research poll showed that 18% believe President Obama is a Muslim … up from 12% in 2008.

Yesterday we saw that we’re all subject to common ignorance, but politics seems to breed exceptional ignorance. Why?

Actions and Factions

One reason for political ignorance is, ironically, political action. Research on cognitive dissonance shows we are more likely to defend a belief once we have acted on it. We want to be good people who make wise, rational decisions, and we want to keep believing that even when later experience shows we may have been wrong. Unfortunately, we also have a laundry list of cognitive biases – from anchoring to zero-risk bias – that let us rationalize our decisions and actions despite contrary evidence.

And because politics is factional, we can help each other stay ignorant. Confirmation bias can easily confound us as individuals. In a 1979 Stanford University study, researchers found that individuals rated evidence as more reliable when it confirmed their beliefs, and that they selectively relied on favorable evidence to strengthen those beliefs even when presented with contradictory evidence. The trend is even more pronounced in groups, which tend to drift toward positions more extreme than their members’ original beliefs – a pattern psychologists call group polarization.

Add a wider selection of biased information through cable news and the Internet, and the clustering of like-minded people in what Bill Bishop calls The Big Sort, and it is increasingly possible to live on “information islands” where we see, read, hear, and discuss only evidence that confirms what we already believe.

It’s not Teh Stoopid

Pundits and activists often discuss political ignorance as if the ignorant – those who believe claims that are demonstrably false – were simply stupid. But research suggests that smart people are just as prone to false beliefs. Worse, smart people tend to get stuck in those beliefs more deeply. Michael Shermer cites a study published in Skeptic which found that students who scored high on tests of scientific knowledge were no less likely to believe pseudoscientific claims than students who scored poorly. The study’s authors concluded that, when it comes to science, “Students are taught what to think but not how to think.”

Psychologist and physician Edward de Bono argues that this is because of what he calls the intelligence trap:

A highly intelligent person will often take a certain view on a subject and then use his or her thinking just to support that view. This will be done with arguments that make a great deal of sense. But the more able a thinker is to support a point of view the less inclined is that thinker actually to explore the subject. Since the original point of view may be based on prejudice or habit, this failure to explore the subject is bad thinking.

In short, the more intelligent we are, the more easily we can construct a compelling argument … based on prejudice or habit rather than actual evidence. Worse:

Intelligent people who are not master thinkers do not like being wrong. Their ego and sense of personal worth has been built around their intelligence so it becomes very difficult to admit an error. This means that such people do all they can to avoid admitting an error.

Emotion and “Reason”

In unSpun, Brooks Jackson and Kathleen Hall Jamieson of the Annenberg Public Policy Center discuss a 2004 Emory University study by psychologist Drew Westen. In that study, Dr. Westen conducted brain scans while subjects evaluated statements made by incumbent President George W. Bush and Democratic challenger John Kerry. He studied two groups, one with 15 Bush supporters and the other with 15 Kerry supporters:

Not surprisingly, each group judged the other’s candidate harshly but let its own candidate off fairly easy – clear evidence of bias. More interesting was what the brain scans showed. “We did not see any increased activation of the parts of the brain normally engaged during reasoning,” Westen said in announcing his results. “What we saw instead was a network of emotion circuits lighting up.”

Furthermore, after the partisans had come to conclusions favorable to their candidates, their brain scans showed activity in circuits associated with reward, rather as the brains of addicts do when they get a fix. “Essentially, it appears as if partisans twirl the cognitive kaleidoscope until they get the conclusions they want, and then they get massively reinforced for it,” Westen said.

And our “reasons” often aren’t. In a 1978 study by Harvard psychologist Ellen Langer, researchers tried to cut ahead of people waiting in line at a university copy machine. On some attempts they asked, “Excuse me, may I use the Xerox machine?” without giving any reason. On other attempts they asked, “Excuse me, may I use the Xerox machine, because I’m in a rush?” As expected, subjects were more likely to let them cut in line when they gave a reason (94%) than when they gave none (60%).

But here’s the kicker: the reason didn’t matter.

The researchers were almost exactly as successful (93%) when they said “Excuse me, may I use the Xerox machine, because I have to make some copies?” That’s really no “reason” at all. Why else would someone want to use the copy machine? Dr. Langer concluded that people simply assume whatever follows the word “because” will be a reason. She called it “mindlessness.”

Whether mindless or clueless, we should learn to be more skeptical. And tomorrow we’ll discuss how to do that.

+++++

Happy Friday!