31. May 2010 · 5 comments · Categories: psychology

I’m fascinated by the cognitive biases that affect people’s perceptions of climate change. I’ve previously written about the Dunning-Kruger effect (the least competent people tend to vastly over-rate their competence), and Kahan and Braman’s studies on social epistemology (people tend to ignore empirical evidence if its conclusions contradict their existing worldview).

Now comes a study by Nyhan and Reifler in the journal Political Behavior entitled “When Corrections Fail: The Persistence of Political Misperceptions”. N&R point out that there is plenty of evidence in the literature that people often “base their policy preferences on false, misleading or unsubstantiated information that they believe to be true”. Studies have shown that providing people with correct factual information doesn’t necessarily affect their beliefs. However, studies disagree about this latter point, partly because it’s often not clear whether subjects’ beliefs changed at all, and partly because previous studies have differed over how the factual information was presented (and even what counts as ‘factual’).

So N&R set out to directly study whether corrective information does indeed change erroneous beliefs. Most importantly, they were interested in what happens when this corrective information isn’t presented directly as an authoritative account of the truth, but rather (as happens more often) when it is presented as part of a larger, more equivocal set of stories in the media. One obvious mechanism by which people preserve erroneous beliefs is selective reading – people tend to seek out information that supports their existing beliefs, and hence often don’t encounter information that corrects their misperceptions. And even when people do encounter corrective information, they are more likely to reject it (e.g. by thinking up counter-arguments) if it contradicts their prior beliefs. It is this latter process that N&R investigated – in particular, whether this process of thinking up counter-arguments can actually reinforce the misperception; they dub this a “correction backfire”.

Four studies were conducted. In each case, the subjects were presented with a report of a speech by a well-known public figure. When a factual correction was added to the article, those subjects who were most likely to agree with the contents of the speech were unmoved by the correction, and in several of the studies, the correction actually strengthened their belief in the erroneous information (i.e. a clear ‘correction backfire’):

  • Study 1, conducted in the fall of 2005, examined beliefs about whether Iraq had weapons of mass destruction prior to the US invasion. Subjects were presented with a newspaper article describing a speech by President Bush in which he talks about the risk of Iraq passing these weapons on to terrorist networks. In the correction treatment, the article goes on to describe the results of the Duelfer report, which concluded there were, in fact, no weapons of mass destruction. The results show a clear correction backfire for conservatives – the correction significantly increased their belief that Iraq really did have such weapons, while for liberals, the correction clearly decreased their belief.
  • Study 2, conducted in the spring of 2006, repeats study 1 with some variation in the wording. This study again showed that the correction was ineffective for conservatives – it didn’t decrease their belief in the existence of the weapons. However, unlike study 1, it didn’t show a correction backfire, although a re-analysis of the results indicated that there was such a backfire among those conservatives who most strongly supported the Iraq war. This study also attempted to test the effect of the source of the newspaper report – i.e. does it matter if it’s presented as being from the New York Times (perceived by many conservatives to have a liberal bias) or Fox News (perceived as being conservative)? In this case, the source of the article made no significant difference.
  • Study 3, also conducted in 2006, examined the belief that the Bush tax cuts paid for themselves by stimulating enough economic growth to actually increase government revenues. Subjects were presented with an article in which President Bush indicated the tax cuts had helped to increase revenues. In the correction treatment, the article goes on to present the actual figures, showing that tax revenues declined sharply (both absolutely and as a proportion of GDP) in the years after the tax cuts were enacted. Again there was a clear correction backfire among conservatives – those receiving the article presenting the actual revenues actually increased their belief that the tax cuts paid for themselves.
  • Study 4, also from 2006, examined the belief that Bush banned stem cell research. Subjects were presented with an article describing speeches by Senators Edwards and Kerry in which they suggest such a ban exists. In the correction treatment, a paragraph was added to explain that Bush didn’t actually ban stem cell research, because his restrictions didn’t apply to privately funded research. The results were that the correction did not change liberals’ belief that there was such a ban, but there was no correction backfire (i.e. it didn’t increase their belief in the ban).

In summary, factual corrections in newspaper articles don’t appear to work for those who are ideologically motivated to hold the misperception, and in two of the four studies, the correction actually strengthened the misperception. So, fact-checking on its own is not enough to overcome ideologically-driven beliefs. (h/t to Ben Goldacre for this)

How does this relate to climate change? Well, most media reports on climate change don’t even attempt any fact-checking anyway – they ignore the vast body of assessment reports by authoritative scientific bodies, and present a “he-said-she-said” slugfest between denialists and climate scientists. The sad thing is that the addition of fact-checking won’t, on its own, make any difference to those whose denial of climate change is driven by their ideological leanings. If anything, such fact-checking will make them even more entrenched…

Dear God, I would like to file a bug report

(clickety-click for the full xkcd cartoon)

5 Comments

  1. Steve, the link to the study is broken. Here’s the full PDF.

    http://www-personal.umich.edu/~bnyhan/nyhan-reifler.pdf

    Thanks for posting this. Next to my layman’s interest in the science, this topic is most fascinating to me.

  2. “…a “correction backfire”.”

    I think this is also called “rescue bias”.

    My Q on all of these – either Jonah Lehrer or Malcolm Gladwell (don’t read both on successive days!) described in his book a word-association exercise that served to reveal one’s subconscious prejudices, blatantly & unambiguously. Is there something similar that could be crafted to reveal our other cognitive bugs – or at least, the delta between the rational being we think we are and the irrational one we actually are?

    Because something like this could help to open some minds.

  3. …although rescue bias doesn’t touch on the *strengthened* belief.

    I wonder about that though, if it’s not akin to a “dead cat bounce”, or – from (the very excellent) Don’t Shoot the Dog – an extinction burst.

  4. See also: casuistic reasoning
    http://www.psychologytoday.com/blog/alternative-truths/201006/the-persistent-illusion-impartiality
    (“we have sophisticated methods of hiding our biases…[In an experiment, students] ranked the relative importance of the selection criteria to support that [pre-existing] preference”)

  5. Hi Steve, Of related interest is a 2006 study by Anthony Leiserowitz: “Climate Change Risk Perception and Policy Preferences: The Role of Affect, Imagery, and Values.”
    http://www.cred.columbia.edu/pdfs/publications/Leiserowitz_ClimaticChange2006.pdf

    Leiserowitz surveys for holistic affect (“Do you have any negative feelings about global warming?”), affective imagery (“What is the first thought or image that comes to your mind when you think of global warming?”), and values derived from Cultural Theory (with similarities to Kahan).

    Among the discussion:
    “[T]he analyses found that negative affect, imagery (naysayers) and values (egalitarian) were consistently stronger predictors of risk perception and policy preferences than all sociodemographic variables, including political party identification and ideology (liberal-conservative).”

    More about Cultural Theory:
    http://peopleandplace.net/media_library/image/2010/3/9/climate_worldviews_and_cultural_theory

Join the discussion: