28. May 2011 · 4 comments · Categories: psychology

This xkcd cartoon made me laugh out loud, because it beautifully captures my attitude towards sports. The mouse-over comment on xkcd points out that it applies to financial analysis too, and (more directly), Dungeons and Dragons:

But then I got to thinking. Huge numbers of people are fascinated by the narratives that these commentators produce. There’s something compelling about declaring allegiance to one of the weighted random number generators (sports teams, stock picks, etc.), selecting which of the narratives to believe based on that allegiance, and then hoping (and perhaps betting) that it will produce the numbers you want. Sometimes the numbers turn out the way you’d hoped, and sometimes they don’t, but either way people prefer to believe in the narratives rather than acknowledge the randomness. There’s a thrill to the process, but at the same time, a strong sense that everyone else’s narratives are delusional.

What if that’s how most people view science? What if they think that scientists are no different from sports commentators and financial analysts, sounding knowledgeable, but mostly just making up explanations based on wishful thinking? What if people believe that scientists’ explanations are just as unreliable as those of sports commentators, and that therefore you can pick what to believe based on tribal affiliations, rather than on, say, the weight of evidence?

Certainly, that’s how many non-scientist commentators approach climate science. Each new paper published is like a new game outcome. The IPCC team may have won a lot of games in the past, but it’s very unpopular among some groups of fans. Results that suggest some other teams are on the rise can be spun into fabulous narratives about how the IPCC team is past its prime, and cruising towards a big upset.

For those who have no idea what science does, the caption for the cartoon might just as well read “All climate science”.

31. May 2010 · 5 comments · Categories: psychology

I’m fascinated by the cognitive biases that affect people’s perceptions of climate change. I’ve previously written about the Dunning-Kruger effect (the least competent people tend to vastly over-rate their competence), and Kahan and Braman’s studies on social epistemology (people tend to ignore empirical evidence if its conclusions contradict their existing worldview).

Now comes a study by Nyhan and Reifler in the journal Political Behavior entitled “When Corrections Fail: The Persistence of Political Misperceptions”. N&R point out that there is plenty of evidence in the literature that people often “base their policy preferences on false, misleading or unsubstantiated information that they believe to be true”. Some studies have shown that providing people with correct factual information doesn’t necessarily affect their beliefs. However, studies disagree on this point, partly because it’s often not clear whether the subjects’ beliefs changed at all, and partly because previous studies have differed over how the factual information is presented (and even what counts as ‘factual’).

So N&R set out to study directly whether corrective information does indeed change erroneous beliefs. Most importantly, they were interested in what happens when this corrective information isn’t presented as an authoritative account of the truth, but rather (as happens more often) as part of a larger, more equivocal set of stories in the media. One obvious factor that causes people to preserve erroneous beliefs is selective reading: people tend to seek out information that supports their existing beliefs, and hence often don’t encounter information that corrects their misperceptions. And even when people do encounter corrective information, they are more likely to reject it (e.g. by thinking up counter-arguments) if it contradicts their prior beliefs. It is this latter process that N&R investigated, and in particular whether this process of thinking up counter-arguments can actually reinforce the misperception; they dub this a “correction backfire”.

Four studies were conducted. In each case, the subjects were presented with a report of a speech by a well-known public figure. When a factual correction was added to the article, those subjects who were most likely to agree with the contents of the speech were unmoved by the correction, and in some of the studies the correction actually strengthened their belief in the erroneous information (i.e. a clear ‘correction backfire’):

  • Study 1, conducted in the fall of 2005, examined beliefs about whether Iraq had weapons of mass destruction prior to the US invasion. Subjects were presented with a newspaper article describing a speech by President Bush in which he talks about the risk of Iraq passing these weapons on to terrorist networks. In the correction treatment, the article goes on to describe the findings of the Duelfer report, which concluded that there were, in fact, no weapons of mass destruction. The results show a clear correction backfire for conservatives – the correction significantly increased their belief that Iraq really did have such weapons, while for liberals, the correction clearly decreased their belief.
  • Study 2, conducted in the spring of 2006, repeated study 1 with some variation in the wording. This study again showed that the correction was ineffective for conservatives – it didn’t decrease their belief in the existence of the weapons. However, unlike study 1, it didn’t show a correction backfire, although a re-analysis of the results indicated that there was such a backfire among those conservatives who most strongly supported the Iraq war. This study also tested the effect of the source of the newspaper report – i.e. does it matter whether it’s presented as being from the New York Times (perceived by many conservatives to have a liberal bias) or Fox News (perceived as conservative)? In this case, the source of the article made no significant difference.
  • Study 3, also conducted in 2006, examined the belief that the Bush tax cuts paid for themselves by stimulating enough economic growth to increase government revenues. Subjects were presented with an article in which President Bush claimed the tax cuts had helped to increase revenues. In the correction treatment, the article goes on to present the actual figures, showing that tax revenues declined sharply (both absolutely, and as a proportion of GDP) in the years after the tax cuts were enacted. Again there was a clear correction backfire among conservatives – those who received the correction actually increased their belief that the tax cuts paid for themselves.
  • Study 4, also from 2006, examined the belief that Bush banned stem cell research. Subjects were presented with an article describing speeches by Senators Edwards and Kerry in which they suggest such a ban exists. In the correction treatment, a paragraph was added explaining that Bush didn’t actually ban stem cell research, because his restrictions didn’t apply to privately funded research. In this case, the correction did not change liberals’ belief that there was such a ban, but neither did it backfire (i.e. it didn’t strengthen their belief in the ban).

In summary, factual corrections in newspaper articles don’t appear to work for those who are ideologically motivated to hold the misperception, and in two out of the four studies, it actually strengthened the misperception. So, fact-checking on its own is not enough to overcome ideologically-driven beliefs. (h/t to Ben Goldacre for this)

How does this relate to climate change? Well, most media reports on climate change don’t even attempt any fact-checking anyway – they ignore the vast body of assessment reports by authoritative scientific bodies, and present a “he-said-she-said” slugfest between denialists and climate scientists. The sad thing is that the addition of fact-checking won’t, on its own, make any difference to those whose denial of climate change is driven by their ideological leanings. If anything, such fact-checking will make them even more entrenched…

Dear God, I would like to file a bug report

(clickety-click for the full xkcd cartoon)

Kate asked the question last week, “How do you stay sane?” (while fighting the misinformation campaigns and worrying about our prospects for averting dangerous climate change). Kate’s post reminded me of a post I did last year on climate trauma, and specifically of the essay by Gillian Caldwell, in which she compares the emotional burnout that many of us feel when dealing with climate change with other types of psychological trauma. I originally read it at a time when I was overdoing things, working late into the evenings, going to bed exhausted, and then finding myself unable to sleep because my head was buzzing with everything I’d just been working on. Gillian’s essay struck a chord.

I took on board many of the climate trauma survival tips, and in particular I started avoiding climate-related work in the evenings. My blogging rate went down, and I started sleeping and exercising properly again. But good habits can be hard to maintain, and I realise that over the last few months I’ve been overdoing it again. As it was March break last week, we took a snap decision to take some time off, and took the kids skiing in Quebec. We even managed to fit in trips to Ottawa and Montreal en route, as the kids hadn’t been to either city.

The trip was great, but it wasn’t 100% effective as a complete break. I was reminded of climate change throughout: I didn’t need a coat in Ottawa (in March!!), and we picnicked outdoors in Montreal (in March!!). There’s no snow left in the Laurentides (except on the ski slopes), and we found ourselves skiing in hot sunshine (which meant that by mid-afternoon the slopes were covered in piles of wet slush). The ski operators told us they normally stay open through mid-April, but that looks extremely unlikely this year. And sure enough, I returned to the news that Canada has experienced its warmest winter ever recorded, and that we’re on course for the hottest year ever. It can’t be good news for the ski industry.

And it’s not good news for me either, because I’m now back to blogging late into the evening again…

…is a section heading on page 23 of this new report, “Psychology and Global Climate Change: Addressing a Multi-faceted Phenomenon and Set of Challenges”, from the APA on how the field of psychology can contribute to tackling the climate crisis. It’s a very good report, covering many of the core issues in the psychology of climate change, and laying out a research agenda for the field. Let me just quote the final paragraph of the report:

“a psychological perspective is crucial to understanding the probable effects of climate change, to reducing the human drivers of climate change, and to enabling effective social adaptation.  By summarizing the relevant psychological research, we hope not only to enhance recognition of the important role of psychology by both psychologists and non-psychologists, but also to encourage psychologists to be more aware of the relevance of global climate change to our professional interests and enable them to make more of the contributions the discipline can offer.”

Or, if you’re short of time, just read the press release.

Now, where’s the equivalent task force for the computer science community?

I’ve spent some time pondering why so many people seem unable or unwilling to understand the seriousness of climate change. Only half of all Americans understand that warming is happening because of our use of fossil fuels. And clearly many people still believe the science is equivocal. Having spent many hours arguing with denialists, I’ve come to the conclusion that they don’t approach climate change in a scientific way (even those who are trained as scientists), even though they often appear to engage in scientific discourse. Rather than assessing all the evidence and trying to understand the big picture, climate denialists start from their preferred conclusion and work backwards, selecting only the evidence that supports the conclusion.

But why? Why do so many people approach global warming in this manner? Previously I speculated that the Dunning-Kruger effect might explain some of this. This effect occurs when people at the lower end of the ability scale vastly overestimate their own competence. Combine this with the observation that few people really understand the basic system dynamics: for example, concentrations of greenhouse gases in the atmosphere will continue to rise even if emissions are reduced, as long as the rate of emissions (from burning fossil fuels) exceeds the rate of removal (e.g. sequestration by the oceans). The Dunning-Kruger effect suggests that people whose reasoning is based on faulty mental models are unlikely to realise it.
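To make that stock-and-flow point concrete, here is a minimal “bathtub” sketch with invented numbers and a removal rate held constant (real carbon-cycle dynamics are far more complicated than this). The point is simply that the stock keeps rising for as long as the inflow exceeds the outflow, even though emissions fall every year:

```python
# A toy "bathtub" sketch of the stock-and-flow point above. This is NOT a
# climate model: the numbers are invented and the removal rate is held
# constant, purely to illustrate the dynamics.

concentration = 390.0   # atmospheric CO2 stock, in ppm (illustrative)
emissions = 4.0         # inflow: ppm/year added by burning fossil fuels (illustrative)
removal = 2.0           # outflow: ppm/year absorbed by ocean and land sinks (illustrative)

for year in range(1, 31):
    emissions *= 0.97                      # emissions cut by 3% every year
    concentration += emissions - removal   # stock changes by inflow minus outflow
    print(f"year {year:2d}: emissions {emissions:4.2f} ppm/yr, "
          f"concentration {concentration:6.1f} ppm")

# Despite emissions falling every year, the concentration keeps climbing until
# emissions finally drop below the removal rate (around year 23 in this run).
```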

While incorrect mental models and overconfidence might explain some of the problem that people have in accepting the scale and urgency of the problem, it doesn’t really explain the argumentation style of climate denialists, particularly the way in which they latch onto anything that appears to be a weakness or an error in the science, while ignoring the vast majority of the evidence in the published literature.

However, a series of studies by Kahan, Braman and colleagues explains this behaviour very well. In investigating a key question in social epistemology, Kahan and Braman set out to study why strong political disagreements persist in many areas of public policy, even in the face of clear evidence about the efficacy of certain policy choices. These studies reveal a process they term cultural cognition, by which people filter (scientific) evidence according to how well it fits their cultural orientation. The studies explore this phenomenon for contentious issues such as the death penalty, gun control and environmental protection, as well as issues one might expect to be less contentious, such as immunization and nanotechnology. It turns out that not only do people care about how well various public policies cohere with their existing cultural worldviews, but their beliefs about the empirical evidence are also derived from these cultural worldviews.

For example, in a large-scale survey, they tested people’s perceptions of the risks from global warming, gun ownership, nanotechnology and immunization. They assessed how well these perceptions correlate with a number of characteristics, including gender, education, income, political affiliation, and so on. While political party affiliation correlates well with attitudes on some of these issues, there was a generally stronger correlation across the board with the two dimensions of cultural values identified by Douglas and Wildavsky: ‘group’ and ‘grid’. The group dimension assesses whether people are more oriented towards individual needs (‘individualist’) or the needs of the group (‘communitarian’); the grid dimension assesses whether people believe societal roles should be well defined and differentiated (‘hierarchical’) or more equal and less rigid (‘egalitarian’).

The most interesting part of the study, for me, is an experiment on how perceptions change depending on how the risk of global warming is presented. About 500 subjects were given one of two different newspaper articles to read, both of which summarized the findings of a scientific report about the threat of climate change. In one version, the scientists were described as calling for anti-pollution regulations, while in the other, they were calling for investment in more nuclear power. Both were compared with a control group who saw neither version of the report. Here are the results (adapted from Kahan et al, with a couple of corrections supplied by the authors):

[Figures 3a and 3b from Kahan et al: mean risk perception of global warming, by cultural worldview and article condition]

In all cases, the mean risk assessment of the subjects correlates with their position on these dimensions: individualists and hierarchs are much less worried about global warming than communitarians and egalitarians. But more interestingly, the two different newspaper articles affect these perceptions in different ways. For the article that described scientists as calling for anti-pollution measures, people had quite opposite reactions: for communitarians and egalitarians, it increased their perception of the risk from global warming, but for individualists and hierarchs, it decreased their perception of the risk. When the same facts about the science are presented in an article that calls for more nuclear power, there is almost no effect. In other words, people assessed the facts in the report about climate change according to how well the policy prescription fits with their existing worldview.

There are some interesting consequences of this phenomenon. For example, Kahan and Braman argue that there is really no war over ideology in the US, just lots of people with well-established cultural worldviews, who simply decide what facts (scientific evidence) to believe based on these views. The culture war is therefore really a war over facts, not ideology.

The studies also suggest that certain political strategies are doomed to failure. For example, a common strategy when trying to resolve contentious policy issues is to attempt to detach the policy question from political ideologies, and focus on the available evidence about the consequences of the policy. Kahan and Braman’s studies show this won’t work, because different cultural worldviews prevent people from agreeing on what the consequences of a particular policy will be (no matter what empirical evidence is available). Instead, they argue that policymakers must find ways of framing policy that affirm the values of diverse cultural worldviews simultaneously.

As an example, for gun control, they suggest offering a bounty (e.g. a tax rebate) for people who register handguns. Both pro- and anti-gun-control groups might view this as beneficial to them, even though they disagree on the nature of the problem. For climate change, the equivalent policy prescriptions include tradeable emissions permits (which appeal to individualists and hierarchists), and more nuclear power (which egalitarians and hierarchists tend to view as less risky when presented as a solution to global warming).

Update: There’s a very good opinion piece by Kahan in the January 21, 2010 issue of Nature.

11. June 2009 · 3 comments · Categories: psychology

I’ve been thinking a lot recently about why so few people seem to understand the severity of the climate crisis. Joe Romm argues that it’s a problem of rhetoric: the deniers tend to be excellent at rhetoric, while scientists are lousy at it. He suggests that climate scientists are pretty much doomed to lose in any public debates (and hence debates on this are a bad idea).

But even away from the public stage, it’s very frustrating trying to talk to people who are convinced climate change isn’t real, because, in general, they seem unable to recognize fallacies in their own arguments. One explanation seems to be the Dunning-Kruger effect – a cognitive bias in people’s subjective assessment of their (in)competence. The classic paper is “Unskilled and Unaware of It: How Difficulties in Recognizing One’s Own Incompetence Lead to Inflated Self-Assessments” (for which title, Dunning and Kruger were awarded an Ig Nobel Prize in 2000 🙂 ). There’s also a brilliant YouTube video that explains the paper. The bottom line is that the people who are most wrong are the least able to perceive it.

In a follow-up paper, they describe a number of experiments that investigate why people fail to recognize their own incompetence. Turns out one of the factors is that people take a “top-down” approach in assessing their competence. People tend to assess how well they did in some task based on their preconceived notions of how good they are at the skills needed, rather than any reflection on their actual performance. For example, in one experiment, the researchers gave subjects a particular test. Some were told it was a test of abstract reasoning ability (which the subjects thought they were good at). Others were told it was a test of programming ability (which the subjects thought they were bad at). It was, of course, the same test, and the subjects from both groups did equally well on it. But their estimation of how well they did depended on what kind of test they thought it was.

There’s also an interesting implication for why women tend to drop out of science and technology careers – women tend to rate themselves as less scientifically talented than men, regardless of their actual performance. This means that, even when women are performing just as well as their male colleagues, they will still tend to rate themselves as doing less well, because of this top-down assessment bias.

To me, the most interesting part of the research is a whole bunch of graphs that look like this:

[Figure: perceived vs. actual test performance by quartile, from Kruger and Dunning]

People who are in the bottom quartile of actual performance tend to dramatically over-estimate how well they did. People in the top quartile tend to slightly under-estimate how well they did.

This explains why scientists have a great difficulty convincing the general public about important matters such as climate change. The most competent scientists will systematically under-estimate their ability, and will be correspondingly modest when presenting their work. The incompetent (e.g. those who don’t understand the science) will tend to vastly over-inflate their professed expertise when presenting their ideas. No wonder the public can’t figure out who to believe. Furthermore, people who just don’t understand basic science also tend not to realize it.

Which also leads me to suggest that if you want to find the most competent people, look for people who tend to underestimate their abilities. It’s rational to believe people who admit they might be wrong.

Disclaimer: psychology is not my field – I might have completely misunderstood all this.

Lately I’ve been advocating for smart people to start asking themselves how their special skills and expertise can be adapted to the challenge of climate change. And for them to get involved and do something. And I don’t just mean dabble around with trying to live a greener lifestyle. I mean to jump in completely and devote their careers to this. Because this is a planetary emergency, and we need a massive brain gain to address it. And because we have a moral obligation to act. (Wish me luck: I’ll be pitching this message to software engineers next week).

But having immersed myself in the climate science for the last couple of years, I’m also aware of a huge cognitive dissonance. It’s like this incredible horrifying secret: the climate scientists have mapped out an apocalyptic future, demonstrating the urgency and the magnitude of the challenge, and have even calculated the probability factors. But most of the rest of the world is blissfully unaware. They carry on living their lives, burning through fossil fuels like there’s no tomorrow. Why is it not in the papers every day? Why do politicians make speeches and conduct election campaigns with barely a mention of it? Why aren’t there protest marches and sit-ins and hunger strikes?

I frequently meet people who don’t want to know. Some of them have convinced themselves it’s not happening. More often, they treat it as some vague future threat that they’re too busy to worry about right now (and after all, they’ve already changed their lightbulbs). And some admit it’s too scary to talk about. Almost none of them are willing to take the time to explore what the climate scientists have to say.

And I have to admit, all of these people probably sleep better than I do. They might even be making good rational choices. Because if you spend too long immersed in the science and politics of climate change, there’s a serious danger of “climate trauma”, which appears to be as serious as other kinds of trauma. Gillian Caldwell discusses this at length, and has a bunch of excellent tips for dealing with it. Add those to the tips from the Australian Psychological Society that Jon blogged about a few months ago. Because, if you’ve read this far and want to get involved, you’ll need to heed this advice.

In our brainstorm session yesterday, someone (Faraz?) suggested I could kick off the ICSE session with a short video. The closest thing I can think of is this:

Wake Up, Freak Out – then Get a Grip

It’s not too long, it covers the recent science very well, and it is exactly the message I want to give – climate change is serious, urgent, demands massive systemic change, but is not something we should despair over. It also comes with a full transcript with detailed references into the primary scientific literature, which is well worth a browse.

Except that it scares the heck out of me every time I watch it. Could I really show this to an ICSE audience?

  1. Because their salaries depend on them not understanding. Applies to anyone working for the big oil companies, and apparently to a handful of “scientists” funded by them.
  2. Because they cannot distinguish between pseudo-science and science. Seems to apply to some journalists, unfortunately.
  3. Because the dynamics of complex systems are inherently hard to understand. Shown to be a major factor by the experiments Sterman did on MIT students.
  4. Because all of the proposed solutions are incompatible with their ideology. Applies to most rightwing political parties, unfortunately.
  5. Because scientists are poor communicators. Or, more precisely, few scientists can explain their work well to non-scientists.
  6. Because they believe their god(s) would never let it happen. And there’s also a lunatic subgroup who welcome it as part of god’s plan (see rapture).
  7. Because most of the key ideas are counter-intuitive. After all, a couple of degrees warmer is too small to feel.
  8. Because the truth is just too scary. There seem to be plenty of people who accept that it’s happening, but don’t want to know any more because the whole thing is just too huge to think about.
  9. Because they’ve learned that anyone who claims the end of the world is coming must be a crackpot. Although these days, I suspect this one is just a rhetorical device used by people in groups (1) and (4), rather than a genuine reason.
  10. Because most of the people they talk to, and most of the stuff they read in the media also suffers from some of the above. Selective attention allows people to ignore anything that challenges their worldview.

But I fear the most insidious reason is that people think changing their lightbulbs and sorting their recyclables counts as “doing your bit”. This idea allows you to stop thinking about it, and hence to ignore just how serious a problem it really is.