Given the terrible internet connection and complete lack of power outlets in most meeting rooms, I’ve been reduced to taking paper and pencil notes, so I’m way behind in the blogging. So I’ll console myself with a quick tour of everyone else’s blogs about this AGU meeting:

First, and foremost, the AGU press office is putting together its own blog, coordinated by Mohi Kumar. It does a much better job of capturing the highlights than I do, partly because the staff writers can sample across all the sessions, and partly because they’re good at summarizing (unlike my own overly-detailed blow-by-blow accounts of talks).

There’s lots of talk here about impacts, especially on water. Water, Water Everywhere–Except There, and There, and There is a nice summary of some of the hydrology sessions, with the new results from GRACE being an obvious highlight. And the session on Climate Change in the West covered some pretty serious impacts on California.

But the biggest buzz so far at the meeting seems to be science literacy and how to deal with the rising tide of anti-science. Capitol Hill Needs Earth Scientists reports on a workshop on communicating with Congress, and Talking about Climate: A Monday Night Town Hall Meeting tackled how to talk to the press. Both of these pick up on the idea that the science just isn’t getting through to the people who need to know it, and that we have to fix this urgently. I missed both of these, but managed to attend the presentations this morning on science literacy: Can Scientists Convince the Growing Number of Global Warming Skeptics?, which included a beautifully clear and concise summary of the Six Americas study. I’ll post my own detailed notes from this session soon. The recently leaked CRU emails have come up a lot this week, in every case as an example of how little the broader world understands about how science is done. Oh, if only more people could come to the AGU meeting and hang out with climate scientists. Anyway, the events of the last few weeks lent a slight sense of desperation to all these sessions on communicating science.

And here’s something that could do with getting across to policymakers – a new study by Davis and Caldeira on consumption-based accounting – how much of the carbon emissions of the developing world are really just outsourced emissions from the developed world, as we look for cheaper ways to feed our consumption habits.

But there are good examples of how science communication should be done. The Emiliani Lecture and the Tough Task of Paleoceanographers, for instance, does a great job of explaining the scientific process, in this case an attempt to unravel a mystery about changes in ocean currents around the Indonesian straits, due to varying El Niño cycles. The point, of course, is that scientists are refreshingly open about it when they discover their initial ideas are wrong.

And last, but not least, everyone agrees that Richard Alley’s lecture on CO2 in the Earth’s History was the highlight of the meeting so far, even if it was scheduled to clash with the US Climate Change Research Program report on Impacts.

Yesterday afternoon, I managed to catch the Bjerknes Lecture, which was given by Richard B. Alley: “The Biggest Control Knob: Carbon Dioxide in Earth’s Climate History”. The room was absolutely packed – standing room only; I estimated at least 2,000 people in the audience. And it was easy to see why – Richard is a brilliant speaker, and he was addressing a crucial topic: an account of all the lines of evidence we have of the role of CO2 in climate changes throughout prehistory.

[Update: the AGU has posted the video of the talk]

By way of introduction, he pointed out how many brains are in the room, and how much good we’re all doing. He characterizes himself as not being an atmospheric scientist, except perhaps by default: as he looks more and more at paleo-geology, it becomes clear how important CO2 is. He has found that CO2 makes a great organising principle for his class on the geology of climate change at Penn State, because CO2 keeps cropping up everywhere. So, he’s going to take us through the history to demonstrate this. His central argument is that we have plenty of evidence now (some of it very new) that CO2 dominates all other factors, hence “the biggest control knob” (later in the talk he extended the metaphor by referring to other forcings as fine tuning knobs).

He also pointed out that, from looking at the blogosphere, it’s clear we live in interesting times, with plenty of people WILLING TO SHOUT and distort the science. For example, he’s sure some people will distort his talk title because, sure, there are other things that matter in climate change besides CO2. As an amusing aside, he showed us a chunk of text from an email sent by an alumnus to the administrators at his university, on which he was copied, the gist of which is that Alley’s own research proves CO2 is not the cause of climate change, and hence he is misrepresenting the science and should be dealt with severely for crimes against the citizens of the world. To the amusement of the audience, he pointed out a fundamental error of logic in the first sentence of the email, to illustrate the level of ignorance about CO2 that we’re faced with. Think about this: an audience of 2,000 scientists, all of whom share his frustration at such ignorant rants.

So the history: 4.6 billion years ago, the sun’s output was lower (approx 70% of today’s levels), often referred to as the faint young sun. But we know there was liquid water on earth back then, and the only thing that could explain that is a stronger greenhouse effect. Nothing else works – orbital differences, for example, weren’t big enough. The best explanation for the process so far is the Rock-Weathering Thermostat. CO2 builds up in the atmosphere over time from volcanic activity. As this CO2 warms the planet through the greenhouse effect, the warmer climate increases the chemical weathering of rock, which in turn removes carbon dioxide through the formation of calcium carbonate, which gets washed into the sea and eventually laid down as sediment. Turn up the temperature, and the sequestration of CO2 in the rocks goes faster. If the earth cools down, this process slows, allowing CO2 to build up again in the atmosphere. This process is probably what has kept the planet in the right range for liquid water and life for most of the last 4 billion years.
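
To make the thermostat concrete, here’s a toy numerical sketch of the feedback (my own illustration with invented numbers, not a model from the talk): volcanoes add CO2 at a steady rate, while weathering removes it faster on a warmer planet, so the system settles back towards a stable state from any starting point.

```python
import math

# Toy rock-weathering thermostat. All constants are invented for
# illustration; one loop step stands in for a very long geological interval.
co2 = 4000.0         # ppm: start the planet off very hot
outgassing = 10.0    # ppm of CO2 added per step by volcanoes (assumed constant)

for step in range(300):
    # crude greenhouse relation: warmer when CO2 is higher
    temp = 15.0 + 4.0 * math.log2(co2 / 280.0)
    # chemical weathering runs faster on a warmer planet, drawing CO2 down
    weathering = 0.0005 * co2 * max(temp, 0.0)
    co2 += outgassing - weathering

temp = 15.0 + 4.0 * math.log2(co2 / 280.0)
print(f"CO2 settles near {co2:.0f} ppm, temperature near {temp:.1f}C")
```

Start it cold instead (say, co2 = 100.0) and it converges to the same equilibrium: that’s the negative feedback doing its work, and why the planet stayed in the liquid-water range.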

But can we demonstrate it? The rock thermostat takes millions of years to work, because the principal mechanism is geological. One consequence is that the only way to get to a “snowball earth” (times in the Cryogenian period when the earth was covered in ice even down to the tropics) is for some other cause of change to happen fast – faster than the rock-thermostat effect.

An obvious piece of evidence is in the rock layers. Glacial layers are always covered from above by carbonate rocks, showing that increased carbonation (as the earth warmed) followed periods of icing. This shows part of the mechanism. But to explore the process properly, we need good CO2 paleo-barometers. The gold standard is the ice core record. So far the oldest ice core record goes back 800,000 years, although we only have one record this old. Several records go back 450,000 years, and there are many more shorter records. The younger samples all overlap, giving some confidence that they are correct. We also now know a lot about how to sort out ‘good’ ice core records from poor (contaminated) ones.

But to back up the evidence from the ice cores, there are other techniques with independent assumptions (though none as easy to analyze as ice cores). When they all agree, this gives us more confidence in the reconstructions. One example: the growth of different plant species – higher CO2 gives preference to certain species. Similarly, carbonate shells in the ocean grow in different ways depending on the pH of the ocean (which in turn is driven by atmospheric concentrations of CO2). And there is the fossil-leaf stomata record. Stomata are the pores in leaves that allow them to breathe. Plants grow leaves with more pores when there is low CO2, to allow them to breathe better, and fewer when there is more CO2, to minimize moisture loss.

So, we have a whole bunch of different paths, none of which are perfect, but together work pretty well. Now what about those other controllers, beyond the rock-thermostat effect? CO2 is raised by:

  • more CO2 coming out of volcanoes
  • slower weathering of rock
  • less plant activity
  • less fossil burial.

He showed the graph reconstructing what we know of CO2 levels over the last 400 million years. Ice coverage is shown on the chart as blue bars, showing how far down towards the equator the ice reaches, and this correlates with low CO2 levels from all the different sources of evidence. 251 million years ago, pretty much every animal dies – 95% of marine species wiped out, in the end-Permian extinction. Probable cause: rapid widespread growth of marine green sulfur bacteria that use H2S for photosynthesis. The hydrogen sulphide produced as a result kills most other life off. And it coincides with a very warm period. The process was probably kicked off by increased volcanism (the Siberian Traps) spewing CO2 into the atmosphere. When the ocean is very warm, it’s easy to starve it of oxygen; when it’s cold it’s well oxygenated. This starvation of oxygen killed off most ocean life.

Fast forward to the mid-Cretaceous “saurian sauna”, when there was no ice at sea level at the poles. Again, CO2 was really high. High CO2 explains the warmth (although in this case, the models tend to make it a little too warm at these CO2 levels). Then there was one more blip before the ice ages. (Aside: CO2 is responsible for lots of things, but at least it didn’t kill the dinosaurs; a meteorite did.) The Paleocene-Eocene Thermal Maximum (PETM) brought big temperature changes. It was already hot, and the world got even hotter. Most sea-floor life died out, and the ocean turned acidic. This time, the models have difficulty simulating this much warming. And it happened very fast, although the recovery process matches our carbon cycle models very well. And it shows up everywhere: e.g. leaf damage in fossil leaves at the PETM.

But for many years there was still a mystery: temperature and CO2 levels are highly correlated throughout the earth’s history, with no other way to explain the climate changes, yet occasionally there were places where temperature changes did not match CO2 changes. Over the last couple of decades, as we have refined our knowledge of the CO2 record, these mismatches have mostly disappeared.

Even just two years ago, Alley would have said something was still wrong in the Miocene, but today it looks better. Two years ago, we got new records that improved the match. Two weeks ago, Tripati et al. published a new dataset that agrees even better. So, two years ago the Miocene anomalies looked important; now it looks like CO2 and temperature do track after all.

But what do we say to people who say the lag (CO2 rises tend to lag behind the temperature rise) proves current warming isn’t caused by CO2? We know that orbital changes (the Milankovitch cycles) kick off the ice ages – this was predicted 50 years before we had the data (in the 1970s) to back it up. Temperature never goes far without the CO2, and vice versa, but sometimes one lags the other by about 2 centuries. And a big problem with the Milankovitch cycles is that they only explain a small part of the temperature changes. The rest is when the CO2 changes kick in. Alley offered the following analogy: credit card interest lags debt. By the denialist logic, because interest lags debt, I never have to worry about interest, and the credit card company can never get me. However, a simple numerical model demonstrates that interest can be a bigger cause of overall debt in the long run (even though it lags!). So, it’s basic physics. The orbits initially kick off the warming, but the release of CO2 then kicks in and drives it.
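
Here’s a back-of-envelope version of that credit-card model (the numbers are mine, purely illustrative): the spending comes first and the interest always lags it, yet after a few years the interest is the bigger contributor to the debt.

```python
rate = 0.20   # 20% annual interest
debt = total_spent = total_interest = 0.0

for month in range(1, 61):        # five years
    if month <= 6:                # a short burst of spending: the "orbital kick"
        debt += 500.0
        total_spent += 500.0
    interest = debt * rate / 12   # interest is charged later, on the balance
    debt += interest
    total_interest += interest

print(f"spent {total_spent:.0f}, interest {total_interest:.0f}, final debt {debt:.0f}")
```

The interest lagged the spending by months, yet it ends up dominating the final debt – just as the CO2 lags the orbital trigger yet drives most of the glacial-cycle warming.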

So, CO2 explains almost all the historical temperature change. What’s left? Solar irradiance changes and volcanic changes. When these things change, we do see the change in the temperature record. For solar changes, there clearly aren’t many, and they act like a fine tuning knob, rather than a major control. 40,000 years ago the magnetic field almost stopped (it weakened to about 10% of its current level), letting in huge amounts of cosmic rays, but the climate ignored it. Hence, we know cosmic rays are at best a fine tuning knob. Volcanic activity is important, but essentially random (“if volcanoes could get organised, they could rule the world” – luckily they aren’t organised). Occasionally several volcanoes erupting together make a bigger change, but again that’s a rare event. Space dust hasn’t changed much over time and there isn’t much of it (Alley’s deadpan delivery of this line raised a chuckle from the audience).

So, what about climate sensitivity (i.e. the amount of temperature change for each doubling of CO2)? Sensitivity from models matches the record well (approx 3°C per doubling of CO2). Recently, Royer et al. conducted an interesting experiment (Nature 446, 29 March 2007), calculating equilibrium climate sensitivity from models, and then comparing with the proxy records, to demonstrate that climate sensitivity has been consistent over the last 420 million years. Hence paleoclimate says that the more extreme claims about sensitivity (especially those claiming very low levels) must be wrong.
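
The “per doubling” phrasing works because the greenhouse forcing grows roughly logarithmically with CO2 concentration, so equilibrium warming is about sensitivity × log2(CO2/baseline). A quick sketch of what a 3°C sensitivity implies (my arithmetic, not a slide from the talk):

```python
import math

def equilibrium_warming(co2_ppm, sensitivity=3.0, baseline=280.0):
    """Rule-of-thumb equilibrium warming (C): 'sensitivity' degrees per doubling."""
    return sensitivity * math.log2(co2_ppm / baseline)

for level in (387, 560, 1120):   # roughly 2009's level, 2x and 4x pre-industrial
    print(f"{level} ppm -> {equilibrium_warming(level):+.1f}C")
```

Four times pre-industrial CO2 gives +6°C, which is the same ballpark as Alley’s answer to the “burn it all” question below.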

In contrast, if CO2 doesn’t warm, then we have to explain why the physicists are stupid, and then we still have no other explanation for the observations. If there is a problem, it is that occasionally the world seems a little more sensitive to CO2 than the models say. There are lots of possible fine-tuning knobs that might explain these discrepancies – and lots of current research looking into it. Oh, and this is a global story, not a regional one; there are lots of local effects on regional climate.

Note that most of these recent discoveries haven’t percolated to the IPCC yet – much of this emerged in the last few years since the last IPCC report was produced. The current science says that CO2 is the most important driver of climate throughout the earth’s history.

Some questions from the audience followed:

  • Q: If we burn all the available fossil fuel reserves, where do we get to? A: If you burn it all at once, there is some chance of getting above the Cretaceous level, but there’s lots of uncertainty, including how large the reserves really are. In a “burn it all” future, it’s likely to get really hot: +6 to 7°C.
  • Q: We know if we stop emitting, it will slow down global warming. But from geology, what do we know about removal? A: Anything that increases the weatherability of rocks. But it seems unlikely that we can make it go fast enough to make a difference at an economic level. The key question is how much energy we would need to do it, e.g. to dig up shells from the ocean bed and allow them to carbonate. It’s almost certainly easier to keep it out of the air than to take it out of the air (big round of applause from the audience in response to this – clearly the 2,000+ scientists present know this is true, and appreciate it being stated so clearly).
  • Q: What about feedbacks? A: As we put up more CO2, the oceans take up about half of it. As the world gets warmer, the ability of the ocean to buffer that reduces. The biggest concerns are probably changes in the Arctic soils, and methane on the seafloor. As you go from decades to centuries, the sensitivity to CO2 goes up a little, because of these amplifying feedbacks.

Well that was it – a one hour summary of how we know that CO2 is implicated as the biggest driver in all climate change throughout the earth’s history. What I found fascinating about the talk was the way that Alley brought together multiple lines of evidence, and showed how our knowledge is built up from a variety of sources. Science really is fascinating when presented like this. BTW I should mention that Alley is author of The Two-Mile Time Machine, which I now have to go and read…

Finally, a disclaimer. I’m not an expert in any of this, by any stretch of the imagination. If I misunderstood any of Alley’s talk, please let me know in the comments.

Well, my intention to liveblog from interesting sessions is blown – the network connection in the meeting rooms is hopeless. One day, some conference will figure out how to provide reliable internet…

Yesterday I attended an interesting session in the afternoon on climate services. Much of the discussion built on work done at the third World Climate Conference (WCC-3) in August, which set out to develop a framework for the provision of climate services. These would play a role akin to local, regional and global weather forecasting services, but focussing on risk management and adaptation planning for the impacts of climate change. Most important is the emphasis on combining observation and monitoring services and research and modeling services (both of which already exist) with a new climate services information system (I assume this would be distributed across multiple agencies across the world) and a system of user interfaces to deliver the information in the forms needed by different audiences. Rasmus at RealClimate discusses some of the scientific challenges.

My concern on reading the outcomes of the WCC-3 was that it’s all focussed on a one-way flow of information, with insufficient attention to understanding who the different users would be, and what they really need. I needn’t have worried – the AGU session demonstrated that there are plenty of people focussing on exactly this issue. I got the impression that there’s a massive international effort quietly putting in place the risk management and planning tools needed for us to deal with the impacts of a rapidly changing climate, but which is completely ignored by a media still obsessed with the “is it happening?” pseudo-debate. The extent of this planning for expected impacts would make a much more compelling media story, and one that matters, on a local scale, to everyone.

Some highlights from the session:

Mark Svoboda from the National Drought Mitigation Center at the University of Nebraska, talking about drought planning in the US. He pointed out that drought tends to get ignored compared to other kinds of natural disasters (tornados, floods, hurricanes), presumably because it doesn’t happen within a daily news cycle. However drought dwarfs the damage costs in the US from all other kinds of natural disasters except hurricanes. One problem is that population growth has been highest in regions most subject to drought, especially in the southwest US. The NDMC monitoring program includes the only repository of drought impacts. Their US drought monitor has been very successful, but the next generation of tools needs better sources of data on droughts, so they are working on adding a drought reporter, doing science outreach, working with kids, etc. Even more important is improving the drought planning process, hence a series of workshops on drought management tools.

Tony Busalacchi from the Earth System Science Interdisciplinary Center at the University of Maryland. Through a series of workshops in the CIRUN project, they’ve identified the need for better forecasting tools, especially around risks such as sea level rise. Above all, there is a need for actionable information, but no service currently provides this. A climate information system is needed for policymakers, working on scales of seasons to decades, tailorable to regions, and with the ability to explore “what-if” questions. Building this will need the coupling of models not used together before, and the synthesis of new datasets.

Robert Webb from NOAA, in Boulder, on experimental climate information services to support risk management. The key to risk assessment is to understand that it spans multiple timescales. Users of such services do not distinguish between weather and climate – they need to know about extreme weather events, and they need to know how such risks change over time. Climate change matters because of the impacts. Presenting the basic science and predictions of temperature change is irrelevant to most people – it’s the impacts that matter (his key quote: “It’s the impacts, stupid!”). Examples: water – droughts and floods, changes in snowpack, river stream flow, fire outlooks, and planning issues (urban, agriculture, health). He’s been working with the Climate Change and Western Water Group (CCAWWG) to develop a strategy on water management. How to get people to plan and adapt? The key is to get people to think in terms of scenarios rather than deterministic forecasts.

Guy Brasseur from the German Climate Services Center, in Hamburg. The German adaptation strategy was developed by the German federal government, which appears to be way ahead of the US agencies in developing climate services. Guy emphasized the need for seamless prediction – a uniform ensemble system that builds from climate monitoring of the recent past and present, and runs forward into the future, at different regional scales and timescales. Guy called for an Apollo-sized program to develop the infrastructure for this.

Kristen Averyt from the University of Colorado, talking about her “climate services machine” (I need to get hold of the image for this – it was very nice). She’s been running workshops for Colorado-specific services, with breakout sessions focussed on impacts and the utility of climate information. She presented some evaluations of the success of these workshops, including a climate literacy test they have developed. For example, at one workshop the attendees had 63% correct answers at the beginning (and the wrong answers tended to cluster, indicating some important misperceptions). I need to get hold of this – it sounds like an interesting test. Kristen’s main point was that these workshops play an important role in reaching out to people of all ages, including kids, and getting them to understand how climate change will affect them.

Overall, the main message of this session was that while there have been lots of advances in our understanding of climate, these are still not being used for planning and decision-making.

First proper day of the AGU conference, and I managed to get to the (free!) breakfast for Canadian members, which was so well attended that the food ran out early. Do I read this as a great showing for Canadians at AGU, or just that we’re easily tempted with free food?

Anyway, on to the first of three poster sessions we’re involved in this week. This first poster was on TracSNAP, the tool that Ainsley and Sarah worked on over the summer:

Our tracSNAP poster for the AGU meeting. Click for fullsize.

The key idea in this project is that large teams of software developers find it hard to maintain an awareness of one another’s work, and cannot easily identify the appropriate experts for different sections of the software they are building. In our observations of how large climate models are built, we noticed it’s often hard to keep up to date with what changes other people are working on, and how those changes will affect things. TracSNAP builds on previous research that attempts to visualize the social network of a large software team (e.g. who talks to whom), and relate that to couplings between code modules that team members are working on. Information about the intra-team communication patterns (e.g. emails, chat sessions, bug reports, etc) can be extracted automatically from project repositories, as can information about dependencies in the code. TracSNAP extracts data automatically from the project repository to provide answers to questions such as “Who else recently worked on the module I am about to start editing?” and “Who else should I talk to before starting a task?”. The tool extracts hidden connections in the software by examining modules that were checked into the repository together (even though they don’t necessarily refer to each other), and offers advice on how to approach key experts by identifying intermediaries in the social network. It’s still a very early prototype, but I think it has huge potential. Ainsley is continuing to work on evaluating it on some existing climate models, to check that we can pull out of the repositories the data we think we can.
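
To give a flavour of the “hidden connections” idea, here’s a minimal sketch of co-change mining (the commit data and file names are invented, and this isn’t TracSNAP’s actual code): count how often pairs of files are checked in together, and flag the pairs that co-change most often.

```python
from collections import Counter
from itertools import combinations

# Invented commit history: each entry is the set of files changed together.
# A real tool would pull this from the version control log.
commits = [
    {"ocean/mixing.f90", "ocean/tracers.f90"},
    {"ocean/mixing.f90", "ocean/tracers.f90", "coupler/flux.f90"},
    {"atmos/radiation.f90", "coupler/flux.f90"},
    {"ocean/mixing.f90", "ocean/tracers.f90"},
]

co_changes = Counter()
for files in commits:
    for pair in combinations(sorted(files), 2):
        co_changes[pair] += 1

# Pairs that are repeatedly checked in together are candidate hidden
# couplings, even if neither file explicitly references the other.
for (a, b), n in co_changes.most_common(3):
    print(f"{a} <-> {b}: co-changed {n} times")
```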

The poster session we were in, “IN11D. Management and Dissemination of Earth and Space Science Models” seemed a little disappointing as there were only three posters (a fourth poster presenter hadn’t made it to the meeting). But what we lacked in quantity, we made up in quality. Next to my poster was David Bailey‘s: “The CCSM4 Sea Ice Component and the Challenges and Rewards of Community Modeling”. I was intrigued by the second part of his title, so we got chatting about this. Supporting a broader community in climate modeling has a cost, and we talked about how university labs just cannot afford this overhead. However, it also comes with a number of benefits, particularly the existence of a group of people from different backgrounds who all take on some ownership of model development, and can come together to develop a consensus on how the model should evolve. With the CCSM, most of this happens in face to face meetings, particularly the twice-yearly user meetings. We also talked a little about the challenges of integrating the CICE sea ice model from Los Alamos with CCSM, especially given that CICE is also used in the Hadley model. Making it work in both models required some careful thinking about the interface, and hence more focus on modularity. David also mentioned people are starting to use the term kernelization as a label for the process of taking physics routines and packaging them so that they can be interchanged more easily.
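
The term suggests a simple structure: the physics routine sits behind a narrow interface, so a host model can swap one implementation for another. A toy sketch of the idea (all names invented; CICE’s real coupling interface is of course far richer):

```python
from typing import Protocol

class SeaIceKernel(Protocol):
    """The narrow interface the host model programs against."""
    def step(self, ice_fraction: float, air_temp_c: float, dt_hours: float) -> float: ...

class ToyThermodynamicIce:
    """One interchangeable implementation: grow ice below seawater's freezing point."""
    def step(self, ice_fraction: float, air_temp_c: float, dt_hours: float) -> float:
        rate = 0.01 if air_temp_c < -1.8 else -0.02
        return min(1.0, max(0.0, ice_fraction + rate * dt_hours))

def run_host_model(ice: SeaIceKernel) -> None:
    fraction = 0.5
    for _ in range(24):   # one model day, hourly steps
        fraction = ice.step(fraction, air_temp_c=-5.0, dt_hours=1.0)
    print(f"ice fraction after a day: {fraction:.2f}")

run_host_model(ToyThermodynamicIce())   # any kernel honouring the interface plugs in
```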

Dennis Shea‘s poster, “Processing Community Model Output: An Approach to Community Accessibility” was also interesting. To tackle the problem of making output from the CCSM more accessible to the broader CCSM community, the decision was taken to standardize on netCDF for the data, and to develop and support a standard data analysis toolset, based on the NCAR Command Language. NCAR runs regular workshops on the use of these data formats and tools, as part of its broader community support efforts (and of course, this illustrates David’s point about universities not being able to afford to provide such support efforts).

The missing poster also looked interesting: Charles Zender from UC Irvine, comparing climate modeling practices with open source software practices. Judging from his abstract, Charles makes many of the same observations we made in our CiSE paper, so I was looking forward to comparing notes with him. Next time, I guess.

Poster sessions at these meetings are both wonderful and frustrating. Wonderful because you can wander down aisles of posters and very quickly sample a large slice of research, and chat to the poster owners in a freeform format (which is usually much better than sitting through a talk). Frustrating because poster owners don’t stay near their posters very long (I certainly didn’t – too much to see and do!), which means you get to note an interesting piece of work, and then never manage to track down the author to chat (and if you’re like me, you also forget to write down contact details for posters you noticed). However, I did manage to make notes on two to follow up on:

  • Joe Galewsky caught my attention with a provocative title: “Integrating atmospheric and surface process models: Why software engineering is like having weasels rip your flesh”
  • I briefly caught Brendan Billingsley of NSIDC as he was taking his poster down. It caught my eye because it was a reflection on software reuse in the Searchlight tool.

This week I’ll be blogging from the American Geophysical Union (AGU) Fall Meeting, one of the biggest scientific meetings of the year for the climate science community (although the European EGU meeting, held in the spring, rivals it for size). About 16,000 geoscientists are expected to attend. The scope of the meeting is huge, taking in anything from space and planetary science, volcanology, and seismology, all the way through to science education and geoscience informatics. Climate change crops up a lot, most notably in the sessions on atmospheric sciences and global environmental change, but also in sessions on the cryosphere, ocean sciences, biogeosciences, paleoclimatology, and hydrology. There’ll be plenty of other people blogging the meeting, a twitter feed, and a series of press conferences. The meeting clashes with COP15 in Copenhagen, but scientists see Copenhagen as a purely political event, and would much rather be at a scientific meeting like the AGU. To try to bridge the two, the AGU has organised a 24/7 climate scientist hotline aimed at journalists and other participants in Copenhagen – more on this initiative a little later…

Today, the meeting kicked off with some pre-conference workshops, most notably, a workshop on “Re-starting the climate conversation“, aimed at exploring the sociological factors in media coverage and public understanding of climate science. I picked up lots of interesting thoughts around communication of climate science from the three speakers, but the discussion sessions were rather disappointing. Seems like everyone recognises a problem in the huge gulf between what climate scientists themselves say and do, versus how climate science is portrayed in the public discourse. But nobody (at least among the scientific community) has any constructive ideas for how to fix this problem.

The first speaker was Max Boykoff, from University of Colorado-Boulder, talking about “Mass Media and the Cultural Politics of Climate Change”. Boykoff summarized the trends in media coverage of climate change, particularly the growth of the web as a source of news. A recent Pew study shows that in 2008, the internet overtook newspapers as people’s preferred source of news, although television still dominates both. And even though climate change was one of the two controversies dominating the blogosphere in the last two weeks, it still only accounts for about 1.5% of news coverage in 2009.

Boykoff’s research focusses more on the content, and reveals a tendency in the media to conflate the many issues related to climate change into just one question: whether increasing CO2 warms the planet. In the scientific community, there is a strong convergence of agreement on this question, in contrast to, say, the diversity of opinion on whether the Kyoto protocol was a success. Yet the media coverage focusses almost exclusively on the former, and diverges wildly from scientific opinion. He showed a recent clip from NBC news, which frames the whole question in terms of a debate over whether it’s happening or not, with a meteorologist, geologist and sociologist discussing this question in a panel format. Boykoff’s studies showed 53% of news stories (through 2002) diverge from the scientific consensus, and, even more dramatically, 70% of TV segments (through 2004) diverge from this consensus. In a more recent study, he showed that the ‘quality’ newspapers (in the US and UK) show no divergence, while the tabloid newspapers all diverge significantly. Worse still, much of this tabloid coverage is not just on the fence, but is explicitly denialist, with a common framing that presents the issue as a feud between strong personalities (e.g. Gore vs. Palin).

Some contextual factors help explain these trends, including a shrinking technical capacity (specialist training) in newspaper/TV; the tendency for extreme weather events to drive coverage, which leads to the obvious framing of “is it or isn’t it caused by climate change?”; cultural values such as trust in scientists. Carvalho and Burgess present an interesting case study on the history of how the media and popular culture have shaped one another over issues such as climate change. The key point is that the idea of an information deficit is a myth – instead the media coverage is better explained as a complex cycle of cultural pressures. One of the biggest challenges for the media is how to cover a long, unfolding story within short news cycles. Which leads to an ebb and flow such as the following: In May 2008 the Observer ran a story entitled “Surging fatal shark attacks blamed on global warming“, although the content of the article is much more nuanced: some experts attribute it to global warming, others to increased human activites in the water. But then a subsequent story in the Guardian in Feb 2009 was “Sharks go hungry as tourists stay home“. Implication: the global warming problem is going away!

The second speaker was Matthew C. Nisbet, from the American University, Washington, perhaps best known for his blog, Framing Science. Nisbet began with a depressing summary of the downward trend in American concern over climate change, and the Pew study from Jan 2009 that showed global warming ranked last among 20 issues people regarded as top priorities for congress. The traditional assumption is that if the public is opposed, or does not accept the reality, the problem is one of ignorance, and hence scientific literacy is the antidote (and hence calls for more focus on formal science education and popular science outlets – “if we only had more Carl Sagans”). Along with this also goes the assumption that the science compels action in policy debates. However, Nisbet contends that when you oversimplify a policy debate as a matter of science, you create the incentive to distort the science, which is exactly what has happened over climate change.

This traditional view of a lack of science literacy has a number of problems. It doesn’t take into account the reality of audiences and how people make up their minds – essentially there is nothing unique about the public debate over climate change compared to other issues: people rely on information shortcuts – they tend to be “cognitive misers“. And it ignores the effects of media fragmentation: in 1985, most people (in the US) got their news from four main sources, the four main TV network news services. By contrast, in 2009, there is a huge array of sources of information, and because of our inability to take advantage of such a range, people have a tendency to rely on those that match their ideological preferences. On the internet, this problem of choice is greatly magnified.

Not surprisingly, Nisbet focussed mainly on the question of framing. People look for frames of reference that help them make sense of an issue, with little cognitive effort. Frames organise the central ideas on an issue, and make it relevant to the audience, so that it can be communicated by shorthand, such as catchphrases (“climategate”), cartoons, images, etc. Nisbet has a generalized typology of ways in which science issues get framed, and points out that you cannot avoid these frames in public discourse. E.g. Bill O’Reilly on Fox starts every show with “talking points”, which are the framing references for the likeminded audience; many scientists who blog write with a clear liberal frame of reference, reflecting a tendency among scientists to identify themselves as more liberal than conservative.

The infamous Luntz memo contains some interesting framings: “the scientific debate remains open”, the “economic burden of environmental regulation”, and “the international fairness issue” – if countries like China and India aren’t playing along, the US shouldn’t make sacrifices. In response, many scientists and commentators (e.g. Gore) have tended to frame around the potential for catastrophe (e.g. “the climate crisis”; “Be worried“). This is all essentially a threat appeal. The problem is that if you give an audience a threat, but no information on how to counter it, they either become fatalist or ignore it; and it also opens the door to the counter-framing of calling people “alarmists”. This also plays into a wider narrative about the “liberal media” trying to take control.

Another framing is around the issue of public accountability and scientific evidence. For example, Mooney’s book “The Republican War on Science” itself became a framing device for liberals, which led to Obama’s “must restore science to its rightful place”. However, this framing can reinforce the signal that science is for democrats, not for republicans. Finally, “climategate” itself is a framing device that flips the public accountability frame to one of accountability of scientists themselves – questioning their motivations. This has also been successfully coupled to the media bias frame, allowing the claim that the liberal media is not covering the alternative view.

So how do we overcome this framing? Nisbet concluded with examples of framings that reach out to different kinds of audiences. For example: E.O. Wilson’s book “The Creation“ frames it as a religious/moral duty, written specifically as a letter to a southern baptist. This framing helps to engage with evangelical audiences. Friedman frames it as a matter of economic growth – we need a price on carbon to stimulate innovation in a second American industrial revolution. Gore’s “We” campaign has been rebranded as “Repower America“. And the US Congress no longer refers to a “cap and trade bill”, but to the American Clean Energy and Security Act (ACES).

Nisbet thinks that a powerful new frame is to talk about climate change as a matter of public health. The strategy is to shift the perception away from remote regions (e.g. ice caps and polar bears) to focus instead on the impact in urban areas, especially on minorities, the elderly and children: framing in terms of allergies, heat stress, etc. The lead author of the recent Lancet study, Anthony Costello, said that public health issues are under-reported, and need attention because they affect billions of people.

The recent Maibach and Leiserowitz study identifies six distinct audience groups, and challenges received ideas about where public perceptions lie, especially the idea that the public is not concerned about climate change:

[Figure: the six audience segments (“Six Americas”) from the Maibach and Leiserowitz study]

As an experiment, Nisbet and colleagues set up a tent on the National Mall in Washington, interviewed people who said they were from outside of DC, and categorized each into one of the six audiences. They then used this to identify a sample of each group to be invited to a focus group session, where they could test out the public health framing for climate change. Sentence-by-sentence analysis of their responses to a short essay on health issues and climate change proved very interesting, as there are specific sections, especially towards the end about policy options, where even the dismissives had positive responses. For example:

  • all six groups agreed that “good health is a great blessing”.
  • all six groups agreed with suggestions around making cities cleaner, easier to get around, etc.
  • 4 of the 6 groups found it helpful to learn about the health threats of CC (which corresponds to a big majority of the American audience)
  • All 6 groups reacted negatively to suggestions that we should make changes in diet and food choices, such as increasing vegetables and fruit and cutting down meat consumption. Hence, this is not a good topic to lead on!

In the question-and-answer session after Nisbet’s talk, there was an interesting debate about the lack of discussion in the media about how the science was obtained, concentrating on the results as if they came from a black box. Nisbet pointed out that this might not be helpful, and cited the fact that the climategate emails surprised many people with how human scientists are (with egos and political goals) and the level of their uncertainty about the specific data analysis questions, which has made them a very successful framing device for those promoting the message that the science is weak. However, several audience members contended that this just means we need to do a better job of getting people to think like scientists, and bring the broader society more into the scientific process. Others pointed out how hard it is to get journalists to write about the scientific method, so the idea of partnering with others (e.g. religious leaders, health professionals) makes sense if it helps to identify and motivate particular audiences. This still leaves open the question of how to communicate uncertainty. For example, in the public health framing, people will still want to know whether it affects, say, asthma or not. And as we’re still uncertain on this, it leads to the same problem as the shark attack story. So we still have to face the problem of how people understand (or not!) the scientific process.

The final speaker was Gwendolyn Blue, from the University of Calgary, Canada, talking about “Public Engagement with Climate Change”. Blue contends that major issues of science and technology require a participatory democracy that does not really exist yet (although it is starting to appear). A portion of the lay public is distrustful of scientific institutions, so we need more effective ways of engaging more diverse groups in the conversation.

Blue defines public engagement as “a diverse set of activities whereby non-experts become involved in agenda setting, decision-making, policy forming and knowledge production processes regarding science”. The aim is to overcome the traditional “one-way” transmission model of science communication, which tends to position lay audiences as passive, and hence obsesses over whether they are getting good or bad information. But public understandings of science are much more sophisticated than many people give credit for, particularly away from questions of basic facts, on matters such as the ethical and social implications of scientific findings.

There are clearly a number of degrees of citizen participation, for example Arnstein’s ‘ladder’. Blue is particularly interested in the upper rungs – i.e. not just ‘informing’ (which movements such as café scientifique and public lectures try to do) but engagements that aim to empower and transform (e.g. citizen science, activism, protests, boycotts, buycotts). But she doesn’t think citizen participation in science is a virtue in its own right, as it can be difficult, frustrating, and can fail: success is highly context dependent.

Examples include: 350.org, which uses social networking media to bring people together, with lots of people creating images of the number 350 and uploading their efforts (but how many participants actually understand what the number 350 represents?); tcktcktck, which collected 10 million signatures and delivered them to the COP15 negotiators; and the global day of action yesterday, which was one of the biggest demonstrations of public activism yet. However, this type of activism tends to be bound up with social identity politics.

Deliberative events aim to overcome this by bringing together people who don’t necessarily share the same background / assumptions. Blue described her experiences with one particular initiative, the World Wide Views on Global Warming events on September 26, 2009. The idea grew out of the Danish model of consensus politics. Randomly selected participants in each country were offered a free trip to a workshop (in Canada, it took place in Calgary), with some effort to select approximately 100 people representing the demographic makeup of each country. The aim was to discuss the policy context for COP15. There were (deliberately) no “experts” in the room, to remove the inequality of experts vs. lay audience. Instead, a background document was circulated in advance, along with a short video. Clear ground rules were set for good dialogue, with a trained facilitator for each table. They used cultural activities too: e.g. at the Canadian event, they had music and dance from across Canada, Inuit throat singers, and an opening prayer by a Blackfoot Elder.

The result was a fascinating attempt to build an engaged public conversation around climate change and the decision making process we face. A number of interesting themes emerged. For example, despite a very diverse set of participants, lots of common ground emerged, which surprised many participants, especially around the scale/urgency of the problem, and the overall goals of society. A lot of social learning took place – many participants knew very little about climate science at the outset. However, Blue did note that success for these events requires scientific literacy as well as civic literacy in the participants, together with reflexivity, humility and a willingness to learn. But it is part of a broader cultural shift towards understanding the potential and limitations of participatory democracy.

The results were reported in a policy report, but also, more interestingly, on a website that allows you to compare results from different countries. Much of each workshop was about public deliberation, which can be very unruly, but at the end this was distilled down to an opinion poll with simple multiple-choice questions to communicate the results.

The discussion towards the end of the workshop focussed on how much researchers should be involved in outreach activities. It is not obvious who should be doing this work. There is little motivation for scientists to do it, and lots of negatives – we get into trouble, our institutions don’t like it, and it doesn’t do anything for your career in most cases. And several of the speakers at the workshop described strategies in which there doesn’t seem to be a role for climate scientists themselves. Instead, the work seems to point to the need for a new kind of “outreach professional” who is both a scientific expert and trained in outreach activities.

Which brings me back to that experiment I mentioned at the top of the post, on the AGU providing a 24 hour hotline for COP15 participants to talk directly with small groups of climate scientists. It turns out the idea has been a bit of a failure, and has been scaled back due to a lack of demand. Perhaps this has something to do with the narrow terms that were set for what kinds of questions people could ask of the scientists. The basic science is not what matters in Copenhagen. Which means the distance between Copenhagen and San Francisco this week is even greater than I thought it would be.

[Update: Michael Tobis has a much more critical account of this workshop]

As a follow-on from yesterday’s post on making climate software open source, I’d like to pick up on the oft-repeated slogan “Many eyeballs make all bugs shallow”. This is sometimes referred to as Linus’ Law (after Linus Torvalds, creator of Linux), although this phrase is actually attributed to Eric Raymond (Torvalds would prefer “Linus’s Law” to be something completely different). Judging from the number of times this slogan is repeated in the blogosphere, there must be lots of very credulous people out there. (Where are the real skeptics when you need them?)

Robert Glass tears this one apart as a myth in his book “Facts and Fallacies about Software Engineering“, on the basis of three points: it’s self-evidently not true (the depth of a bug has nothing to do with how many people are looking for it); there’s plenty of empirical evidence that the utility of adding additional reviewers to a review team tails off very quickly after around 3-4 reviewers; and finally there is no empirical evidence that open source software is less buggy than its alternatives.

More interestingly, companies like Coverity, who specialize in static analysis tools, love to run their tools over open source software and boast about the number of bugs they find (it shows off what their tools can do). For example, their 2009 study found 38,453 bugs in 60 million lines of source code – a defect density of about 0.64 defects/KLOC (38,453 ÷ 60,000 KLOC). Quite clearly, there are many types of bugs that you need automated tools to find, no matter how many eyeballs have looked at the code.

Part of the problem is that the “many eyeballs” part isn’t actually true anyway. A 2005 study of the SourceForge community by Xu et al. found that participation in projects follows the power law well known from social network theory: a few open source projects have a very large number of participants, while a very large number of projects have very few. Similarly, a very small number of open source developers participate in lots of projects; the majority participate in just one or two:

SourceForge Project and Developer Community Scale Free Degree Distributions (Figure 7d from Xu et al 2005)

The data shown in these graphs cover all developers and active users for about 160,000 SourceForge projects. Of these projects, 25% had only a single person involved (as either developer or user!), and a further 10% had only 2-3 people involved. Clearly, a significant number of open source projects never manage to build a community of any size.
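To get a feel for what this kind of skewed distribution implies for the “many eyeballs” claim, here’s a rough simulation sketch in Python. The power-law exponent is my own assumption, tuned only so that roughly a quarter of the projects come out single-person, as in the study; the value Xu et al actually fitted may well differ:

```python
import numpy as np

# Sketch: draw community sizes (developers + active users) for each project
# from a Zipf (power-law) distribution and see where the eyeballs end up.
# The exponent is an assumption for illustration, not Xu et al's fitted value.
rng = np.random.default_rng(2005)
alpha = 1.3            # assumed exponent: gives roughly 25% single-person projects
n_projects = 160_000   # roughly the number of SourceForge projects studied

sizes = rng.zipf(alpha, size=n_projects)

print(f"single-person projects:    {np.mean(sizes == 1):.0%}")
print(f"2-3 person projects:       {np.mean((sizes >= 2) & (sizes <= 3)):.0%}")
print(f"projects with 100+ people: {np.mean(sizes >= 100):.2%}")
print(f"largest community:         {sizes.max():,} people")
```

Whatever the exact exponent, the qualitative picture is the same: nearly all the eyeballs pile onto a tiny handful of projects, and the long tail gets almost none.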

This is relevant to the climate science community because many of the tens of thousands of scientists actively pursuing research relevant to our understanding of climate change build software. If all of them release their software as open source, there’s no reason to expect a different distribution from the graphs above. So most of this software will never attract any participants beyond the handful of scientists who wrote it, because there simply aren’t enough eyeballs or enough interest available. The kind of software described in the famous “Harry” files at the CRU is exactly of this nature – if it hadn’t been picked out in the stolen CRU emails, nobody other than “Harry” would ever take the time to look at it. And even though lots of people’s attention has now been drawn to this particular software, there are still thousands of other scraps of similar software out there which would remain single-person projects, just like those on SourceForge. In contrast, a very small number of projects will attract hundreds of developers/users.

The thing is, this is exactly how the climate science community operates already. A small number of projects (like the big GCMs, listed here) already have a large number of developers and users – for example, CCSM and Hadley’s UM have hundreds of active developers, and a very mature review process. Meanwhile, a very large number of custom data analysis tools are built by a single person for his/her own use. Declaring all of these projects to be open source will not magically bring “many eyeballs” to bear on them. And indeed, as Cameron Neylon argues, those that do attract attention will immediately have to protect themselves from a large number of clueless newbies by doing exactly what many successful open source projects do: the inner clique closes ranks, refuses to deal with outsiders, ignores questions on the mailing lists, and so on. Isn’t that supposed to be the problem we were trying to solve?

The argument that making climate software open source will somehow magically make it higher quality is therefore specious. The big climate models already have many eyeballs, and the small data handling tools will never attract large numbers of eyeballs. So, if any of the people screaming about openness are truly interested in improving software quality, they’ll argue for something that is actually likely to make a difference.

Well, this is what it comes down to. Code reviews on national TV. Who would have thought it? And, by the standards of a Newsnight code review, the code in question doesn’t look so good. Well, it’s not surprising it doesn’t. It’s the work of one untrained programmer, working in an academic environment, trying to reconstruct someone else’s data analysis. And given the way in which the CRU files were stolen, we can be pretty sure this is not a random sample of code from the CRU; it was handpicked to be one of the worst examples.

Watch the clip from about 2:00. They compare the code with some NASA code, although we’re not told what exactly. Well, duh. If you compare experimental code written by one scientist on his own, which has clearly not been through any code review, with code produced by NASA’s engineering processes, of course it looks messy. For any programmers reading this: how many of you can honestly say you’d come out looking good if I trawled through your files, picked the worst piece of code lying around in there, and reviewed it on national TV? And the “software engineer” on the program says it’s “below the standards you would expect in any commercial software”. Well, I’ve seen a lot of commercial software. It’s a mix of good, bad, and ugly. If you’re deliberate with your sampling technique, you can find a lot worse out there.

Does any of this matter? Well, a number of things bug me about how this is being presented in the media and blogosphere:

  • The first, obviously, is the ridiculous conclusion that many people seem to be making that poor code quality in one deliberately selected program file somehow invalidates all of climate science. As cdavid points out towards the end of this discussion, if you’re going to do that, then you pretty much have to throw out most results in every field of science over the past few decades for the same reason. Bad code is endemic in science.
  • The slightly more nuanced, but equally specious, conclusion that bugs in this code mean that research results at the CRU must be wrong. Eric Raymond picks out an example he calls blatant data-cooking, but he is quite clearly fishing for results, because he ignores the fact that the correction he picks on is never used in the code, except in parts that are commented out. He’s quote mining for effect, and given Raymond’s political views, that’s not surprising. Just for fun, someone quote mined Raymond’s own code, and was horrified at what he found. Clearly we have to avoid all open source code immediately because of this…? The problem, of course, is that none of these quote miners have gone to the trouble of establishing what this particular code is, why it was written, and what it was used for.
  • The widely repeated assertion that this just proves that scientific software must be made open source, so that a broader community of people can review it and improve it.

It’s this last point that bothers me most, because at first sight, it seems very reasonable. But actually, it’s a red herring. To understand why, we need to pick apart two different arguments:

  1. An argument that when a paper is published, all of the code and data on which it is based should be released so that other scientists (who have the appropriate background) can re-run it and validate the results. In fields with complex, messy datasets, this is exceedingly hard, but might be achievable with good tools. The complete toolset needed to do this does not exist today, so just calling for making the code open source is pointless. Much climate code is already open source, but that doesn’t mean anyone in another lab can repeat a run and check the results. The problems of reproducibility have very little to do with whether the code is open – the key problem is to capture the entire scientific workflow and all data provenance. This is very much an active line of research, and we have a long way to go. In the absence of this, we rely on other scientists testing the results with other methods, rather than repeating the same tests. Which is the way it’s done in most branches of science.
  2. An argument that there is a big community of open source programmers out there who could help. This is based on a fundamental misconception about why open source software development works. It matters how the community is organised, and how contributions to the code are controlled by a small group of experts. It matters that it works as a meritocracy, where programmers need to prove their ability before they are accepted into the inner developer group. And most of all, it matters that the developers are the domain experts. For example, the developers who built the Linux kernel are world-class experts on operating systems and computer architecture. Quite often they don’t realize just how high their level of expertise is, because they hang out with others who also have the same level of expertise. Likewise, it takes years of training to understand the dynamics of atmospheric physics in order to be able to contribute to the development of a climate simulation model. There is not a big pool of people with the appropriate expertise to contribute to open source climate model development, and nor is there ever likely to be, unless we expand our PhD programs in climatology dramatically (I’m sure the nay-sayers would like that!).

We do know that most of the heavy duty climate models are built at large government research centres, rather than at universities. Dave Randall explains why this is: the operational overhead of developing, testing and maintaining a Global Climate Model is far too high for university-based researchers. The universities use (parts of) the models, and do further data analysis on both observational data and outputs from the big models. Much of this is the work of individual PhD students or postdocs. Which means that the argument that all code written at all stages of climate research must meet some gold standard of code quality is about as sensible as saying no programmer should ever be allowed to throw together a script to test out whether some idea works. Of course bad code will get written in a hurry. What matters is that as a particular line of research matures, the coding practices associated with it should mature too. And we have plenty of evidence that this is true of climate science: the software practices used at the Hadley Centre for their climate models are better than most commercial software practices. Furthermore, they manage to produce code that appears to be less buggy than just about any other code anywhere (although we’re still trying to validate this result, and understand what it means).

None of this excuses bad code written by scientists. But the sensible response to this problem is to figure out how to train scientists to be better programmers, rather than argue that some community of programmers other than scientists can take on the job instead. The idea of open source climate software is great, but it won’t magically make the code better.

Justyna sent me a pointer to another group of people exploring an interesting challenge for computing and software technology: The Crisis Mappers Net. I think I can characterize this as another form of collective intelligence, harnessed to mobile networks and visual analytics, to provide rapid response to humanitarian emergencies. And of course, after listening to George Monbiot in the debate last night, I’m convinced that over the coming decades, the crises to be tackled will increasingly be climate related (forest fires, floods, droughts, extreme weather events, etc).

This evening I’m attending a debate on climate change in Toronto, at the Munk Centre. George Monbiot (UK journalist) and Elizabeth May (leader of the Canadian Green party) are debating Bjorn Lomborg (Danish economist) and Nigel Lawson (ex UK finance minister) on the resolution “Be it resolved climate change is mankind’s defining crisis and demands a commensurate response“. I’m planning to liveblog all evening. Feel free to use the comments thread to play along at home. Update: I’ve added a few more links, tidied up the writing, and put my meta-comments in square brackets. Oh, and the result of the vote is now at the end.

Monbiot has long been a critic of the debate format for discussing climate change, because it allows denialists (who only have to sow doubt) to engage in the gish gallop, which forces anyone who cares about the truth into a hopeless game of whack-a-mole. There was an interesting story over the summer about how Ian Plimer (Australia’s most famous denialist) challenged Monbiot to a debate. Monbiot insisted on written answers to some questions about Plimer’s book as a precondition, to ensure the debate would be grounded. Plimer managed to ignore the request, and then claimed Monbiot had chickened out. Anyway, Monbiot has now decided to come to Canada and break his no-fly rule, because he now sees Canada as the biggest stumbling block to international progress on climate change.

Lomborg, of course, got famous as the author of the Skeptical Environmentalist. His position is that climate change is real, but much less of a problem than many other pressing issues, particularly third world development. He therefore opposes any substantial action on climate change.

May is leader of the Canadian Green Party, which regularly polls 5-6% of the popular vote in federal elections, but has never had an MP elected under Canada’s first-past-the-post system. She is also co-author of Global Warming for Dummies.

Lawson is a washed-up UK Tory politician. He was once Chancellor of the Exchequer (=finance minister) under Margaret Thatcher (and energy minister prior to that), where he was responsible for the “Lawson boom” of the late 1980s which, being completely unsustainable, led to an economic crash in the UK. Lawson resigned in disgrace, and Thatcher was later forced out of office by her backbenchers. [personal note: I was in debt for many years as a result of this, due to money we lost on our apartment, bought at the peak of the boom. I’m still sore. Can you tell?]

I think they’re about to start. To make this easier, and to attempt to diagnose any attempt at the gish gallop, I’ll use the numbers from Skeptical Science whenever I hear a long debunked denialist talking point. By the way, there’s a live feed if you’re interested.

First up. Peter Munk is introducing the event. He’s pointing out that the four debaters are the “rock stars” of climate change, and they have travelled from all over the world to the “little town” of Toronto. [Dunno about that. There are no scientists among them. Surely the science matters here?]

Oh cool, I just discovered Dave Roberts is liveblogging too.

At the beginning, 61% of the audience of 1100 people support the proposition, but 79% of the audience said they could potentially change their mind over the course of the debate. Seven minutes each for opening statements.

First speaker is Nigel Lawson. He agrees it’s an important issue, and one that is seldom properly debated. He claims it’s a religion, and that people like Gore will not debate and will not tolerate dissent. He’s separated the issue from environmentalism, and framed it as a policy question. He claims that most climate scientists don’t even support the proposition. He cites a survey in which just 8% of scientists said that global warming was the most important issue facing humanity [is this an attempt to invoke SS3? Maybe not – see comments]. [Oh, SS8!]. And he’s called for an enquiry into the CRU affair. Okay, now he’s picking apart the IPCC report. Now he’s trying to claim that, economically, global warming doesn’t matter, even at the upper end of the IPCC’s temperature anomaly forecast. And now he’s onto the Lomborg argument that the fastest possible global economic growth is needed to lift the third world out of poverty, which must be based on the cheapest form of energy [by which he presumably means the dirtiest]. And he’s also arguing that mankind will always adapt to changing climate.

Okay, he’s run out of time. Summary: he thinks the proposition is scientifically unfounded and morally wrong.

Next up: Elizabeth May. The clock ran over on Lawson’s time, and the moderator credited the extra time to May, so she kicked off with a good joke about an innovative use of cap-and-trade. She is grieved that in the year 2009 we’re still asking whether we should act, and whether this is the defining threat. She says we should have been talking tonight about how to reach the targets that have been set for us by the scientific community, not about whether we should do it [good point, except that the proposition is about “mankind’s defining crisis”, not whether we should tackle climate change]. She’s covering some of the history, including the 1988 Toronto conference on climate change, and its conclusion that the threat of climate change was second only to global nuclear war. And now a dig at Lawson, who served under prime minister Margaret Thatcher, who fully understood 19 years ago that the science was clear. We know we have changed the chemistry of the atmosphere. This year there is 30% more CO2 in the atmosphere than at any time in the last few million years [this is great – she’s summarizing the scientific evidence!]. She’s pointing out that the CRU emails are irrelevant: there was no dishonesty, just decent scientists being harassed. And anyway, their work is only one small strand of the scientific evidence. Quick summary of the indicators: melting glaciers, melting polar ice, sea level rising 80% faster than the last IPCC projection. Since the Kyoto protocol, political will has evaporated. Now we’ve run down the clock, and there’s very little time to act. [big applause! The audience likes her].

Next up: Bjorn Lomborg. Human nature is a funny thing – we’re not able to take anything seriously unless it’s hyped up to be the worst thing ever. He’s claiming the “defining crisis” framing is so completely over the top that it only provokes extremist positions in response, so he thinks polarization is not helpful. He’s listing the numbers of people living without food, shelter, water, and medicines to cure diseases; hence, he can’t accept that global warming compares to these pressing issues of today. Oh, he’s an eloquent speaker. He thinks we lack the political will because we’re barking up the wrong tree. [False dichotomy about to appear…] We cannot focus on third world poverty if we focus on climate change. He thinks the cure for climate change is much more expensive than the problem, and that’s why the political will is missing. He thinks solar panels are not going to matter today; only once they are cheap enough will everyone just put them up anyway. Hence we need lots more research and development instead, not urgent policy changes. So we need to stop talking about “cuts, cuts, cuts” in emissions, and find a different approach. He says global warming is definitely a problem, but it’s not the most important thing, and while we will have to solve it this century, we should be smarter about how we fix it. And he summarizes by saying there are many other challenges that are just as important.

Monbiot: Hidden in the statement is a question: how lucky do you feel? Lawson and Lomborg obviously feel lucky, because they don’t think we should prepare for the worst case, and what they are advocating does not even address the most optimistic scenario of the IPCC. He’s making fun of Lawson, who he says has single-handedly rumbled those scientists and caught them at it: whereas Lawson says warming has stopped, the scientists instead show that 8 of the last 10 years are the warmest in recorded history. Now he’s holding up a blank sheet of paper, saying it’s the sum total of Lawson’s research on the science! And he’s demolishing the economic arguments of Lawson and Lomborg, countering with the extensive research of the Stern report, which found that the cost of fixing the problem amounts to 1% of GDP, while the cost of not fixing it amounts to 5% to 20% of GDP [pity he didn’t use my favourite quote from Stern: climate change is the greatest and widest-ranging market failure ever]. So who do you believe: Stern’s extensive research, or Lawson’s belief in luck? And he’s attacking the argument that we can adapt, especially in the poorer parts of the world – he’s pointing out that these people suffer the most from the effects (he’s familiar with the impact of climate change droughts in the Horn of Africa, where he worked for a while). The best adaptation technology there is the AK47: when the drought hits, the killing begins. [Bloody hell, he’s good at this stuff]. Now he’s pointing out Lomborg’s false dichotomy – money for fixing climate change doesn’t have to come out of foreign aid budgets. To the question of whether we should invest in fixing climate change or in development, the answer is ‘yes’ [laughter]. So this is not a time for cheap political shots [much laughter, as he acknowledges that’s what he’s been doing], then says “but we can do that because we’re on the side of the angels!”. [Hmmm. Not sure about that line – it’s a bit arrogant].

Okay, now the rebuttals [although I think Monbiot’s opening statement was already a great rebuttal].

Lawson first: he disagrees with everything the other side said. He says Monbiot is incapable of distinguishing a level from a trend: if a population grew and then stopped growing, you could still say the population is at the highest level it has ever been. [and so the inference is that we’re currently at the peak of a warming trend. That’s a really stupid thing for an economist to argue.] He’s trying to claim that most scientists admit there has been no warming, and (oh, surely not) he’s using the CRU emails to back this up – they are embarrassed they can’t explain the lack of warming. [pity Lawson doesn’t know that the quote is about something quite different!]. He’s trying to discredit Stern now by citing other economists who disagree. He’s claiming Stern was asked to write a report to support existing government policies (and that the report isn’t even peer-reviewed).

May up next: In terms of years on record, she’s pointing out that year-on-year comparisons are not relevant, and scientists know this: we’re dealing with very large systems. Oh, and she’s pointed out the problem of ocean acidification, which she says Lomborg and Lawson ignore. She’s looked at the CRU emails, and she’s read them all. She’s pointing out that, like Watergate, what was stolen is irrelevant; what matters here is who stole them and why. She’s citing the differences between the Hadley data and NASA data, pointing out that the scientists are speculating about these differences, which are due to lower coverage of the Arctic in the Hadley dataset. She’s quoting from specific emails with references, and she’s checked with the IPCC scientists (including U of T’s own Dick Peltier) and they have no doubt about the trend decade-on-decade. [Nice to see local researchers getting a mention for the local U of T audience – it’s a great stump speech tactic]

Monbiot’s up again, and the moderator has asked him to address the issues about the Stern report. Monbiot points out that having accused the scientists of fraud, Lawson is now implying that the UK government was trying to commit suicide, if it’s true (as Lawson asserts) they had demanded the results Stern offered. Far from confirming the government’s position, it put the wind up the government. And he’s trying to drive a knife between L and L, by pointing out they have different positions on whether there has been warming this century.

Lomborg: Claims that the Stern report is an extremist view among economists, and that the UK government had approached two other economists before Stern but didn’t get the answer it wanted. And he’s trying to claim that because Stern’s work was a review of the science rather than original research, that makes it less credible [huh?? That’s got to be a candidate for stupidest claim of the evening]. So now he’s saying that Monbiot would have it that thousands of scientists support the IPCC reports, while ignoring the fact that thousands of climate economists disagree with Stern’s numbers.

Now the moderator is back to Lawson, and asking about the insurance issue: Doesn’t it make sense to insure ourselves against the worst case scenario? Lawson says this is not really like insurance, because it’s not compensation we’d be after [okay, good point]. He says it’s like proposing to spend more money on fireproofing the house than the house is worth [wait, what?? He thinks the world is worth less than the cost of mitigating climate change??]. Clearly Lawson thinks the cost-benefit trade-off isn’t worth it.

May: a lot of people in the developing world are extremely concerned about the effects of climate change. Oh, great dig at Bjorn: she’s pointing out that Lomborg only argues against climate change action, but never argues in favour of development spending in the third world, and is therefore a huge hypocrite [yes, she called him that to his face]. And the climate crisis is making AIDS worse in Africa every day. Lomborg is trying to butt in, but the moderator is calling for civility – he’s given them a time-out!!! May isn’t accepting it! [Well, that was exciting!].

Monbiot: we spend very little on foreign aid, and he would like to see us spend more. And he points out that climate change makes things worse, because drought causes men to leave the land and move to the cities, where they meet more prostitutes, and then bring AIDS back to their families (this is according to Oxfam). Just to maintain global energy supplies (from fossil fuels) between now and 2030, we need to spend US$25 trillion. And the transfer to the oil-rich nations in the process will be $30 trillion. So it isn’t a case of whether or not we spend money on fighting climate change. It’s a question of what investments we will make in which forms of energy in the future. And he’s pointed out that peak oil might mean we simply cannot carry on depending on fossil fuels anyway.

Lawson again. The moderator asked him to comment on whether there are beneficial effects of investing in alternative energy. He briefly admits there might be, but then ignores the question. He’s trying to debunk peak oil by pointing out that when he was energy secretary in the ’80s, experts told him we only had 40 years of oil supplies left, and they still say that today, and they always say that. And he thinks global agreement will never happen anyway, because China will never agree to move to more expensive energy sources, and is busy buying up oil supplies from all the surrounding countries.

The moderator has cut off discussion of peak oil, and wants to talk about the tipping point for CO2 concentrations in the atmosphere: 450ppm? 500ppm? Lomborg first. He accepts we’re going to see a continued rise in concentrations. He’s pointing out the difference between intensity cuts and real cuts [sure, but where is this going? Oh, I see…]. He claims that because China realised it would achieve the 40% intensity cuts anyway just through efficiency gains, it could claim to have set aggressive targets and meet them by doing nothing different, while everyone applauds it for making progress on emissions [I’m still not sure of the point – I don’t remember anyone heaping praise on China for emissions progress, given how many new coal-fired power stations they are building]. He’s now saying that if Monbiot says we should also spend on development then he’s moving over to their side, because climate change is no longer the defining crisis, just one of many. And he’s pointing out that fighting climate change is a poor way to fight AIDS, compared to, say, handing out condoms.

May: points out that Lomborg puts forward strawman arguments and false choices. The people arguing for action on climate change *are* the people calling for more efficient technology, new alternative energy sources, etc. Whereas Lomborg would put this up as a false dichotomy. She’s pointing out that many of the actions actually have negative cost, especially changes that come from efficiency gains. In Canada, we waste more energy than we use. She says the problem with debating Lomborg is that he quotes some economists and then ignores what else they say (and she’s waving Lomborg’s book around and quoting directly out of it).

Monbiot again, and the moderator is asking about potential benefits from rising temperatures, e.g. for Canada. Monbiot says that in the IPCC report, beyond 3°C of warming, we get a “net decrease in global food production”. Behind these innocent-sounding words in the IPCC report is a frightening problem. 800 million people already go hungry. If there’s a net decrease in food production, we are moving to a structural famine situation, which makes all the other issues look like sideshows in the circus of human suffering [Nice point!]. So, let’s not make false choices: we need to deal with all these issues, but if we don’t tackle climate change, it makes all the other development issues far worse. In Africa, 2°C of warming is catastrophic, and these are people who are not responsible in any way for climate change. The cost for them isn’t in dollars, it’s in human lives, and you can’t put a price on that. You can’t put that in your cost-benefit analysis. Human life must come first.

Lomborg: we all care about other species on the planet and about human life. But he’s arguing that we can save species more effectively by making more countries rich so they don’t have to cut down their forests [I’m hopping up and down at the stupidity of this! Why does he think the Amazon rainforest is disappearing?!], rather than by fighting climate change. Okay, now he’s arguing that cutting emissions is futile because it will make very little difference to the warming that we experience, so it’s better not to do it, and go for fast economic growth instead. He’s claiming that economic development is much more effective than emissions reduction [again with these false dichotomies!!], and that each dollar spent on climate change mitigation would save more lives if spent directly on development.

Okay, now he’s going to the audience for questions. Or apparently not: Lawson wants to say something. The great killer is poverty. Whereas economic aid helps a little bit, what really helps is economic development. He’s arguing that forcing people to rely on more expensive energy slows down development. Now he’s arguing with Monbiot’s point about a net reduction in food production after a 3°C rise. He’s saying that food production will rise up to 3°C, and after that will still be higher than today, but will not rise further [This is utter bollocks. He’s misunderstood the summary for policymakers, and failed to look at the graphs on page 286 of AR4 WG2]. He says the IPCC also says, on the topic of health, that the only health outcome the IPCC regards as virtually certain is the reduction in deaths from cold exposure [Oh, stupid, stupid, stupid. He’s claiming that the certainty factors are more important than the number of different types of impact. How does he think he can get away with this crap?].

Monbiot again, and the moderator is asking what’s the best way to lift people out of poverty. Monbiot points out that in Africa it’s much cheaper to build solar panels than to build an energy infrastructure based on bringing in oil. You can help people to escape from poverty without having to mine fossil fuels, and thereby threaten the very lives we’re trying to protect. And now he’s citing the actual table in the IPCC report to prove Lawson wrong. He’s pointing out that to an economist, everything is flexible: if you want more food, you just change the price signals. But if the rains stop, you can’t get more food by changing price signals, because nature doesn’t pay any attention to the economy. E.g. a recent Hadley study showed that 2.1 million extra people will be subjected to water stress at a 2°C rise, and these people can’t be magic’d away by fiddling with a spreadsheet. Climate change isn’t about the kinds of choices that L&L are suggesting.

And the moderator is inviting May to add any last comments before the wrap-up. She says the problem with this discussion is that we haven’t established the context for why action is so urgent. The climate crisis is setting in motion some fundamental new processes in earth systems, and the question is when we can stabilize carbon concentrations so that the temperature rise stops, giving us a chance to adapt (and she thinks adaptation is just as important). Only one of the issues we face on the planet today moves in an accelerating fashion, unleashing positive feedback effects: e.g. methane released from the melting permafrost; the impact on pine forests of increasing insect activity, which releases more carbon as the trees decay; the decreased albedo when the polar ice melts. Good point: she cites the work of Stephen Lewis, who has done far more than Lomborg to address poverty, and who agrees that climate change is an urgent issue.

Now, final wrap up, 4 minutes each, opposite order to the opening remarks:

Monbiot: He’s concerned about climate change because of his experience in Kenya. In 1992, when he was there, they were suffering their worst drought to date. They had run out of basic resources, and the only thing they could do was raid neighbouring tribes for resources. Monbiot was supposed to visit a cattle camp, but collapsed with malaria and was taken off to hospital, and it was the luckiest thing in his life, because when he finally made it to the place a few weeks later, the cattle camp he was supposed to have visited had been totally destroyed – all that was left of the 93 people who lived there were their skulls – shot in the night by raiders who were desperate because of the drought, which was almost certainly due to climate change. This is what it’s really about – not spreadsheets and figures, but life and death. This is what switched Monbiot on to climate change. All our work fighting for social justice and fighting poverty will have been in vain if we don’t stop climate change. All the development agencies – Oxfam, etc. – who are on the front line of this are telling us that climate change is mankind’s defining crisis.

Lomborg: Nobody doubts that everyone here has their heart in the right place. However, he’s arguing that it’s not clear these people are suffering because of global warming. Rather than reducing drought by some small percentage by the end of the century, we should make sure they get development now. He’s arguing against Monbiot’s water stress numbers. He’s claiming that “George and Elizabeth” have moved over to his side (they’re violently shaking their heads!). He’s claiming that when Elizabeth supports investment in clean energy, she’s come over to his side! His core argument is that the best we can do is postpone global warming by six hours at the end of the century [This is truly a bizarre claim. Where does he get this from?]. So how do we want to be remembered by our kids: for spending trillions of dollars on something that was not effective, or for working on economic development now?

May: We’ve seen lots of theatre this evening, but the issues are serious. She says Lomborg plays with numbers and figures in a way she finds deplorable. The scientists have solid science that compelled people like Brian Mulroney and Margaret Thatcher to call for action. And somehow we’ve lost that momentum. She’s pointing out the flaw in Lomborg’s argument about water – if the average amount of water is the same, that’s no good if it’s an average over periods of drought and deluge. She’s raised ocean acidification again: how will we feed the world’s people if we kill off life in the ocean? She’s talking about the GRACE project (Dick Peltier gets a mention again) monitoring the West Antarctic ice sheet, and how it is melting now. If it melts, we get a nine metre sea level rise, and no economist can calculate the cost of that. And she’s giving a nice extended analogy about how if the theatre really is on fire, you don’t listen to people trying to reassure everyone and tell them to stay in their seats.

Lawson: Why aren’t scientists pleased there hasn’t been warming over the last few years? [SS4 again. How does he think he’ll get away with this?] They’re upset about it rather than being pleased! [CRU misinterpretation again]. Again, on the water issue – if you get cycles of drought and deluge, you capture the water, and solve the real problem rather than the climate change problem [Oh, this is just stupid. You patch the effects rather than tackling the cause??]. He’s now saying that May and Monbiot have the best rhetoric, but there’s a gap between politicians’ rhetoric and the reality. And in all his career he’s never seen such a gap between politicians’ rhetoric and what they are actually doing as on the topic of climate change. And he claims it’s because the cost is so great that there’s no way they can go along with what the rhetoric says [Oh, surely this is an own goal? The gap is so big exactly because this is a ‘defining’ crisis!]. He doesn’t believe in rhetoric, he believes in reason [LOL], in working out what it is sensible to do.

Moderator: it’s one thing to give a set speech, and quite another to come onto a stage and confront one another’s views in this type of forum. He’s calling for a vote from the audience. Pre-debate, 61% supported the motion. They will collect the results on the way out and announce them shortly after 9pm tonight. And now he’s invited the audience to move to the reception. Okay, I guess that’s it for now.

Update: The results show that some people were swayed against the proposition: still a majority in favour, but now down to 56% after the debate, with 1050 votes cast.

Okay, time for some quick reflections. Liveblogging debates is much harder than liveblogging scientific talks – no powerpoints, and they go much, much faster. I’m typing so fast I can’t reflect, but at least it means I’m focussing on what they’re saying rather than drifting off on tangential thoughts…

I think the framing of the debate was all wrong in hindsight. The proposition that it’s “mankind’s defining crisis” allows Lomborg to say that there’s all this other development stuff that’s important too (although May managed to call him a hypocrite on that, rightly so), and then get Monbiot and May to say that of course they support development spending in the poorer parts of the world as well, which then lets Lomborg come back with the rejoinder that the proposition must be wrong because even Monbiot and May agree we have many different major problems to solve. Of course, this is all a rhetorical trick, which would allow him to claim he won the debate – he even tried twice to claim M & M had moved over to his side. Meanwhile in the real world all these rhetorical tricks make no difference because the science hasn’t changed, and the climate change problem still has the capacity to spiral out of control and, as Monbiot points out, swamp all our other problems. And there’s Lawson at the end claiming he doesn’t believe in rhetoric, he believes in reason, all the while misquoting and misrepresenting the science. I actually think Lawson was an embarrassment, while Lomborg was pretty effective – I can see why lots of people who don’t know the science that well are taken in by his arguments.

Ultimately I’m disappointed there was so little science. May did a great job summarizing a lot of the science issues, but everyone else just ignored them. I doubt this debate changed anyone’s minds. And the conclusion: A majority of the audience agreed with the proposition that climate change is mankind’s defining crisis. I.e. not just that it’s important, and we need action, but the whole issue really is a massive game-changer.

I wasn’t going to post anything about the CRU emails story (apart from my attempt at humour), because I think it’s a non-story. I’ve read a few of the emails, and it looks no different to the studies I’ve done of how climate science works. It’s messy. It involves lots of observational data sets, many of which contain errors that have to be corrected. Luckily, we have a process for that – it’s called science. It’s carried out by a very large number of people, any of whom might make mistakes. But it’s self-correcting, because they all review each other’s work, and one of the best ways to get noticed as a scientist is to identify and correct a problem in someone else’s work. There is, of course, a weakness in the way such corrections are usually done to a previously published paper. The corrections appear as letters and other papers in various journals, and don’t connect up well to the original paper. Which means you have to know an area well to understand which papers have stood the test of time and which have been discredited. Outsiders to a particular subfield won’t be able to tell the difference. They’ll also have a tendency to seize on what they think are errors, but actually have already been addressed in the literature.

If you want all the gory details about the CRU emails, read the RealClimate posts (here and here) and the lengthy discussion threads that follow from them. Many of the scientists involved in the CRU emails comment in these threads, and the resulting picture of the complexity of the data and analysis that they have to deal with is very interesting. But of course, none of this will change anyone’s minds. If you’re convinced global warming is a huge conspiracy, you’ll just see scientists trying to circle the wagons and spin the story their way. If you’re convinced that AGW is now accepted as scientific fact, then you’ll see scientists tearing their hair out at the ignorance of their detractors. If you’re not sure about this global warming stuff, you’ll see a lively debate about the details of the science, none of which makes much sense on its own. Don’t venture in without a good map and compass. (Update: I’ve no idea who’s running this site, but it’s a brilliant deconstruction of the allegations about the CRU emails).

But one issue has arisen in many discussions about the CRU emails that touches strongly on the research we’re doing in our group. Many people have looked at the emails and concluded that if the CRU had been fully open with its data and code right from the start, there would be no issue now. This is, of course, a central question in Alicia‘s research on open science. While in principle open science is a great idea, in practice there are many hurdles, including the fear of being “scooped”, the need to give appropriate credit, the problems of metadata definition and data provenance, the cost of curation, the fact that software has a very short shelf-life, and so on. For the CRU dataset at the centre of the current kerfuffle, someone would have to go back to all the data sources and re-negotiate agreements about how the data can be used. Of course, the anti-science crowd just think that’s an excuse.

However, for climate scientists there is another problem, which is the workload involved in being open. Gavin, at RealClimate, raises this issue in response to a comment that just putting the model code online is not sufficient. For example, if someone wanted to reproduce a graph that appears in a published paper, they’ll need much more: the script files, the data sets, the parameter settings, the spinup files, and so on. Michael Tobis argues that much of this can be solved with good tools and lots of automated scripts. Which would be fine if we were talking about how to help other scientists replicate the results.
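To give a flavour of what those tools and scripts would have to capture, here’s a minimal sketch of a “run manifest” recorder in Python. Everything here is hypothetical – the file names, the parameters, and the idea that a single JSON file would be enough – but it shows the kind of bookkeeping involved in making even one published figure reproducible:

```python
import hashlib, json, subprocess, sys
from datetime import datetime, timezone
from pathlib import Path

def sha256(path: Path) -> str:
    """Content hash, so others can check they have byte-identical inputs."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def capture_manifest(script: str, data_files: list, params: dict) -> dict:
    """Record what's needed to re-run one analysis: code version,
    input data hashes, parameter settings, and the environment."""
    commit = subprocess.run(["git", "rev-parse", "HEAD"],
                            capture_output=True, text=True).stdout.strip()
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "script": script,
        "code_version": commit,     # assumes the analysis lives in a git repo
        "python": sys.version,
        "parameters": params,
        "inputs": {f: sha256(Path(f)) for f in data_files},
    }

# Hypothetical example: the analysis behind one figure in one paper
manifest = capture_manifest(
    script="plot_temperature_anomaly.py",   # made-up script name
    data_files=["station_data.nc"],         # made-up input dataset
    params={"baseline": "1961-1990", "smoothing_years": 5},
)
Path("figure3_manifest.json").write_text(json.dumps(manifest, indent=2))
```

Even this toy version hints at where the workload comes from: every figure in every paper needs its own manifest, plus the spinup files, the scripts, and someone willing to maintain all of it.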

Unfortunately, in the special case of climate science, that’s not what we’re talking about. A significant factor in the reluctance of climate scientists to release code and data is to protect themselves from denial-of-service attacks. There is a very well-funded and PR-savvy campaign to discredit climate science. Most scientists just don’t understand how to respond to this. Firing off hundreds of requests to CRU to release data under the freedom of information act, despite each such request being denied for good legal reasons, is the equivalent of frivolous lawsuits. But even worse, once datasets and codes are released, it is very easy for an anti-science campaign to tie the scientists up in knots trying to respond to their attempts to poke holes in the data. If the denialists were engaged in an honest attempt to push the science ahead, this would be fine (although many scientists would still get frustrated – they are human too).

But in reality, the denialists don’t care about the science at all; their aim is a PR campaign to sow doubt in the minds of the general public. In the process, they effect a denial-of-service attack on the scientists – the scientists can’t get on with doing their science because their time is taken up responding to frivolous queries (and criticisms) about specific features of the data. And their failure to respond to each and every such query will be trumpeted as an admission that an alleged error is indeed an error. In such an environment, it is perfectly rational not to release data and code – it’s better to pull up the drawbridge and get on with the drudgery of real science in private. That way the only attacks are complaints about lack of openness. Such complaints are bothersome, but much better than the alternative.

In this case, because the science is vitally important for all of us, it’s actually in the public interest that climate scientists be allowed to withhold their data. Which is really a tragic state of affairs. The forces of anti-science have a lot to answer for.

Update: Robert Grumbine has a superb post this morning on why openness and reproducibility are intrinsically hard in this field.


(with apologies to Maurice Sendak)

The night Mike wore his lab coat,
and made scientific discoveries of one kind and another,
the denialists called him a fraudster
and Mike said: “I’ll prove you wrong!”
So they sent his emails to the media without any context.
That very night on his blog,
a jungle of obfuscating comments grew,
and grew,
until the discussions became enflamed,
and spilled out into the internet all around.
And an ocean of journalists tumbled by,
with a high public profile for Mike,
and he argued away through night and day,
and in and out of weeks,
and almost over a year,
everywhere the denialists are.
And whenever he came to a place where the denialists are,
they talked their terrible talking points,
and quoted their terrible quotes,
and showed their terrible cherry picking,
and demonstrated their terrible ignorance.
Until Mike said “Be still!”
and tamed them with his Nature trick
of showing them actual data sets without blinking once.
And they were baffled, and called him the biggest denialist of all,
and declared him king of the denialists.
“And now”, said Mike, “let the data speak for itself.”
And he sent the denialists off to look at the evidence without their talking points.
Then Mike, the king of all denialists said,
“I’m lonely”,
and wanted to be where someone actually appreciated rational discussion.
Then all around, from far away, across the world
He saw evidence of good solid scientific work.
So he said “I’ll give up arguing with the denialists”
But the denialists cried
“Oh please don’t stop – we’ll eat you up, we need you so”.
And Mike said “No!”
And the denialists talked their terrible talking points,
and quoted their terrible quotes,
and showed their terrible cherry picking,
and demonstrated their terrible ignorance.
But Mike stepped back into his research, and waved goodbye.
And worked on, almost over a year and in and out of weeks and through a day,
in the sanity of his very own lab.
Where he found his peer-reviewed papers waiting for him.
And they were all accepted.

(PS, if you’ve no idea what this is referring to, trust me, you’re better off not knowing)

Update: I should have added a link to an even better humorous response! And this one too!

Brad points out that much of my discussion for a research agenda in climate change informatics focusses heavily on strategies for emissions reduction (aka Mitigation) and neglects the equally important topic of ensuring communities can survive the climate changes that are inevitable (aka Adaptation). Which is an important point. When I talk about the goal of keeping temperatures to below a 2°C rise, it’s equally important to acknowledge that we’ve almost certainly already lost any chance of keeping peak temperature rise much below 2°C.

Which means, of course, that we have some serious work to do, in understanding the impact of climate change on existing infrastructure, and to integrate an awareness of the likely climate change issues into new planning and construction projects. This is, of course, what Brad’s Adaptation and Impacts research division focusses on. There are some huge challenges to do with how we take the data we have (e.g. see the datasets in the CCCSN), downscale these to provide more localized forecasts, and then figure out how to incorporate these into decision making.
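To see why downscaling is more than a technical footnote, here’s a toy sketch in Python of the naive baseline – plain interpolation of a coarse model grid to a point of interest – which is exactly what real statistical downscaling methods have to improve on. All the grid values here are fabricated:

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Toy "downscaling": interpolate a coarse temperature-anomaly field to one
# location. Real downscaling must also correct for topography, land use,
# lake effects, etc. -- interpolation alone captures none of that.
coarse_lat = np.linspace(42, 50, 5)       # a ~2-degree model grid
coarse_lon = np.linspace(-84, -74, 6)
rng = np.random.default_rng(0)
anomaly = rng.normal(2.0, 0.5, size=(5, 6))   # fabricated ~+2C anomaly field

interp = RegularGridInterpolator((coarse_lat, coarse_lon), anomaly)

# A "localized projection" for a Toronto-ish point: smoothly blended from
# neighbouring grid cells, but blind to anything at sub-grid scale.
print(interp([[43.7, -79.4]]))
```

The hard part, of course, is everything the interpolation ignores – which is why incorporating downscaled projections into planning decisions needs so much care.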

One existing tool to point out is the World Bank’s ADAPT, which is intended to help analyze projects in the planning stage, and identify risks related to climate change adaptation. This is quite a different decision-making task from the emissions reduction decision tools I’ve been looking at. But just as important.

Yesterday, I posted that the total budget of fossil fuel emissions we can ever emit is 1 trillion tonnes of Carbon. And that we’ve burnt through about half of that since the dawn of industrialization. Today, I read in the Guardian that existing oil reserves may have been deliberately overestimated by the International Energy Agency. George Monbiot explains how frightening this could be, given the likely impact of lower oil supplies on food production. Madeleine Bunting equates the reluctance to discuss this with the head-in-the-sand attitude that preceded last year’s financial crisis. Looks like the more pessimistic of the peak oil folks may have had it right all along.

None of these articles however makes the link to climate change (Monbiot only mentions it in passing in response to comments). So, which problem is bigger, peak oil or climate change? Does one cancel out the other? Should I stop worrying about the trillionth tonne, if the oil simply doesn’t exist to get there?

A back-of-the-envelope calculation tells me that more than half of the world’s estimated remaining reserves of fossil fuels have to stay buried in the ground if we are to stay within a trillion tonnes. Here are the numbers:

  • Oil: The Energy Watch Group estimates there are 854 Gb (gigabarrels) of oil left, while official industry figures put it at well over 1,200 Gb. Let’s split the difference and say 1,000 Gb (10^12 barrels). Jim Bliss calculates that each barrel of crude oil releases about 100kg of carbon. That gives us 0.1 trillion tonnes of Carbon from oil.
  • Coal: Wikipedia tells us there are just under 1 trillion tonnes of proved recoverable coal reserves, and that coal has a carbon intensity of about 0.8, so that gives us 0.8 trillion tonnes of Carbon from coal.
  • Natural Gas: The US EIA puts the world’s natural gas reserves at somewhat over 6,000 trillion cubic feet, which converts to about 170 trillion cubic meters. Each cubic meter yields about 0.5kg of Carbon, so we have 85 trillion kg, or about 0.085 trillion tonnes of Carbon from gas.

That all adds up to about 1 trillion tonnes of carbon from estimated fossil fuel reserves, the vast majority of which is coal. If we want a 50:50 chance of staying below 2ºC temperature rise, we can only burn half this much over the next few centuries. If we want better odds, say a 1-in-4 chance of exceeding 2ºC, we can only burn a quarter of it.
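For anyone who wants to fiddle with the assumptions, here’s the same back-of-the-envelope arithmetic as a few lines of Python. The constants are just the rounded estimates above, nothing more precise:

```python
# Carbon content of estimated fossil fuel reserves (rounded figures from above)
BARRELS_OF_OIL   = 1.0e12   # ~1,000 Gb, splitting the difference
KG_C_PER_BARREL  = 100      # Jim Bliss's estimate
COAL_TONNES      = 1.0e12   # ~1 trillion tonnes proved recoverable
COAL_C_INTENSITY = 0.8      # tonnes of carbon per tonne of coal
GAS_M3           = 170e12   # ~6,000 trillion cubic feet, in cubic meters
KG_C_PER_M3_GAS  = 0.5

oil_c  = BARRELS_OF_OIL * KG_C_PER_BARREL / 1000   # kg -> tonnes of carbon
coal_c = COAL_TONNES * COAL_C_INTENSITY
gas_c  = GAS_M3 * KG_C_PER_M3_GAS / 1000

total = oil_c + coal_c + gas_c
print(f"Oil:   {oil_c/1e12:.3f} trillion tonnes C")
print(f"Coal:  {coal_c/1e12:.3f} trillion tonnes C")
print(f"Gas:   {gas_c/1e12:.3f} trillion tonnes C")
print(f"Total: {total/1e12:.3f} trillion tonnes C")

# We've already burnt ~0.5 of the 1 trillion tonne budget, so for a 50:50
# chance of staying under 2C, only this fraction of reserves can be burnt:
remaining_budget = 0.5e12
print(f"Burnable fraction of reserves: {remaining_budget/total:.0%}")
```

Coal, clearly, is where most of the carbon sits.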

Conclusion: More than one half of all remaining fossil fuel reserves must remain unused. So peak oil and peak coal won’t save us. I would even go so far as to say that the peak oil folks are only about half as worried as they should be!

Our paper, Engineering the Software for Understanding Climate Change finally appeared today in IEEE Computing in Science and Engineering. The rest of the issue looks interesting too – a special issue on software engineering in computational science. Kudos to Greg and Andy for pulling it together.

Update: As the final paper is behind a paywall, folks might find this draft version useful. The final published version was edited for journal house style, and shortened to fit page constraints. Needless to say, I prefer my original draft…