The highlight of the whole conference for me was the Wednesday afternoon session on Methodologies of Climate Model Confirmation and Interpretation, and the poster session the following morning on the same topic, at which we presented Jon’s poster. Here are my notes from the Wednesday session.

Before I dive in, I will offer a preamble for people unfamiliar with recent advances in climate models (or more specifically, GCMs) and how they are used in climate science. Essentially, these are massive chunks of software that simulate the flow of mass and energy in the atmosphere and oceans (using a small set of physical equations), and then couple these to simulations of biological and chemical processes, as well as human activity. The climate modellers I’ve spoken to are generally very reluctant to have their models used to generate predictions of future climate – the models are built to help improve our understanding of climate processes, rather than to make forecasts for planning purposes. I was rather struck by the attitude of the modellers at the Hadley Centre at the meetings I sat in on last summer in the early planning stages for the next IPCC reports – basically, it was “how can we get the requested runs out of the way quickly so that we can get back to doing our science”. Fundamentally, there is a significant gap between the needs of planners and policymakers for detailed climate forecasts (preferably with the uncertainties quantified), and the kinds of science that the climate models support.

Climate models do play a major role in climate science, but sometimes that role is over-emphasized. Hansen lists climate models third in his sources of understanding of climate change, after (1) paleoclimate and (2) observations of changes in the present and recent past. This seems about right – the models help to refine our understanding and ask “what if…” questions, but are certainly only one of many sources of evidence for AGW.

Two trends in climate modeling over the past decade or so are particularly interesting: the push towards higher and higher resolution models (which thrash the hell out of supercomputers), and the use of ensembles:

  • Higher resolution models (i.e. resolving the physical processes over a finer grid) offer the potential for more detailed analysis of impacts on particular regions (whereas older models focussed on global averages). The difficulty is that higher resolution requires much more computing power, and the higher resolution doesn’t necessarily lead to better models, as we shall see…
  • Ensembles (i.e. many runs of either a single model, or of a collection of different models) allow us to do probabilistic analysis, for example to explore the range of probabilities of future projections. The difficulty, which came up a number of times in this session, is that such probabilities have to be interpreted very carefully, and don’t necessarily mean what they appear to mean.
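
As a concrete illustration of how ensemble output gets turned into probabilities (and why those probabilities need careful interpretation), here’s a minimal sketch in Python, using invented numbers rather than any real model output:

```python
import numpy as np

# Hypothetical ensemble: projected warming by 2100 (deg C) from 20 runs.
# These values are invented for illustration; they are not real model output.
rng = np.random.default_rng(0)
ensemble = rng.normal(loc=3.0, scale=0.5, size=20)

# The "probabilistic forecast" is just the empirical spread of the runs.
p05, p50, p95 = np.percentile(ensemble, [5, 50, 95])
prob_above_4 = (ensemble > 4.0).mean()
print(f"5th-95th percentile range: {p05:.1f} to {p95:.1f} C (median {p50:.1f} C)")
print(f"Fraction of runs above 4 C: {prob_above_4:.2f}")

# Caveat: this distribution only reflects the spread of the runs we happened to
# do (sampled initial conditions, parameters, or models). Structural errors
# shared by all the runs are invisible to it, which is why the real outcome can
# fall well outside these intervals -- the "big surprises" problem below.
```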

Much of the concern is over the potential for “big surprises” – the chance that actual changes in the future will lie well outside the confidence intervals of these probabilistic forecasts (to understand why this is likely, you’ll have to read on to the detailed notes). And much of the concern is with the potential for surprises where the models dramatically under-estimate climate change and its impacts. Climate models work well at simulating 20th Century climate. But the more the climate changes in the future, the less certain we can be that the models capture the relevant processes accurately. Which is ironic, really: if the climate wasn’t changing so dramatically, climate models could give very confident predictions of 21st century climate. It’s at the upper end of projected climate changes where the most uncertainty lies, and this is the scary stuff. It worries the heck out of many climatologists.

Much of the question is to do with adequacy for answering particular questions about climate change. Climate models are very detailed hypotheses about climate processes. They don’t reproduce past climate precisely (because of many simplifications). But they do simulate past climate reasonably well, and hence are scientifically useful. It turns out that investigating areas of divergence (either from observations, or from other models) leads to interesting new insights (and potential model improvements).

Okay, with that as an introduction, on to my detailed notes from the session (be warned: it’s a long post).

I’m still only halfway through getting my notes from the AGU meeting turned into blog posts. But the Christmas vacation intervened. I’m hoping to get the second half of the conference blogged over the next week or so, but in the meantime, I thought I’d share these tidbits:

  1. A kid’s-eye view of the AGU. My kids (well, 2/3 of them) accompanied me to the AGU meeting and had great fun interviewing climate scientists. The interviews they videoed were great, but unfortunately the sound quality sucks (you know what noisy conference venues are like). I’ll see if we can edit them into something usable…
  2. I’m a geoblogger! On the Wednesday lunchtime, the AGU hosted a lunch for “geobloggers”, complete with a show-and-tell by each blogger. There’s a nice write-up on the AGU blog of the lunch, including a snippet of Julie’s graphical summary with Serendipity right at the centre:

geoblogging lunch

Wednesday morning also saw the poster session “IN31B – Emerging Issues in e-Science: Collaboration, Provenance, and the Ethics of Data”. I was presenting Alicia’s poster on open science and reproducibility:

Identifying Communication Barriers to Scientific Collaboration

The poster summarizes Alicia’s master’s thesis work – a qualitative study of what scientists think about open science and reproducibility, and how they use these terms (Alicia’s thesis will be available real soon now). The most interesting outcome of the study for me was the realization that innocent sounding terms such as “replication” mean very different things to different scientists. For example, when asked how many experiments in their field are replicated, and how many should be replicated, the answers are all over the map. One reason is that the term “experiment” can have vastly different meanings to different people, from a simple laboratory procedure that might take an hour or so, to a journal-paper sized activity spanning many months. Another reason is that it’s not always clear what it means to “replicate” an experiment. To some people it means following the original experimental procedure exactly to try to generate the same results, while to others, replication includes different experiments intended to test the original result in a different way.

Once you’ve waded through the different meanings, there still seems to be a range of opinion on the desirability of frequent replication. In many fields (including my field, software engineering) there are frequent calls for more replication, along with complaints about the barriers (e.g. some journals won’t accept papers reporting replications because they’re not ‘original’ enough). However, on the specific question of how many published studies should be replicated, an answer other than “100%” is quite defensible: some published experiments are dead-ends (research questions that should not be pursued further), and some are just bad experiments (experimental designs that in hindsight were deeply flawed). And then there’s the opportunity cost – instead of replicating an experiment for a very small knowledge gain, it’s often better to design a different experiment to probe new aspects of the same theory, for a much larger knowledge gain. We reflected on some of these issues in our ICSE’2008 paper On the Difficulty of Replicating Human Subjects Studies in Software Engineering.

Anyway, I digress. Alicia’s study also revealed a number of barriers to sharing data, suggesting that some of the stronger calls for open science and reproducibility standards are, at least currently, too impractical. At a minimum, we need better tools for capturing data provenance and scientific workflows. But more importantly, we need to think more about the balance of effort – a scientist who has spent many years developing a dataset needs the appropriate credit for this effort (currently, we only tend to credit the published papers based on the data), and perhaps even some rights to exploit the dataset for their own research first, before sharing. And for large, complex datasets, there’s the balance between ‘user support’ as other people try to use the data and have many questions about it, versus getting on with your own research. I’ve already posted about an extreme case in climate science, where such questions can be used strategically in a kind of denial of service attack. The bottom line is that while in principle, openness and reproducibility are important cornerstones of scientific process, in practice there are all sorts of barriers, most of which are poorly understood.

Alicia’s poster generated a huge amount of interest, and I ended up staying around the poster area for much longer than I expected, having all sorts of interesting conversations. Many people stopped by to ask questions about the results described on the poster, especially the tables (which seemed to catch everyone’s attention). I had a fascinating chat with Paulo Pinheiro da Silva, from UT El Paso, whose Cyber-Share project is probing many of these issues, especially the question of whether knowledge provenance and semantic web techniques can be used to help establish trust in scientific artefacts (e.g. datasets). We spent some time discussing what is good and bad about current metadata projects, and the greater challenge of capturing the tacit knowledge scientists have about their datasets. Also chatted briefly with Peter Fox, of Rensselaer, who has some interesting example use cases for where scientists need to do search based on provenance rather than (or in addition to) content.

This also meant that I didn’t get anywhere near enough time to look at the other posters in the session. All looked interesting, so I’ll list them here to remind me to follow up on them:

  • IN31B-1001. Provenance Artifact Identification and Semantic Tagging in the Atmospheric Composition Processing System (ACPS), by Curt Tilmes of NASA GSFC.
  • IN31B-1002. Provenance-Aware Faceted Search, by Deborah McGuinness et al. of RPI (this was the work Peter Fox was telling me about).
  • IN31B-1003. Advancing Collaborative Climate Studies through Globally Distributed Geospatial Analysis, by Raj Singh of the Open Geospatial Consortium.
  • IN31B-1005. Ethics, Collaboration, and Presentation Methods for Local and Traditional Knowledge for Understanding Arctic Change, by Mark Parsons of the NSIDC.
  • IN31B-1006. Lineage management for on-demand data, by Mary Jo Brodzik, also of the NSIDC.
  • IN31B-1007. Experiences Developing a Collaborative Data Sharing Portal for FLUXNET, by Deb Agarwal of Lawrence Berkeley Labs.
  • IN31B-1008. Ignored Issues in e-Science: Collaboration, Provenance and the Ethics of Data, by Joe Hourclé of NASA GSFC.
  • IN31B-1009. IsoMAP (Isoscape Modeling, Analysis, and Prediction), by Chris Miller of Purdue.

Picking up from my last post on communicating with policymakers, the first session on Wednesday morning was on science literacy (or rather, lack of it) in America. The session was packed full of interesting projects.

Frank Niepold from NOAA kicked off the session, presenting the results from an NSF-led study on climate literacy. They have developed some draft goals for a national strategy, and a very nice booklet. The NOAA Climate Program Office has pointers to a useful set of additional material, and there is a fuller account of the development of this initiative in the AAAS interview with Frank. An interesting point was that literacy in this initiative means actionable stuff – a sufficient understanding of climate change to result in behavioural change. The brochures are interesting and useful, but it looks like they’ve got a long way to go to make it actionable.

Connie Roser-Renouf then presented a brilliant 15 minute summary of the fascinating study Global Warming’s Six Americas: An Audience Segmentation Analysis. The study sampled 2129 people from 40,000 respondents in a nationwide survey. Latent class analysis was used to cluster people according to their beliefs, attitudes, etc. [I like this tidbit: 2% of respondents said they’d never heard of global warming]. I already posted a little about this on Sunday, but I find the study sufficiently fascinating that it’s worth looking at in more depth.

Six Americas study, figure 2

Remember, the size of the circles represents the proportion of each group in the sample, and the six groups were identified by cluster analysis on a range of attitudes. Quite clearly the vast majority of respondents understand that global warming is happening, with the dismissives being a notable outlier.

Six Americas Study, figure 12

Interesting that a majority also understand that it will be harmful to people in the US (i.e. not polar bears, not people in other countries), and pretty soon too, although there is a big difference in understanding the urgency across the different groups.

The next graph was interesting, partly because of the vast differences across the groups, but also, as Connie pointed out, because the disengaged group answered 100% “don’t know” and this never happens in social science research (which leads me to speculate that these clusters aren’t like normal social groupings with very fuzzy boundaries, but are more like religious groups, with explicitly affirmed belief sets):

Six Americas Figure 11

But of course, the scientific literacy question is about how well people understand the causes of climate change. The interesting thing about figure 8 is that many respondents wrote in an additional response mode (“caused by both human activities and natural changes”) which wasn’t in the questionnaire. If it had been, it’s likely that more people would have selected this response, and it’s arguably the most accurate scientifically, if you take a broad view of earth system processes:

Six Americas Figure 8

However, there’s a significant number of people in all groups who believe there is a lot of disagreement among scientists (they should come to the AGU meeting, and see if they can find any scientists who think that human-induced global warming is not happening!):

Six Americas, Figure 9

But note that there is a widespread trust in scientists as a source of information (even the dismissives don’t go so far as to actively distrust scientists):

6 Americas, figure 35

There was strong support for making global warming a priority for the US, but some interesting disagreement about suitable policy responses. For example, very mushy support for cap-and-trade:

Six Americas, figure 22

But very strong support for rebates on purchasing solar panels and fuel-efficient vehicles:

Six Americas, figure 21

As far as climate science literacy is concerned, the question is how to reach the groups other than the alarmed and concerned. These are the people who don’t visit science museums, and who don’t seek out opportunities to learn about the science.

The next talk, by Steven Newton of the National Center for Science Education, was equally fascinating. NCSE is one of the leading organizations fighting the creationists in their attempts to displace the teaching of evolution in schools (and host of Project Steve, of which I’m Steve #859). Steven presented an analysis of how creationists reacted to the story of the CRU emails over the last few weeks. He’s been monitoring the discussion boards at the Discovery Institute, a group who support the idea of intelligent design, avoid making overt biblical references, and prefer to portray themselves as a scientific group. First, since the story broke, the number of posts on their blog has doubled, with some very strong language: “toxic leftist-atheist ideology”, “a cabal”, “reaction of Darwinist and of global warming scientists to even the most mild skepticism is remarkably vicious”, “there will be an accounting”, “massive defunding of organized science”. This is interesting, because it is exactly what they say about evolution!

Steven argues that it’s clearly denialism. These people don’t think that science as a methodology works (in part, because it refuses to acknowledge the role of a supernatural agency). They reject the methods of science (e.g. radiometric dating for evolution; computer modeling for global warming). They reject the data (e.g. observable evolution; observable climate data). They have very fixed mindsets (e.g. “no evidence will prove evolution”; “no evidence will prove global warming”). And they believe in a conspiracy (e.g. the “establishment” hiding the truth; stifling dissent).

This analysis leads to questions about how schools treat climate change, given how contentious the teaching of evolution has become, and to a concern that climate science education will become a battleground in the way that evolution did. Only 30 states have standards for teaching global warming in schools, and these are mostly focussed on impacts, not causes. Out of the 30 different statewide standards, only 7 mention fossil fuels, 8 mention mitigation, 17 mention mechanisms, and none mention sea level rise (and I’ll bet none mention ocean acidification either).

Some of these states have clauses that explicitly link topics such as evolution and global warming. For example, the Louisiana Science Education Act (LSEA) allows teachers to bring into the curriculum materials not generally approved, including “on topics of evolution, global warming, human cloning, …”. In Texas, the law mandates that textbooks must promote a free market approach (which would make discussion of cap-and-trade difficult), and textbooks must “Analyze and evaluate different views on the existence of global warming”. The wording here is crucial – it’s about views on the existence of global warming, not the details of the science.

Next up, Dennis Bartels, the executive director of the Exploratorium in San Francisco, talked about the role of science museums in improving climate literacy. He pointed out that it’s hard to explain climate change to kids and adults – the topic is so complex. One problem is math literacy. Calculus is a capstone course in most high schools, but you need some basic calculus to understand the key relationship between emissions and concentrations of greenhouse gases, so these concepts can’t be taught well earlier in the high school curriculum. Also, kids aren’t ever exposed to thinking about systems. Adults are still invested in the myth that science is about truth, rather than understanding that science is a process. Hence, the public don’t get it, and are especially put off by the typical back and forth you get with the “received wisdom” approach (e.g. “Coffee is good for you”, then “coffee is bad for you”, and eventually “those damn scientists don’t know what they’re doing”).
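
To see why a little calculus matters here, think of the bathtub analogy: the concentration is (roughly) the accumulated sum of emissions, so cutting the flow doesn’t immediately cut the level. Here’s a minimal stock-and-flow sketch with invented numbers (not a carbon-cycle model):

```python
# Toy stock-and-flow ("bathtub") illustration of emissions vs. concentrations.
# All numbers are invented for illustration; this ignores ocean and land uptake
# dynamics and is not a carbon-cycle model.
concentration = 390.0      # ppm, the stock
emissions = 4.0            # ppm-equivalent emitted per year, the flow (assumed)
airborne_fraction = 0.5    # fraction of emissions that stays in the air (assumed)

for year in range(2010, 2061):
    if year >= 2030:
        emissions *= 0.97  # suppose emissions start falling 3% a year in 2030
    concentration += airborne_fraction * emissions
    if year % 10 == 0:
        print(year, f"emissions={emissions:.2f}", f"concentration={concentration:.0f} ppm")

# Even though emissions (the flow) start falling in 2030, the concentration
# (the stock) keeps climbing as long as emissions stay above zero -- the
# integral relationship that basic calculus makes obvious.
```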

Science centres that have tackled climate change have tended to fall into a proselytization mode, rather than teaching science as a process. The Exploratorium is exploring ways of overcoming this. For example, they’ve run workshops to bring the scientists who do polar expeditions together for a week, getting them to show and tell how they do their research, and then repackaging this for science centres across the country. The aim is to tell the story of how the science is done. Most of the best work in science centres isn’t the exhibits, it’s mediated sessions and live presentations. This contributes to getting people to think for themselves (rather than telling them what to think).

Another approach is to get people to think through how they know what they know. For example, ask people how they know that the earth is round. Did you directly observe it? Run an experiment to show it? Learn it in school? Get it from some authority? Then compare answers with others, and come to realise that most of what you know about the world you get from others (not through direct experience/experimentation). The role of trusted sources of scientific expertise is taken for granted for some areas of science, but not for others. You can then challenge people to think about why this is.

In the question session, somebody asked how you reach out to people who don’t come to museums (e.g. the dismissive and disengaged). Dennis’ answer pointed to the European cafe scientifique idea – after the soccer game, do an hour of science in a cafe.

Tom Bowman, a graphic designer, talked about how we translate scientific conclusions about risk for the public. His approach is to identify key misconceptions, and then re-adapt key graphic figures (e.g. from the IPCC) to address them. Then, test these with different audiences (he gave a long list of business conferences, public lectures, and informal science institutions where he has done this).

Key misunderstandings seem to be (especially from Connie’s talk) about the immediacy of impacts (both in time and scale), the scale of the mitigation required, whether viable solutions exist, and whether we make effective choices. Hence, Tom suggested starting not by explaining the greenhouse effect, but by focussing on the point at which people make decisions about risk. For example, Limit + Odds = Emissions Trajectory (i.e. figure out what temperature change limit you’re comfortable with, and what odds of meeting it you’d like, and design emissions trajectories accordingly). Then from there work out options and tradeoffs.
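
Here’s a rough back-of-envelope version of that “limit + odds” reasoning, with placeholder numbers (these are not Tom’s figures): pick the cumulative carbon budget associated with your chosen limit and odds, subtract what has already been emitted, and see how long the rest lasts under different trajectories.

```python
# Back-of-envelope "limit + odds -> emissions trajectory" arithmetic.
# All numbers are illustrative placeholders, not Tom Bowman's figures.
budget_total = 1000.0     # GtC cumulative budget for the chosen limit and odds (assumed)
already_emitted = 550.0   # GtC emitted so far (assumed)
current_rate = 10.0       # GtC emitted per year at present (assumed)

remaining = budget_total - already_emitted
print(f"Remaining budget: {remaining:.0f} GtC")
print(f"Gone in ~{remaining / current_rate:.0f} years at a constant rate")

# If emissions instead ramp down linearly to zero, the cumulative total is the
# area of a triangle (0.5 * rate * duration), so the same budget lasts twice as
# long -- which is the sense in which the limit and odds pin down a trajectory.
print(f"A linear ramp-down to zero could take ~{2 * remaining / current_rate:.0f} years")
```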

Take, for example, the table in the IPCC report on emissions scenarios – to a graphic designer, this is a disaster – if the footnotes are as big as the figure, you’ve failed. So what are the alternatives? Tom’s designed a graphic with a thermometer (with the earth as the bulb), on which he plots different scenarios (I can’t find the graphic online, but this is similar). Points on the thermometer can then be used to show impacts at that level of global warming. This graphic has worked so well that everyone he has presented it to wants a copy (er, add me to the list, Tom).

However, sometimes things don’t work out. He also tried to map peak year emissions onto the thermometer. This doesn’t work so well, so instead he’s using the trillionth tonne analysis, and then the “ski slope” figure from the Copenhagen diagnosis:

Emissions pathways to give 67% chance of limiting global warming to 2ºC

Tom’s still working on the graphics for this, and hopes to have a full set by February; he’s looking for ways to test them out.

Finally, talking about making climate science accessible, Diane Fisher presented NASA’s Climate Kids website, which is hosted by NASA’s Webby Award-winning Eyes on the Earth site. A key challenge is to get kids thinking about this subject right at the beginning of the educational process. In designing the site, they had some debate on whether to do the “most scientists agree” approach. In the end, they decided to take the “this is the way it is” approach, which certainly reduces the potential to confuse. But there’s a difficult balance between telling kids how things are, and scaring them to death. A key principle for ClimateKids was to make sure to give them positive things to do, even if it’s just simple stuff like reusable water bottles and riding your bike. More interestingly, the site also covers some ideas about career choices, for longer term action. I like this: get them to grow up to be scientists.

There has been a strong thread at the AGU meeting this week on how to do a better job of communicating the science. Picking up on the workshop I attended on Sunday, and another workshop and town hall meeting that I missed, there were two conference sessions: the first, on Tuesday, on Providing Climate Policy Makers With a Strong Scientific Base, and the second, on Wednesday, on Education and Communication for Climate Literacy and Energy Awareness.

I’ll get to the speakers at these sessions in a moment. But first, I need to point out the irony. While the crazier sections of the blogosphere are convinced that scientists are involved in some massive conspiracy with policymakers hanging on their every word, the picture from within the scientific community is very different. Speaker after speaker expressed serious concerns that their work is being ignored, and that policymakers just don’t get it. Yet at the same time, they seemed to have very little clue about the nature of this communication problem, nor how to fix it. In fact, a number of speakers came across as charmingly naive, urging the audience to put a bit more effort into explanation and outreach. The notion that these completely guileless scientists could be conspiring to hide the truth is truly ludicrous.

So why so many speakers and sessions focussing on science communication? Why the sense of failure? Well, on the one hand there was the steady stream of news about a lack of progress in Copenhagen, building on a year of little progress in the US Congress. On the other hand is the widespread sense in the AGU community that the IPCC AR4 is out of date, with the new science of the last 4-5 years dramatically removing some of the remaining uncertainties (see Alley’s talk for one example), and, in nearly every case, confirming that the AR4 projections understate the magnitude of the expected warming and its impacts. Perhaps the best summary of this is the Copenhagen Diagnosis, which documents how…

…several important aspects of climate change are already occurring at the high end, or even beyond, the expectations of just a few years ago. […] global ice sheets are melting at an increased rate; Arctic sea ice is thinning and melting much faster than recently projected, and future sea-level rise is now expected to be much higher than previously forecast.

As an aside, the European EGU meeting in April lacked this strong sense of communication failure. Could this be a mark of the peculiarly American (oh, and Canadian) culture of anti-science? Of a lack of science literacy? Or perhaps a sense that the US and Canada are lagging far behind on developing science-based policy?

Anyway, on with the sessions. For the session on “Providing Climate Policy Makers With a Strong Scientific Base”, the first speaker was David Carlson, director of the International Polar Year (IPY) programme office. He argued that scientists are definitely not having the influence on policy that they should, and that the weakness of any likely agreement in Copenhagen will prove this. He described a number of examples of science communication in policymaking, analyzing whether the transmitters (scientists) and/or the receivers (policymakers) did a good job.

The first example was on ocean fertilization. The SOLAS study demonstrated that ocean fertilization will be ineffective. This message was received by the IMO convention on marine pollution, which issued a policy statement noting with concern the potential for negative impacts from ocean fertilization. There was one small problem – the policy was widely interpreted to mean all ocean fertilization is bad, which then prevented small scale experiments as part of IPY. So, the transmitter was good, the receiver even better, but this led to unintended consequences.

Another example was a US senate hearing that David appeared at earlier this year, for the Kerry-Boxer bill. The hearing focused on the arctic, and included three senators, several senior scientists, lobbyists, a consultant, and a representative from an NGO. (Note: David wasn’t complimentary about the roles of the non-scientists: the lobbyists were there to promote appropriations, the consultant was there to promote his previous work, and the NGO rep claimed “the dog ate my homework”.) As the chair didn’t hear a clear message from the participants, the hearing was held open, and they were invited to submit a joint report in 48 hours. David circulated a draft summary to the others within 8 hours, but got no response. In the end, it appears each participant submitted their own text. The Kerry-Boxer bill was introduced in the senate a few weeks later, containing no language at all about the impacts on the arctic. His conclusion is that a failure to work together represented a substantial lost opportunity. And this theme (of acting as individuals, and failing to coordinate) cropped up again and again in the talks.

The next speaker was Ed Struzik, science journalist and author. Ed characterized himself as the entertainer for the session, and laid on a beautiful series of photographs to illustrate his point that climate change is already having a dramatic impact on the arctic, but it’s a very complicated story. For example, even just defining what constitutes the arctic is difficult (North of the arctic circle? The area where the warmest summer month averages 9ºC?). It’s a very varied geography (glaciers, tundra, forest, smoking hills, hot springs, river deltas, lakes, etc). It’s a sacred place “where god began” for native peoples, with lots of fresh clean water, big rivers, and arctic wildlife. And while many of the impacts of climate change are frighteningly negative, not all are: for example, the barren ground grizzlies will do well – warming will mean they don’t need to hibernate as long. And more ominously, a warming arctic creates opportunities to exploit new fossil fuel reserves, so there are serious economic interests at stake.

James Mueller from the office of US senator Maria Cantwell gave an insider’s view of how the US senate works (and a little plug for Cantwell’s CLEAR Act). Unlike most speakers in the session, he argued that scientists have done a good job with science communication, and even claimed that behind closed doors, most senators would agree climate change is a big problem; the challenge is reaching agreement on what to do about it. He suggested scientists need a more realistic view of the policy process – it takes years to get major policy changes through (cf universal healthcare), and he takes it as a good sign that the House vote passed. He also argued that less than twenty years from the first IPCC report, the science is well-developed and accepted by the majority of non-scientists. In contrast, energy economists have much wider disagreements, and are envious of the consensus among climate scientists.

James then described some difficulties: the drive towards specialization in the science community has led to isolation between climate scientists and policy experts. There’s a lack of math and science literacy among politicians’ staff. And, most importantly, the use of experts with “privileged information” runs counter to democracy: it’s difficult to maintain accountability in technocratic institutions, but also difficult to get away from the need for specialist expertise.

However, I couldn’t help feeling that he missed the point, and was just giving us political spin to excuse inaction. If the vast majority of senators truly understood the science, there would have been no problem getting a strong bill through the senate this year. The problem is that they have a vague sense there is a problem, but no deep understanding of what the science really tells us.

Amanda Staudt from the National Wildlife Federation dissected the way in which IPCC recommendations got translated into the details of the Waxman-Markey bill and the Kerry-Boxer bill (drawing on the analysis of the World Resources Institute):

WRI analysis of the various bills of the 111th Congress

The process looks like this:

(1) Politicians converged on 2°C temperature rise as an upper limit considered “safe”. The IPCC had a huge influence on this, but (as I noted above) recent advances in the science were not included.

(2) Then the policy community needed to figure out what level of greenhouse gas emissions would achieve this. They looked at table 5.1 in the IPCC Synthesis report:

IPCC AR4 Synthesis report table 5.1

…and concentrated on the first row. The fact that they focussed on the first row is a remarkable achievement of the science, but note that these were never intended to be interpreted as policy options; the table was just a summary of six different scenarios that had been considered in the literature. What was really needed (and wasn’t available when the IPCC report was put together) was a more detailed analysis of even more aggressive emissions scenarios; for example, the study that Meinshausen did earlier this year that gave a 50:50 chance of staying within 2°C for 450ppm stabilization.

(3) Then they have to determine country-by-country emissions reductions, which is what the “Bali box” summarizes.

(4) Then, what will the US do? Much of this depends on the US Climate Action Partnership (USCAP) lobbying group. The USCAP blueprint recommended an 80% reduction by 2050 (but compromised on a 1.3% reduction by 2020 – i.e. a slower start to the reductions).

The key point is that there was little role for the scientists in the later steps. Congress gets exposed to the science through a number of limited mechanisms: hearings; 2-page summaries; briefings sponsored by interested parties (the CEI runs a lot of these!); lobby visits (there were a record number of these this year, with energy companies vastly outspending green groups); fact sheets; and local news stories in politicians’ own districts.

Amanda’s key point was that scientists need to be more involved throughout the process, and they can do this by talking more to their congresspersons, working with NGOs who are already doing the communications, and getting to work on analyzing the emissions targets, ready for the next stage of refinements to them. (Which is all well and good, but I don’t believe for one minute that volunteerism by individual scientists will make the slightest bit of difference here.)

Chris Elfring of the National Academies’ Board on Atmospheric Sciences and Climate (BASC) spoke next, and gave a summary of the role of the National Academy of Sciences, including a history of the Academies, and the use of the NRC as its operational arm. The most interesting part of the talk was the report America’s Climate Choices, which is being put together now, with four panel reports due in early 2010, and a final report due in summer 2010.

All of the above speakers gave interesting insights into how science gets translated into policy. But none really convinced me that they had constructive suggestions for improving things. In fact the highlight of the session for me was a talk that didn’t seem to fit at all! Jesse Anttila-Hughes, a PhD student at Columbia, presented a study he’s doing on whether people act on what they believe about climate change. To assess this, he studied how specific media stories affect energy company stock prices.

He looked at three specific event types: announcements of a record temperature in the previous year according to NASA’s GISS data (6 such events in the last 15 years); announcements of sea-ice minima in the Arctic (3 such events); and announcements of ice shelf collapses in the Antarctic (6 such events). He then studied how these events affect the stock prices of S&P500 energy companies, which together have about a $1.2 trillion market capitalization, with 6.5 million trades per day – i.e. pretty valuable and busy stocks. The methodology, a version of CAPM, is a bit complicated, because you have to pick a suitable window (e.g. 10 days) to measure the effect, starting a few days before the story hits the media to allow for early leaks. And you have to compare how the company is doing compared to markets in general each day.
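
For readers unfamiliar with this kind of event study, here’s a minimal sketch of the general idea, using a simple market model on invented data (I don’t know the details of Jesse’s actual implementation): estimate how a stock normally co-moves with the market, then measure how far its returns deviate from that expectation in a window around each announcement.

```python
import numpy as np

# Minimal market-model event study on invented data. This is only the general
# shape of such an analysis, not Jesse Anttila-Hughes's actual methodology.
rng = np.random.default_rng(1)
n_days = 250
market = rng.normal(0.0, 0.01, n_days)                          # daily market returns
stock = 0.0002 + 1.2 * market + rng.normal(0.0, 0.008, n_days)  # a fictitious energy stock
event_day = 200
stock[event_day:event_day + 10] -= 0.003                        # inject a fake post-announcement dip

# 1. Estimate the stock's "normal" relationship to the market on pre-event days.
est = slice(0, 180)
beta, alpha = np.polyfit(market[est], stock[est], 1)

# 2. Abnormal return = actual return minus what the market model predicts.
abnormal = stock - (alpha + beta * market)

# 3. Cumulative abnormal return over the event window (a few days before the
#    announcement, to allow for leaks, through 10 days after).
window = slice(event_day - 3, event_day + 10)
print(f"alpha={alpha:.5f}, beta={beta:.2f}, cumulative abnormal return={abnormal[window].sum():.3f}")
```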

Anyway, the results are fascinating. Each announcement of a record hot year causes energy stocks across the board to drop by 3% (which, given the size of their market capitalization, is a huge loss of value). However, when ice shelves collapse you get the opposite response – a 2.5% rise in energy stocks. And there is no significant reaction to reports about sea ice minima (despite a number of scientists regarding this as a crucial early warning signal).

The interpretation seems pretty straightforward. Record temperatures are the purest signal of climate change, and hence cause investors to pay attention. Clearly, investors are taking the risk seriously, anticipating significant regulation of fossil-fuel energy sources, and have been doing so for a long time (at least back into the mid-90s). On the other hand, collapsing ice shelves are seen as an opportunity, in that they might open up more areas for oil and gas exploration, and it seems that investors expect this to outweigh (or precede) any climate change legislation. As for sea ice minima, investors probably don’t understand them, or they end up as a mixed signal, with exploration opportunities and regulation threats balancing out.

More detailed follow-up studies of other such news stories would be fascinating. And that’s got me really thinking about how to effect behaviour changes…

Given the terrible internet connection and complete lack of power outlets in most meeting rooms, I’ve been reduced to taking paper and pencil notes, so I’m way behind in the blogging. So I’ll console myself with a quick tour of everyone else’s blogs about this AGU meeting:

First, and foremost, the AGU press office is putting together its own blog, coordinated by Mohi Kumar. It does a much better job of capturing the highlights than I do, partly because the staff writers can sample across all the sessions, and partly because they’re good at summarizing (rather than my own overly-detailed blow-by-blow accounts of talks).

There’s lots of talk here about impacts, especially on water. Water, Water Everywhere–Except There, and There, and There is a nice summary of some of the hydrology sessions, with the new results from GRACE being an obvious highlight. And the session on Climate Change in the West covered some pretty serious impacts on California.

But the biggest buzz so far at the meeting seems to be science literacy and how to deal with the rising tide of anti-science. Capitol Hill Needs Earth Scientists reports on a workshop on communicating with Congress, and Talking about Climate: A Monday Night Town Hall Meeting tackled how to talk to the press. Both of these pick up on the idea that the science just isn’t getting through to the people who need to know it, and we have to fix this urgently. I missed both of these, but managed to attend the presentations this morning on science literacy: Can Scientists Convince the Growing Number of Global Warming Skeptics?, which included a beautifully clear and concise summary of the Six Americas study. I’ll post my own detailed notes from this session soon. The shadow of the recently leaked CRU emails has come up a lot this week, in every case as an example of how little the broader world understands about how science is done. Oh, if only more people could come to the AGU meeting and hang out with climate scientists. Anyway, the events of the last few weeks lent a slight sense of desperation to all these sessions on communicating science.

And here’s something that could do with getting across to policymakers – a new study by Davis and Caldeira on consumption-based accounting – how much of the carbon emissions of the developing world are really just outsourced emissions from the developed world, as we look for cheaper ways to feed our consumption habits.

But there are good examples of how science communication should be done. For instance, The Emiliani Lecture and the Tough Task of Paleoceanographers is a great example of explaining the scientific process, in this case an attempt to unravel a mystery about changes in ocean currents around the Indonesian straits, due to varying El Nino cycles. The point, of course, is that scientists are refreshingly open about it when they discover their initial ideas are wrong.

And last, but not least, everyone agrees that Richard Alley’s lecture on CO2 in the Earth’s History was the highlight of the meeting so far, even if it was scheduled to clash with the US Climate Change Research Program report on Impacts.

Yesterday afternoon, I managed to catch the Bjerknes Lecture, which was given by Richard B. Alley: “The biggest Control Knob: Carbon Dioxide in Earth’s Climate History”. The room was absolutely packed – standing room only, I estimated at least 2,000 people in the audience. And it was easy to see why – Richard is a brilliant speaker, and he was addressing a crucial topic – an account of all the lines of evidence we have of the role of CO2 in climate changes throughout prehistory.

[Update: the AGU has posted the video of the talk]

By way of introduction, he pointed out how many brains are in the room, and how much good we’re all doing. He characterizes himself as not being an atmospheric scientist, except perhaps by default, but as he looks more and more at paleo-geology, it becomes clear to him how important CO2 is. He has found that CO2 makes a great organising principle for his class on the geology of climate change at Penn State, because CO2 keeps cropping up everywhere. So, he’s going to take us through the history to demonstrate this. His central argument is that we have plenty of evidence now (some of it very new) that CO2 dominates all other factors, hence “the biggest control knob” (later in the talk he extended the metaphor by referring to other forcings as fine tuning knobs).

He also pointed out that, from looking at the blogosphere, it’s clear we live in interesting times, with plenty of people WILLING TO SHOUT and distort the science. For example, he’s sure some people will distort his talk title because, sure, there are other things that matter in climate change than CO2. As an amusing aside, he showed us a chunk of text from an email sent by an alumnus to the administrators at his university, on which he was copied, the gist of which is that Alley’s own research proves CO2 is not the cause of climate change, and hence he is misrepresenting the science and should be dealt with severely for crimes against the citizens of the world. To the amusement of the audience, he points out a fundamental error of logic in the first sentence of the email, to illustrate the level of ignorance about CO2 that we’re faced with. Think about this: an audience of 2,000 scientists, all of whom share his frustration at such ignorant rants.

So the history: 4.6 billion years ago, the sun’s output was lower (approx 70% of today’s levels), often referred to as the faint young sun. But we know there was liquid water on earth back then, and the only thing that could explain that is a stronger greenhouse effect. Nothing else works – orbital differences, for example, weren’t big enough. The best explanation for the process so far is the Rock-Weathering Thermostat. CO2 builds up in the atmosphere over time from volcanic activity. As this CO2 warms the planet through the greenhouse effect, the warmer climate increases the chemical weathering of rock, which in turn removes carbon dioxide through the formation of calcium carbonate, which gets washed into the sea and eventually laid down as sediment. Turn up the temperature, and the sequestration of CO2 in the rocks goes faster. If the earth cools down, this process slows, allowing CO2 to build up again in the atmosphere. This process is probably what has kept the planet in the right range for liquid water and life for most of the last 4 billion years.
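
To make the feedback concrete, here’s a toy numerical sketch of the rock-weathering thermostat. The units and rate constants are invented and wildly unrealistic; the only point is the sign of the feedback (warmer means faster weathering, which draws CO2 down) and the fact that it takes a very long time to act:

```python
# Toy rock-weathering thermostat. Units and rate constants are invented;
# the point is only the negative feedback and its slowness.
co2 = 4000.0             # "ppm", deliberately started far too high
volcanic_input = 0.01    # CO2 added per step by volcanoes (constant)

for step in range(600_001):
    temperature = 5.0 * (co2 / 280.0)        # crude stand-in for greenhouse warming
    weathering = 1e-6 * temperature * co2    # weathering removes CO2 faster when warm
    co2 += volcanic_input - weathering
    if step % 100_000 == 0:
        print(step, f"CO2 ~ {co2:.0f}")

# Wherever it starts, CO2 drifts toward the level where volcanic input balances
# weathering -- but only over an enormous number of steps, which is why this
# thermostat can't respond to fast perturbations.
```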

But can we demonstrate it? The rock thermostat takes millions of years to work, because the principal mechanism is geological. One consequence is that the only way to get to a “snowball earth” (times in the Cryogenian period when the earth was covered in ice even down to the tropics) is that some other cause of change has to happen fast – faster than the rock-thermostat effect.

An obvious piece of evidence is in the rock layers. Glacial layers are always covered from above by carbonate rocks, showing that increased carbonation (as the earth warmed) follows periods of icing. This shows part of the mechanism. But to explore the process properly, we need good CO2 paleo-barometers. The gold standard is the ice core record. So far the oldest ice core record is 800,000 years, although we only have one record this old. Several records go back 450,000 years, and there are many more shorter records. The younger samples all overlap, giving some confidence that they are correct. We also now know a lot about how to sort out ‘good’ ice core records from poor (contaminated) ones.

But to back up the evidence from the ice cores, there are other techniques with independent assumptions (but none as easy to analyze as ice cores). When they all agree, this gives us more confidence in the reconstructions. One example: growth of different plant species – higher CO2 gives preference to certain species. Similarly, different ways in which carbonate shells in the ocean grow, depending on the pH of the ocean (which in turn is driven by atmospheric concentrations of CO2). Also the fossil-leaf stomata record. Stomata are the pores in leaves that allow them to breathe. Plants grow leaves with more pores when there is low CO2, to allow them to breathe better, and fewer when there is more CO2, to minimize moisture loss.

So, we have a whole bunch of different paths, none of which are perfect, but together work pretty well. Now what about those other controllers, beyond the rock-thermostat effect? CO2 is raised by:

  • the amount of CO2 coming out of volcanoes
  • slower weathering of rock
  • less plant activity
  • less fossil burial.

He showed the graph reconstructing what we know of CO2 levels over the last 400 million years. Ice coverage is shown on the chart as blue bars, showing how far down towards the equator the ice reaches, and this correlates with low CO2 levels from all the different sources of evidence. 251 million years ago, pretty much every animal dies – 95% of marine species wiped out, in the end-permian extinction. Probable cause: rapid widespread growth of marine green sulfur bacteria that use H2S for photosynthesis. The hydrogen sulphide produced as a result kills most other life off. And it coincides with a very warm period. The process was probably kicked off by greater vulcanism (the siberian traps) spewing CO2 into the atmosphere. When the ocean is very warm, it’s easy to starve it of oxygen; when it’s cold it’s well oxygenated. This starvation of oxygen killed off most ocean life.

Fast forward to the mid-cretaceous “saurian sauna”, when there was no ice at sea level at the poles. Again, CO2 is really high. High CO2 explains the warmth (although in this case, the models tend to make it a little too warm at these CO2 levels). Then there was one more blip before the ice ages. (Aside: CO2 is responsible for lots of things, but at least it didn’t kill the dinosaurs, a meteorite did). The paleocene-eocene thermal maximum meant big temperature changes. It was already hot, and the world got even hotter. Most sea-floor life died out. The ocean turned acidic. This time, the models have difficulty simulating this much warming. And it happened very fast, although the recovery process matches our carbon cycle models very well. And it shows up everywhere: e.g. leaf damage in fossil leaves at the PETM.

But for many years there was still a mystery. Temperature and CO2 levels are highly correlated throughout the earth’s history, and there is no other way to explain the climate changes. But occasionally there were places where temperature changes did not match CO2 changes. Over the last couple of decades, as we have refined our knowledge of the CO2 record, these divergences have gone – the mismatches have mostly disappeared.

Even just two years ago, Alley would have said something was still wrong in the Miocene, but today it looks better. Two years ago, we got new records that improve the match. Two weeks ago, Tripati et al. published a new dataset that agrees even better. So, while two years ago the Miocene anomalies looked important, now it’s not so clear – it looks like CO2 and temperature do track each other.

But what do we say to people who claim the lag (CO2 rises tend to lag behind temperature rises) proves current warming isn’t caused by CO2? We know that orbital changes (the Milankovitch cycles) kick off the ice ages – this was predicted 50 years before we had the data (in the 1970s) to back it up. But temperature never goes far without the CO2, and vice versa, although sometimes one lags the other by about two centuries. And a big problem with the Milankovitch cycles is that they only explain a small part of the temperature changes. The rest is when the CO2 changes kick in. Alley offered the following analogy: credit card interest lags debt. By the denialist logic, because interest lags debt, I never have to worry about interest and the credit card company can never get me. However, a simple numerical model demonstrates that interest can be a bigger cause of overall debt in the long run (even though it lags!!). So, it’s basic physics: the orbits initially kick off the warming, but the release of CO2 then kicks in and drives it.
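
Alley didn’t show any code, but a minimal sketch of the kind of numerical model he meant might look like this (the balance, interest rate and timespan are invented purely for illustration):

    # Sketch of the credit-card analogy (invented numbers): the initial charge
    # "kicks off" the debt, the interest always lags behind it, and yet after a
    # few years the accumulated interest exceeds the original charge.

    def run_card(initial_charge=1000.0, annual_rate=0.20, years=10):
        balance = initial_charge
        total_interest = 0.0
        for year in range(1, years + 1):
            interest = balance * annual_rate   # computed on last year's balance: it lags
            balance += interest
            total_interest += interest
            print(f"year {year:2d}: balance ${balance:8.2f}, "
                  f"interest so far ${total_interest:8.2f} vs initial charge ${initial_charge:.2f}")

    run_card()

By year 4 the accumulated interest already exceeds the initial charge, even though it lagged behind it the whole time – which is exactly the point of the analogy: CO2 lagging the initial orbital warming doesn’t stop it being the bigger driver overall.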

So, CO2 explains almost all the historical temperature change. What’s left? Solar irradiance changes and volcanic changes. When these things change, we do see the change in the temperature record. For solar changes, there clearly aren’t many, and they act like a fine-tuning knob, rather than a major control. 40,000 years ago the earth’s magnetic field almost stopped (it weakened to about 10% of its current level), letting in huge amounts of cosmic rays, but the climate ignored it. Hence, we know cosmic rays are at best a fine-tuning knob. Volcanic activity is important, but essentially random (“if volcanoes could get organised, they could rule the world” – luckily they aren’t organised). Occasionally several volcanoes erupting together make a bigger change, but again this is a rare event. Space dust hasn’t changed much over time and there isn’t much of it (Alley’s deadpan delivery of this line raised a chuckle from the audience).

So, what about climate sensitivity (i.e. the amount of temperature change for each doubling of CO2)? Sensitivity from the models matches the record well (approx 3°C per doubling of CO2). Recently, Royer et al. conducted an interesting experiment, calculating equilibrium climate sensitivity from models and then comparing it with the proxy records, to demonstrate that climate sensitivity has been consistent over the last 420 million years. Hence paleoclimate says that the more extreme claims about sensitivity (especially those claiming very low values) must be wrong.
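
As a back-of-the-envelope reminder (my own note, not from Alley’s slides): the standard rule of thumb is that the warming scales with the logarithm of the CO2 concentration, so a sensitivity of about 3°C per doubling implies:

    import math

    def equilibrium_warming(co2_ppm, baseline_ppm=280.0, per_doubling=3.0):
        """Warming (deg C) relative to a pre-industrial baseline, assuming the
        usual log scaling and roughly 3 deg C per doubling of CO2."""
        return per_doubling * math.log2(co2_ppm / baseline_ppm)

    for ppm in (280, 400, 560, 1120):
        print(f"{ppm:5d} ppm -> {equilibrium_warming(ppm):+5.1f} deg C")

Both the 280 ppm baseline and the 3°C figure are the conventional round numbers; the point is just that each successive doubling adds roughly the same amount of warming.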

In contrast, if CO2 doesn’t warm the planet, then we have to explain why the physicists are stupid, and we still have no other explanation for the observations. If there is a problem, it is that occasionally the world seems a little more sensitive to CO2 than the models say. There are lots of possible fine-tuning knobs that might explain these discrepancies – and lots of current research looking into them. Oh, and this is a global story, not a regional one; there are lots of local effects on regional climate.

Note that most of these recent discoveries haven’t percolated to the IPCC yet – much of this emerged in the last few years since the last IPCC report was produced. The current science says that CO2 is the most important driver of climate throughout the earth’s history.

Some questions from the audience followed:

  • Q: If we burn all the available fossil fuel reserves, where do we get to? A: If you burn it all at once, there is some chance of getting above the Cretaceous level, but lots of uncertainty, including over how much reserve there really is. In a “burn it all” future, it’s likely to get really hot: +6 to 7°C.
  • Q: We know that if we stop emitting, it will slow down global warming. But from geology, what do we know about removal? A: Anything that increases the weatherability of rocks. But it seems unlikely that we can make it go fast enough to make a difference, at an economic level. The key question is how much energy we would need to do it, e.g. to dig up shells from the ocean bed and allow them to carbonate. It’s almost certainly easier to keep it out of the air than to take it out of the air (big round of applause from the audience in response to this – clearly the 2,000+ scientists present know this is true, and appreciate it being stated so clearly).
  • Q: What about feedbacks? A: As we put up more CO2, the oceans take up about half of it. As the world gets warmer, the ability of the ocean to buffer that reduces. The biggest concern is probably changes in the arctic soils, and the methane on the seafloor. As you go from decades to centuries, the sensitivity to CO2 goes up a little, because of these amplifying feedbacks.

Well, that was it – a one-hour summary of how we know that CO2 is implicated as the biggest driver of climate change throughout the earth’s history. What I found fascinating about the talk was the way that Alley brought together multiple lines of evidence, and showed how our knowledge is built up from a variety of sources. Science really is fascinating when presented like this. BTW, I should mention that Alley is the author of The Two-Mile Time Machine, which I now have to go and read…

Finally, a disclaimer. I’m not an expert in any of this, by any stretch of the imagination. If I misunderstood any of Alley’s talk, please let me know in the comments.

Well, my intention to liveblog from interesting sessions is blown – the network connection in the meeting rooms is hopeless. One day, some conference will figure out how to provide reliable internet…

Yesterday afternoon I attended an interesting session on climate services. Much of the discussion built on work done at the third World Climate Conference (WCC-3) in August, which set out to develop a framework for the provision of climate services. These would play a role akin to local, regional and global weather forecasting services, but focussing on risk management and adaptation planning for the impacts of climate change. Most important is the emphasis on combining observation and monitoring services and research and modeling services (both of which already exist) with a new climate services information system (I assume this would be distributed across multiple agencies across the world) and a system of user interfaces to deliver the information in the forms needed by different audiences. Rasmus at RealClimate discusses some of the scientific challenges.

My concern on reading the outcomes of the WCC is that it’s all focussed on a one-way flow of information, with insufficient attention to understanding who the different users would be, and what they really need. I needn’t have worried – the AGU session demonstrated that there are plenty of people focussing on exactly this issue. I got the impression that there’s a massive international effort quietly putting in place the risk management and planning tools needed for us to deal with the impacts of a rapidly changing climate, but which is completely ignored by a media still obsessed with the “is it happening?” pseudo-debate. The extent of this planning for expected impacts would make a much more compelling media story, and one that matters, on a local scale, to everyone.

Some highlights from the session:

Mark Svoboda from the National Drought Mitigation Center at the University of Nebraska, talking about drought planning in the US. He pointed out that drought tends to get ignored compared to other kinds of natural disasters (tornados, floods, hurricanes), presumably because it doesn’t happen within a daily news cycle. However, drought dwarfs the damage costs in the US from all other kinds of natural disasters except hurricanes. One problem is that population growth has been highest in the regions most subject to drought, especially the southwest US. The NDMC monitoring program includes the only repository of drought impacts. Their US drought monitor has been very successful, but the next generation of tools needs better sources of data on droughts, so they are working on adding a drought reporter, doing science outreach, working with kids, etc. Even more important is improving the drought planning process, hence a series of workshops on drought management tools.

Tony Busalacchi from the Earth System Science Interdisciplinary Center at the University of Maryland. Through a series of workshops in the CIRUN project, they’ve identified the need for forecasting tools, especially around risks such as sea level rise. There is a particular need for actionable information, but no service currently provides this. A climate information system is needed for policymakers, on scales of seasons to decades, tailorable to particular regions, and with the ability to explore “what-if” questions. Building this requires coupling models that have not been used together before, and synthesizing new datasets.

Robert Webb from NOAA, in Boulder, on experimental climate information services to support risk management. The key to risk assessment is to understand that it spans multiple timescales. Users of such services do not distinguish between weather and climate – they need to know about extreme weather events, and they need to know how such risks change over time. Climate change matters because of the impacts. Presenting the basic science and predictions of temperature change is irrelevant to most people – it’s the impacts that matter (his key quote: “It’s the impacts, stupid!”). Examples: water – droughts and floods, changes in snowpack, river stream flow, fire outlooks, and planning issues (urban, agriculture, health). He’s been working with the Climate Change and Western Water Group (CCAWWG) to develop a strategy on water management. How do we get people to plan and adapt? The key is to get people to think in terms of scenarios rather than deterministic forecasts.

Guy Brasseur from the German Climate Services Center, in Hamburg. The German adaptation strategy was developed by the German federal government, which appears to be way ahead of the US agencies in developing climate services. Guy emphasized the need for seamless prediction – a uniform ensemble system that builds from climate monitoring of the recent past and present, and projects forward into the future, at different regional scales and timescales. Guy called for an Apollo-sized program to develop the infrastructure for this.

Kristen Averyt from the University of Colorado, talking about her “climate services machine” (I need to get hold of the image for this – it was very nice). She’s been running workshops on Colorado-specific services, with breakout sessions focussed on impacts and the utility of climate information. She presented some evaluations of the success of these workshops, including a climate literacy test they have developed. For example, at one workshop the attendees gave 63% correct answers at the beginning (and the wrong answers tended to cluster, indicating some important misperceptions). I need to get hold of this – it sounds like an interesting test. Kristen’s main point was that these workshops play an important role in reaching out to people of all ages, including kids, and getting them to understand how climate change will affect them.

Overall, the main message of this session was that while there have been lots of advances in our understanding of climate, these are still not being used for planning and decision-making.

First proper day of the AGU conference, and I managed to get to the (free!) breakfast for Canadian members, which was so well attended that the food ran out early. Do I read this as a great showing for Canadians at AGU, or just that we’re easily tempted with free food?

Anyway, on to the first of three poster sessions we’re involved in this week. This first poster was on TracSNAP, the tool that Ainsley and Sarah worked on over the summer:

Our TracSNAP poster for the AGU meeting.

The key idea in this project is that large teams of software developers find it hard to maintain an awareness of one another’s work, and cannot easily identify the appropriate experts for different sections of the software they are building. In our observations of how large climate models are built, we noticed it’s often hard to keep up to date with what changes other people are working on, and how those changes will affect things. TracSNAP builds on previous research that attempts to visualize the social network of a large software team (e.g. who talks to whom), and relate that to couplings between the code modules that team members are working on. Information about intra-team communication patterns (e.g. emails, chat sessions, bug reports, etc.) can be extracted automatically from project repositories, as can information about dependencies in the code. TracSNAP extracts this data automatically from the project repository to provide answers to questions such as “Who else recently worked on the module I am about to start editing?” and “Who else should I talk to before starting a task?”. The tool uncovers hidden connections in the software by examining modules that were checked into the repository together (even though they don’t necessarily refer to each other), and offers advice on how to approach key experts by identifying intermediaries in the social network. It’s still a very early prototype, but I think it has huge potential. Ainsley is continuing to work on evaluating it on some existing climate models, to check that we can pull out of the repositories the data we think we can.
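
To give a flavour of the kind of analysis involved (this is my own simplified sketch, not TracSNAP’s actual code), here is how you might mine those hidden couplings from a commit history, by counting how often pairs of files get checked in together:

    # Simplified sketch of co-change mining (not TracSNAP's actual implementation):
    # files that are repeatedly committed together get flagged as coupled, even if
    # neither one explicitly refers to the other.

    from collections import Counter
    from itertools import combinations

    def co_change_counts(commits):
        """commits: a list of (author, list_of_changed_files) tuples."""
        pair_counts = Counter()
        for _author, files in commits:
            for pair in combinations(sorted(set(files)), 2):
                pair_counts[pair] += 1
        return pair_counts

    # Hypothetical commit log entries, purely for illustration:
    commits = [
        ("ann", ["ocean/mixing.f90", "ocean/tracers.f90"]),
        ("bob", ["ice/thermo.f90", "coupler/flux.f90"]),
        ("ann", ["ocean/mixing.f90", "ocean/tracers.f90", "coupler/flux.f90"]),
    ]

    for (file_a, file_b), n in co_change_counts(commits).most_common(3):
        print(f"{file_a} <-> {file_b}: changed together {n} time(s)")

The real tool is a Trac plugin and combines this kind of coupling data with the team’s communication network, but the underlying idea really is that simple.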

The poster session we were in, “IN11D. Management and Dissemination of Earth and Space Science Models”, seemed a little disappointing as there were only three posters (a fourth poster presenter hadn’t made it to the meeting). But what we lacked in quantity, we made up for in quality. Next to my poster was David Bailey’s: “The CCSM4 Sea Ice Component and the Challenges and Rewards of Community Modeling”. I was intrigued by the second part of his title, so we got chatting about this. Supporting a broader community in climate modeling has a cost, and we talked about how university labs just cannot afford this overhead. However, it also comes with a number of benefits, particularly the existence of a group of people from different backgrounds who all take on some ownership of model development, and can come together to develop a consensus on how the model should evolve. With the CCSM, most of this happens in face-to-face meetings, particularly the twice-yearly user meetings. We also talked a little about the challenges of integrating the CICE sea ice model from Los Alamos with CCSM, especially given that CICE is also used in the Hadley model. Making it work in both models required some careful thinking about the interface, and hence more focus on modularity. David also mentioned that people are starting to use the term kernelization as a label for the process of taking physics routines and packaging them so that they can be interchanged more easily.

Dennis Shea’s poster, “Processing Community Model Output: An Approach to Community Accessibility”, was also interesting. To tackle the problem of making output from the CCSM more accessible to the broader CCSM community, the decision was taken to standardize on netCDF for the data, and to develop and support a standard data analysis toolset, based on the NCAR Command Language (NCL). NCAR runs regular workshops on the use of these data formats and tools, as part of its broader community support efforts (and of course, this illustrates David’s point about universities not being able to afford to provide such support efforts).
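
As an aside on why the standardization matters: once the model output is in a self-describing format like netCDF, any netCDF-aware toolchain can read it in the same way. The poster was about NCL, but here is what the equivalent looks like in Python – a sketch assuming the netCDF4 library, with a made-up file name and variable name:

    # Minimal illustration of reading standardized model output (my own sketch,
    # not from the poster). The file name and variable name are hypothetical.
    from netCDF4 import Dataset

    with Dataset("ccsm_output.nc") as nc:
        tas = nc.variables["tas"]            # e.g. surface air temperature
        print(tas.dimensions, getattr(tas, "units", "unknown units"))
        print("mean of first time slice:", tas[0, :, :].mean())

The file carries its own dimensions, units and metadata, which is exactly what makes a shared analysis toolset feasible.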

The missing poster also looked interesting: Charles Zender from UC Irvine, comparing climate modeling practices with open source software practices. Judging from his abstract, Charles makes many of the same observations we made in our CiSE paper, so I was looking forward to comparing notes with him. Next time, I guess.

Poster sessions at these meetings are both wonderful and frustrating. Wonderful because you can wander down aisles of posters and very quickly sample a large slice of research, and chat with the poster owners in a freeform format (which is usually much better than sitting through a talk). Frustrating because poster owners don’t stay near their posters very long (I certainly didn’t – too much to see and do!), which means you get to note an interesting piece of work, and then never manage to track down the author to chat (and if you’re like me, you also forget to write down contact details for the posters you noticed). However, I did manage to make notes on two to follow up on:

  • Joe Galewsky caught my attention with a provocative title: “Integrating atmospheric and surface process models: Why software engineering is like having weasels rip your flesh”
  • I briefly caught Brendan Billingsley of NSIDC as he was taking his poster down. It caught my eye because it was a reflection on software reuse in the Searchlight tool.

This week I’ll be blogging from the American Geophysical Union (AGU) Fall Meeting, one of the biggest scientific meetings of the year for the climate science community (although the European EGU meeting, held in the spring, rivals it for size). About 16,000 geoscientists are expected to attend. The scope of the meeting is huge, taking in everything from space and planetary science, vulcanology, and seismology, all the way through to science education and geoscience informatics. Climate change crops up a lot, most notably in the sessions on atmospheric sciences and global environmental change, but also in sessions on the cryosphere, ocean sciences, biogeosciences, paleoclimatology, and hydrology. There’ll be plenty of other people blogging the meeting, a twitter feed, and a series of press conferences. The meeting clashes with COP15 in Copenhagen, but scientists see Copenhagen as a purely political event, and would much rather be at a scientific meeting like the AGU. To try to bridge the two, the AGU has organised a 24/7 climate scientist hotline aimed at journalists and other participants in Copenhagen – more on this initiative a little later…

Today, the meeting kicked off with some pre-conference workshops, most notably, a workshop on “Re-starting the climate conversation“, aimed at exploring the sociological factors in media coverage and public understanding of climate science. I picked up lots of interesting thoughts around communication of climate science from the three speakers, but the discussion sessions were rather disappointing. Seems like everyone recognises a problem in the huge gulf between what climate scientists themselves say and do, versus how climate science is portrayed in the public discourse. But nobody (at least among the scientific community) has any constructive ideas for how to fix this problem.

The first speaker was Max Boykoff, from University of Colorado-Boulder, talking about “Mass Media and the Cultural Politics of Climate Change”. Boykoff summarized the trends in media coverage of climate change, particularly the growth of the web as a source of news. A recent Pew study shows that in 2008, the internet overtook newspapers as people’s preferred source of news, although television still dominates both. And even though climate change was one of the two controversies dominating the blogosphere in the last two weeks, it still only accounts for about 1.5% of news coverage in 2009.

Boykoff’s research focusses more on the content, and reveals a tendency in the media to conflate the many issues related to climate change into just one question: whether increasing CO2 warms the planet. In the scientific community, there is a strong convergence of agreement on this question, in contrast to, say, the diversity of opinion on whether the Kyoto protocol was a success. Yet the media coverage focusses almost exclusively on the former, and diverges wildly from scientific opinion. He showed a recent clip from NBC news, which frames the whole question in terms of a debate over whether it’s happening or not, with a meteorologist, geologist and sociologist discussing this question in a panel format. Boykoff’s studies showed that 53% of news stories (through 2002) diverge from the scientific consensus, and, even more dramatically, 70% of TV segments (through 2004) diverge from this consensus. In a more recent study, he showed that the ‘quality’ newspapers (in the US and UK) show no divergence, while the tabloid newspapers diverge significantly across the board. Worse still, much of this tabloid coverage is not just on the fence, but is explicitly denialist, with a common framing that presents the issue as a feud between strong personalities (e.g. Gore vs. Palin).

Some contextual factors help explain these trends, including a shrinking technical capacity (specialist training) in newspapers and TV; the tendency for extreme weather events to drive coverage, which leads to the obvious framing of “is it or isn’t it caused by climate change?”; and cultural values such as trust in scientists. Carvalho and Burgess present an interesting case study on the history of how the media and popular culture have shaped one another over issues such as climate change. The key point is that the idea of an information deficit is a myth – instead, the media coverage is better explained as a complex cycle of cultural pressures. One of the biggest challenges for the media is how to cover a long, unfolding story within short news cycles. Which leads to an ebb and flow such as the following: in May 2008 the Observer ran a story entitled “Surging fatal shark attacks blamed on global warming“, although the content of the article is much more nuanced: some experts attribute it to global warming, others to increased human activities in the water. But then a subsequent story in the Guardian in Feb 2009 was “Sharks go hungry as tourists stay home“. Implication: the global warming problem is going away!

The second speaker was Matthew C. Nisbet, from American University in Washington, DC, perhaps best known for his blog, Framing Science. Nisbet began with a depressing summary of the downward trend in American concern over climate change, and the Pew study from Jan 2009 that showed global warming ranked last among 20 issues people regarded as top priorities for Congress. The traditional assumption is that if the public is opposed, or does not accept the reality, the problem is one of ignorance, and scientific literacy is the antidote (hence the calls for more focus on formal science education and popular science outlets – “if we only had more Carl Sagans”). Along with this also goes the assumption that the science compels action in policy debates. However, Nisbet contends that when you oversimplify a policy debate as a matter of science, you create the incentive to distort the science, which is exactly what has happened over climate change.

This traditional view of a lack of science literacy has a number of problems. It doesn’t take into account the reality of audiences and how people make up their minds – essentially there is nothing unique about the public debate over climate change compared to other issues: people rely on information shortcuts – they tend to be “cognitive misers”. And it ignores the effects of media fragmentation: in 1985, most people (in the US) got their news from four main sources, the four main TV network news services. By contrast, in 2009, there is a huge array of sources of information, and because of our inability to take advantage of such a range, people have a tendency to rely on those that match their ideological preferences. On the internet, this problem of choice is greatly magnified.

Not surprisingly, Nisbet focussed mainly on the question of framing. People look for frames of reference that help them make sense of an issue with little cognitive effort. Frames organise the central ideas on an issue, and make it relevant to the audience, so that it can be communicated by shorthand, such as catchphrases (“climategate”), cartoons, images, etc. Nisbet has a generalized typology of ways in which science issues get framed, and points out that you cannot avoid these frames in public discourse. E.g. Bill O’Reilly on Fox starts every show with “talking points”, which are the framing references for the like-minded audience; many scientists who blog write with a clear liberal frame of reference, reflecting a tendency among scientists to identify themselves as more liberal than conservative.

The infamous Luntz memo contains some interesting framings: “the scientific debate remains open”, the “economic burden of environmental regulation”, and “the international fairness issue” – if countries like China and India aren’t playing along, the US shouldn’t make sacrifices. In response, many scientists and commentators (e.g. Gore) have tended to frame around the potential for catastrophe (e.g. “the climate crisis”; “Be worried”). This is all essentially a threat appeal. The problem is that if you give an audience a threat, but no information on how to counter it, they either become fatalistic or ignore it; it also opens the door to the counter-framing of calling people “alarmists”. This also plays into a wider narrative about the “liberal media” trying to take control.

Another framing is around the issue of public accountability and scientific evidence. For example, Mooney’s book “The Republican War on Science” itself became a framing device for liberals, which led to Obama’s “must restore science to its rightful place”. However, this framing can reinforce the signal that science is for Democrats, not for Republicans. Finally, “climategate” itself is a framing device that flips the public accountability frame into one of accountability of the scientists themselves – questioning their motivations. This has also been successfully coupled to the media bias frame, allowing the claim that the liberal media is not covering the alternative view.

So how do we overcome this framing? Nisbet concluded with examples of framings that reach out to different kinds of audiences. For example, EO Wilson’s book “The Creation” frames it as a religious/moral duty, written specifically as a letter to a Southern Baptist pastor. This framing helps to engage with evangelical audiences. Friedman frames it as a matter of economic growth – we need a price on carbon to stimulate innovation in a second American industrial revolution. Gore’s “We” campaign has been rebranded as “Repower America”. And the US Congress no longer refers to a “cap and trade bill”, but to the American Clean Energy and Security Act (ACES).

Nisbet thinks that a powerful new frame is to talk about climate change as a matter of public health. The strategy is to shift the perception away from remote regions (e.g. ice caps and polar bears) and focus instead on the impact in urban areas, especially on minorities, the elderly and children – framing it in terms of allergies, heat stress, etc. The lead author of the recent Lancet study, Anthony Costello, said that the public health issues are under-reported, and need attention because they affect billions of people.

The recent Maibach and Leiserowitz study identifies six distinct audience groups, and changes our ideas about where public perceptions stand, especially the idea that the public is not concerned about climate change:

[Figure 1: the six audience segments identified by the Maibach and Leiserowitz study]

As an experiment, Nisbet and colleagues set up a tent on the National Mall in Washington, interviewed people who said they were from outside of DC, and categorized each of them into one of the six audience segments. They then used this to identify a sample of each group to be invited to a focus group session, where they could test out the public health framing for climate change. A sentence-by-sentence analysis of their responses to a short essay on health issues and climate change proved very interesting, as there are specific sections, especially towards the end about policy options, where even the dismissives had positive responses. For example:

  • all six groups agreed that “good health is a great blessing”.
  • all six groups agreed with suggestions around making cities cleaner, easier to get around, etc.
  • 4 of the 6 groups found it helpful to learn about the health threats of climate change (which corresponds to a big majority of the American audience).
  • all 6 groups reacted negatively to suggestions that we should make changes in diet and food choices, such as increasing vegetable and fruit consumption and cutting down on meat. Hence, this is not a good topic to lead on!

In the question-and-answer session after Nisbet’s talk, there was an interesting debate about the lack of discussion in the media of how the science was obtained – the results get treated as if they came from a black box. Nisbet pointed out that this might not be helpful, and cited the fact that the climategate emails surprised many people with how human scientists are (with egos and political goals) and with the level of their uncertainty about specific data analysis questions, which has made the emails a very successful framing device for those promoting the message that the science is weak. However, several audience members contended that this just means we need to do a better job of getting people to think like scientists, and bring the broader society more into the scientific process. Others pointed out how hard it is to get journalists to write about the scientific method, so the idea of partnering with others (e.g. religious leaders, health professionals) makes sense if it helps to identify and motivate particular audiences. This still leaves open the question of how to communicate uncertainty. For example, in the public health framing, people will still want to know whether it affects, say, asthma or not. And as we’re still uncertain on this, it leads to the same problem as the shark attack story. So we still have to face the problem of how people understand (or not!) the scientific process.

The final speaker was Gwendolyn Blue, from the University of Calgary, Canada, talking about “Public Engagement with Climate Change”. Blue contends that major issues of science and technology require a participatory democracy that does not really exist yet (although it is starting to appear). A portion of the lay public is distrustful of scientific institutions, so we need more effective ways of engaging more diverse groups in the conversation.

Blue defines public engagement as “a diverse set of activities whereby non-experts become involved in agenda setting, decision-making, policy forming and knowledge production processes regarding science”. The aim is to overcome the traditional “one-way” transmission model of science communication, which tends to position the lay audience as passive, and hence obsesses over whether they are getting good or bad information. But public understandings of science are much more sophisticated than many people give credit for, particularly away from questions of basic facts, for example on the ethical and social implications of scientific findings.

There are clearly a number of degrees of citizen participation, for example Arnstein’s ‘ladder’. Blue is particularly interested in the upper rungs – i.e. not just ‘informing’ (which movements such as cafe scientifique and public lectures try to do) but engagements that aim to empower and transform (e.g. citizen science, activism, protests, boycotts, buycotts). But she doesn’t think citizen participation in science is a virtue in its own right, as it can be difficult, frustrating, and can fail: success is highly context dependent.

Examples include 350.org, which uses social networking media to bring people together – lots of people creating images of the number 350 and uploading their efforts (but how many participants actually understand what the number 350 represents?); tcktcktck, which collected 10 million signatures and delivered them to the COP15 negotiators; and the global day of action yesterday, which was one of the biggest demonstrations of public activism. However, this type of activism tends to be bound up with social identity politics.

Deliberative events aim to overcome this by bringing together people who don’t necessarily share the same background and assumptions. Blue described her experiences with one particular initiative, the World Wide Views on Global Warming events on September 26, 2009. The idea grew out of the Danish model of consensus politics. Randomly selected participants in each country were offered a free trip to a workshop (in Canada, it took place in Calgary), with some effort made to select approximately 100 people representing the demographic makeup of each country. The aim was to discuss the policy context for COP15. There were (deliberately) no “experts” in the room, to remove the inequality between experts and a lay audience. Instead, a background document was circulated in advance, along with a short video. Clear ground rules were set for good dialogue, with a trained facilitator for each table. They used cultural activities too: e.g. at the Canadian event, they had music and dance from across Canada, Inuit throat singers, and an opening prayer by a Blackfoot Elder.

The result was a fascinating attempt to build an engaged public conversation around climate change and the decision-making process we face. A number of interesting themes emerged. For example, despite a very diverse set of participants, a lot of common ground emerged, which surprised many participants, especially around the scale and urgency of the problem, and the overall goals of society. A lot of social learning took place – many participants knew very little about climate science at the outset. However, Blue did note that success for these events requires scientific literacy as well as civic literacy in the participants, along with reflexivity, humility and a willingness to learn. But it is part of a broader cultural shift towards understanding the potential and limitations of participatory democracy.

The results were reported in a policy report, but also, more interestingly, on a website that allows you to compare results from different countries. Much of the workshop time was spent on public deliberation, which can be very unruly, but at the end this was distilled down to an opinion poll with simple multiple choice questions to communicate the results.

The discussion towards the end of the workshop focussed on how much researchers should be involved in outreach activities. It is not obvious who should be doing this work. There is little incentive for scientists to do it, and there are lots of negatives – we get into trouble, our institutions don’t like it, and in most cases it doesn’t do anything for your career. And several of the speakers at the workshop described strategies in which there doesn’t seem to be a role for climate scientists themselves. Instead, the work seems to point to the need for a new kind of “outreach professional” who is both a scientific expert and trained in outreach activities.

Which brings me back to that experiment I mentioned at the top of the post, on the AGU providing a 24 hour hotline for COP15 participants to talk directly with small groups of climate scientists. It turns out the idea has been a bit of a failure, and has been scaled back due to a lack of demand. Perhaps this has something to do with the narrow terms that were set for what kinds of questions people could ask of the scientists. The basic science is not what matters in Copenhagen. Which means the distance between Copenhagen and San Francisco this week is even greater than I thought it would be.

[Update: Michael Tobis has a much more critical account of this workshop]

Our group had three posters accepted for presentation at the upcoming AGU Fall Meeting. As the scientific program doesn’t seem to be amenable to linking, here are the abstracts in full:

Poster Session IN11D. Management and Dissemination of Earth and Space Science Models (Monday Dec 14, 2009, 8am – 12:20pm)

Fostering Team Awareness in Earth System Modeling Communities

S. M. Easterbrook; A. Lawson; and S. Strong
Computer Science, University of Toronto, Toronto, ON, Canada.

Existing Global Climate Models are typically managed and controlled at a single site, with varied levels of participation by scientists outside the core lab. As these models evolve to encompass a wider set of earth systems, this central control of the modeling effort becomes a bottleneck. But such models cannot evolve to become fully distributed open source projects unless they address the imbalance in the availability of communication channels: scientists at the core site have access to regular face-to-face communication with one another, while those at remote sites have access to only a subset of these conversations – e.g. formally scheduled teleconferences and user meetings. Because of this imbalance, critical decision making can be hidden from many participants, their code contributions can interact in unanticipated ways, and the community loses awareness of who knows what. We have documented some of these problems in a field study at one climate modeling centre, and started to develop tools to overcome these problems. We report on one such tool, TracSNAP, which analyzes the social network of the scientists contributing code to the model by extracting the data in an existing project code repository. The tool presents the results of this analysis to modelers and model users in a number of ways: recommendation for who has expertise on particular code modules, suggestions for code sections that are related to files being worked on, and visualizations of team communication patterns. The tool is currently available as a plugin for the Trac bug tracking system.

Poster Session IN31B. Emerging Issues in e-Science: Collaboration, Provenance, and the Ethics of Data (Wednesday Dec 16, 2009, 8am – 12:20pm)

Identifying Communication Barriers to Scientific Collaboration

A. M. Grubb; and S. M. Easterbrook
Computer Science, University of Toronto, Toronto, ON, Canada.

The lack of availability of the majority of scientific artifacts reduces credibility and discourages collaboration. Some scientists have begun to advocate for reproducibility, open science, and computational provenance to address this problem, but there is no consolidated effort within the scientific community. There does not appear to be any consensus yet on the goals of an open science effort, and little understanding of the barriers. Hence we need to understand the views of the key stakeholders – the scientists who create and use these artifacts.

The goal of our research is to establish a baseline and categorize the views of experimental scientists on the topics of reproducibility, credibility, scooping, data sharing, results sharing, and the effectiveness of the peer review process. We gathered the opinions of scientists on these issues through a formal questionnaire and analyzed their responses by topic.

We found that scientists see a provenance problem in their communications with the public. For example, results are published separately from supporting evidence and detailed analysis. Furthermore, although scientists are enthusiastic about collaborating and openly sharing their data, they do not do so out of fear of being scooped. We discuss these serious challenges for the reproducibility, open science, and computational provenance movements.

Poster Session GC41A. Methodologies of Climate Model Confirmation and Interpretation (Thursday Dec 17, 2009, 8am – 12:20pm)

On the software quality of climate models

J. Pipitone; and S. Easterbrook
Computer Science, University of Toronto, Toronto, ON, Canada.

A climate model is an executable theory of the climate; the model encapsulates climatological theories in software so that they can be simulated and their implications investigated directly. Thus, in order to trust a climate model one must trust that the software it is built from is robust. Our study explores the nature of software quality in the context of climate modelling: How do we characterise and assess the quality of climate modelling software? We use two major research strategies: (1) analysis of defect densities of leading global climate models and (2) semi-structured interviews with researchers from several climate modelling centres. Defect density analysis is an established software engineering technique for studying software quality. We collected our defect data from bug tracking systems, version control repository comments, and from static analysis of the source code. As a result of our analysis, we characterise common defect types found in climate model software and we identify the software quality factors that are relevant for climate scientists. We also provide a roadmap to achieve proper benchmarks for climate model software quality, and we discuss the implications of our findings for the assessment of climate model software trustworthiness.
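
For readers who haven’t come across the technique: defect density is just the number of confirmed defects normalized by the size of the code, usually reported per thousand lines of code (KLOC). A trivial illustration, with invented numbers rather than our actual results:

    # Illustrative defect-density calculation. The model names and counts below
    # are invented for the example; they are not results from our study.

    def defect_density(defects, lines_of_code):
        return defects / (lines_of_code / 1000.0)   # defects per KLOC

    models = {
        "model_A": (24, 410_000),
        "model_B": (57, 780_000),
    }
    for name, (defects, loc) in models.items():
        print(f"{name}: {defect_density(defects, loc):.3f} defects/KLOC")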

I’ve just been browsing the sessions for the AGU Fall Meeting, to be held in San Francisco in December. Abstracts are due by September 3. The following sessions caught my attention:

Plus some sessions that sound generally interesting: