I missed out on liveblogging the last session on Tuesday on Seamless approaches in weather and climate, because the room had no power outlets at all, and my battery died. Which is a shame, as it was very interesting. The aim of seamless assessment is to be able to move back and forth between weather forecast models and climate models.

The last speaker in the session, Randall Dole, gave a good explanation of why this is an emerging priority, with his three challenges:

  • Understanding and modeling organized tropical convection and its global impacts. This is a key problem in predictability of weather beyond about a week, and a major factor in the regional differences in climate variations within the overall climate change trends.
  • Predicting weather and climate extremes in a changing climate (e.g. tropical cyclones, floods, droughts, coastal inundation, etc)
  • Integrating earth system models and observations. Or: how to build a scientifically-based, internally consistent record of how the earth system is evolving over time.

Randall also identified an opportunity to provide better information for energy and climate policy, for example to assess the likely unintended consequences of major new energy projects, geo-engineering proposals, etc.

David Williamson from NCAR described the Transpose-AMIP project, which takes a set of models (both numerical weather prediction (NWP) and climate models) and runs them against a benchmark of two months' worth of real weather observations. The aim is to analyze the primary errors; the approach is especially useful for comparing parameterization schemes against the field data, to track down which parameterizations cause which forecast errors. The NWP models did dramatically better than the climate models, but probably because they are highly tuned to give high-quality forecasts of things like precipitation.

Keith Williams from the UK Met Office Hadley Centre talked about progress on a new initiative there on seamless assessment. The aim is to get all of the Met Office models to use the same physics schemes, from the 2-day weather forecast model all the way to the centennial and regional climate models (except in cases where it is scientifically justifiable to use an alternative scheme). Work by Rodwell and Palmer paved the way for this. One of the big challenges is to predict extreme events (e.g. heavy storms and flash floods) under climate change. Keith demonstrated why this is hard, with the example of a particular flood in North Cornwall, which is only predicted by high-resolution weather forecast models on a 1.5km grid, and not by climate models working on a 20km grid. The problem is that we can’t say anything about the frequency of such events under future climate change scenarios if the models don’t capture them.

Frank Selten gave a talk on EC-Earth, a project aimed at extending the ECMWF weather forecast model, currently rated as the best in the world for medium-range weather forecasts, into a longer-range climate model. Interestingly, the plan is to synchronize this effort with the ECMWF’s seasonal model, rather than forking the code. [Note: In conversations after the talk, we speculated on what software engineering problems they might encounter with this, given that the two will be developed at different sites in different countries. My work at the Met Office suggested that a major factor in their success at keeping the weather and climate models integrated is that everything is done in a single building at a single site. EC-Earth might make a good case study for me]. Oh, and they’ll be using the climate prediction index introduced by Murphy et al to assess progress.

Finally, Prashant Sardeshmukh blew my mind, with his description of the Twentieth Century Reanalysis project. The aim of this project is to recreate a complete record of 6-hourly estimates of near-surface and tropospheric temperatures, extending back to the start of the 20th century, using all the available observational data and a 56-member model ensemble. Once they’ve done that, they plan to go all the way back to 1871. They do this by iteratively improving the estimates until the models and the available field data converge. I amused myself by speculating whether it would be easier to invent a time machine and send a satellite back in time to take the measurements instead…
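For those wondering how you “iteratively improve the estimates”: as I understand it, the reconstruction is done with an ensemble data assimilation scheme, where each ensemble member is nudged towards the available observations in proportion to the relative uncertainties of the model and the data. Here’s a toy sketch of that kind of update step, with every number invented for illustration; it is obviously nothing like the real reanalysis code:

```python
# Toy sketch of an ensemble data-assimilation update step (invented numbers throughout).
import numpy as np

rng = np.random.default_rng(42)

n_members, n_state, n_obs = 56, 100, 5               # 56 members, a toy 100-point state, 5 obs
ensemble = rng.normal(288.0, 2.0, size=(n_members, n_state))   # fake temperature fields (K)
H = np.zeros((n_obs, n_state))
H[np.arange(n_obs), [3, 17, 42, 66, 91]] = 1.0       # observations sample 5 grid points
obs = np.array([289.1, 287.5, 290.2, 286.8, 288.9])  # fake observations (K)
obs_err = 1.0                                        # observation error std dev (K)

# Ensemble statistics: covariance between the state and the observed quantities
X = ensemble - ensemble.mean(axis=0)                 # anomalies from the ensemble mean
P_HT = X.T @ (X @ H.T) / (n_members - 1)             # P H^T, state-obs covariance
HP_HT = H @ P_HT                                     # H P H^T, covariance in obs space
K = P_HT @ np.linalg.inv(HP_HT + obs_err**2 * np.eye(n_obs))   # Kalman gain

# Nudge each member towards (perturbed) observations, weighted by the gain
for m in range(n_members):
    perturbed_obs = obs + rng.normal(0.0, obs_err, size=n_obs)
    innovation = perturbed_obs - H @ ensemble[m]
    ensemble[m] = ensemble[m] + K @ innovation

print("analysis mean at the observed points:", (H @ ensemble.mean(axis=0)).round(2))
```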

Not much to report from this morning, but here’s a few interesting talks from this afternoon:

15:30: Dick Schaap, speaking about SeaDataNet. Another big European project: 49 partners and 40 data centres. Most of the effort focusses on establishing standard data formats and metadata descriptions. The aim is to collect all the data providers into a federated system, with a single portal, a shopping basket for users to search for the data they need, and secure access to it through a single sign-on. Oh, and they use Ocean Data View (ODV) for interactive exploration and visualization.

15:45: Roy Lowry, of the British Oceanographic Data Centre, whose talk is A RESTful way to manage ontologies. He covered some of the recent history of the NERC DataGrid, and some of the current challenges: 100,000 concepts, organised into about 100 collections. The key idea was to give each concept its own URN, used throughout the data and metadata, with a resolving service to map URNs to URLs; the resolved URLs serve the concepts as SKOS documents. Key issues:

  • Versioning – if you embed version numbers in the URNs, you have many URNs per concept. So the lesson is to define the URN syntax so that it doesn’t include anything that varies over time. 
  • Deprecation – you can deprecate concepts by moving the collection, so that the URN then refers to the replacement. But that means the URN of the deprecated concept changes. Lesson: implement deprecation as a change of status, rather than a change of address (see the sketch after this list).
  • WSDL structure – RDF triples are implemented as complex types in WSDL. So adding new relationships requires a change in the WSDL, and changing the WSDL during operation breaks the system. 
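To make those first two lessons concrete, here’s a hypothetical sketch of my own invention – not the actual NERC vocabulary server, and all of the URNs, collection names, and URLs are made up. The identifier carries nothing that changes over time, and deprecation is a status change plus a pointer to the replacement, so existing references never break:

```python
# Hypothetical illustration of version-free URNs and deprecation-by-status.
# The URN contains no version number; versions and status live in the registry
# record, never in the identifier itself.
registry = {
    "urn:example:vocab:C01:CONCEPT1": {
        "status": "valid",
        "current_version": 3,                      # versioning handled in metadata
        "replaced_by": None,
        "skos_url": "https://vocab.example.org/collection/C01/current/CONCEPT1",
    },
    "urn:example:vocab:C01:OLDCONCEPT": {
        "status": "deprecated",                    # deprecation is a change of status...
        "current_version": 1,
        "replaced_by": "urn:example:vocab:C01:CONCEPT1",   # ...with a pointer to the replacement
        "skos_url": "https://vocab.example.org/collection/C01/current/OLDCONCEPT",
    },
}

def resolve(urn: str) -> str:
    """Map a stable URN to the URL of its SKOS concept document."""
    record = registry[urn]
    if record["status"] == "deprecated" and record["replaced_by"]:
        # The old URN still resolves; we just follow the pointer to the replacement.
        return resolve(record["replaced_by"])
    return record["skos_url"]

print(resolve("urn:example:vocab:C01:OLDCONCEPT"))
```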

Oh, and this project supports several climate science initiatives: the Climate Science Modeling Language, and, of course, Metafor.

16:05: Massimo Santoro, on SeaDataNet interoperability, but I’m still too busy exploring the NDG website to pay much attention. Oh, this one’s interesting: Data Mashups based on Google Earth.

16:45: Oh, darn, I’ve missed Fred Spilhaus’s lecture on Boundless Science. Fred was executive director of the AGU for 39 years, until he retired last year. I’m in the wrong room, and tried the webstreaming, but of course it didn’t work. Curse this technology…

17:30: Now for something completely different: Geoengineering. Jason Blackstock, talking on Climate Engineering Responses to Climate Emergencies. Given that climate emergencies are possible, we need to know as much as we can about potential "plan B"s. Jason’s talk is about the outcome of a workshop last year, to investigate what research would be needed to understand the effects of geoengineering. They ignored the basic "should we" question, along with questions about whether considering geoengineering approaches might undercut efforts to reduce GHG emissions.

Here’s the premise: we cannot rule out the possibility that the planet is “twitchy”, and might respond suddenly and irreversibly when pushed past tipping points. In which case we might need some emergency responses to cool the planet again. There are two basic categories of geoengineering – remove CO2 (which is likely to be very slow), or increase the albedo of the earth just a little bit (which could be very fast). The latter category is the more plausible. The most realistic options within it are cloud whitening and stratospheric aerosols, so that’s what the workshop focussed on. We know aerosols can work fast because of the data from the eruption of Mt Pinatubo. Ken Caldeira and Lowell Wood did some initial modeling that demonstrated how geoengineering through aerosols might work.

But there are major uncertainties: transient vs. equilibrium response; controllability and reversibility; ocean acidification continues unaffected; and we don’t know much about regional effects, or effects on weather systems. Cost is not really an issue: $10B – $100B per year. But how do we minimize the potential for unanticipated consequences?

  • Engineering questions: which aerosols? Most likely sulphates. How and where to deploy them? Lots of options.
  • Climate science questions: What climate parameters will be affected by the intervention? What would we need to monitor? We need a ‘red team’ of scientists on hand to calculate the effects, and assess different options. 
  • Climate monitoring: what do we need to measure, with what precision, coverage, and duration, to keep track of how the deployment is proceeding?

If we need to be ready with good answers in a ten-year timeframe, what research needs to be done to get there? Phase I: non-intervention research. Big issue: hard to learn much without intervention. Phase II: field experiments. Big issues: can’t learn much from a small ‘poke’; need to understand scaling. Phase III: monitored deployment.

Non-technical issues: What are sensible trigger conditions? Who should decide whether to even undertake this research? Ethics of field tests? Dealing with winners and losers from deployment. And of course the risk of ‘rogue’ geoengineering efforts.

Take-home messages: research into geoengineering responses is no longer "all or nothing" – there are incremental efforts that can be undertaken now. Developing an ‘on the shelf’ plan B option requires a comprehensive and integrated research program – at least a 10-year effort.

Some questions: How would this affect acid rain? Not much, because we’re talking about something of the order of 1% of the global output of sulphurous aerosols, and acid rain problems are declining steadily anyway. A more worrying concern would be the effect on tropospheric ozone.

Who decides? Some scientists are already saying we’ve reached a climate emergency. If the aim is to avoid dangerous tipping points (e.g. melting of the poles, destruction of the rainforests), at what point do we pull the trigger? No good answer to this one.

Read more: Journal special issue on geo-engineering.

Chris Jones, from the UK Met Office Hadley Centre, presented a paper at EGU 2009 yesterday on The Trillionth Tonne. The analysis shows that the key driver of temperature change is the total cumulative amount of carbon emissions. To stay below the 2°C rise in global average temperature generally regarded as the threshold for dangerous warming, we need to keep total cumulative emissions below a trillion tonnes. And the world is already halfway there.

Which is why the latest news about Canada’s carbon emissions is so embarrassing. Canada is now top among the G8 nations for emissions growth. Let’s look at the numbers: 747 megatonnes in 2007, up from 592 megatonnes in 1990. Using the figures in the Environment Canada report, I calculated that Canada has emitted over 12 gigatonnes since 1990. That’s 12 billion tonnes. So, in 17 years we burnt through more than 1.2% of the entire world’s total budget of carbon emissions. A total budget that has to last from the dawn of industrialization to the point at which the whole world becomes carbon-neutral. Oh, and Canada has 0.5% of the world’s population.
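A quick sanity check of that 12-gigatonne figure. The Environment Canada report has the actual annual numbers; this just interpolates linearly between the two endpoints I quoted above, so treat it as a ballpark:

```python
# Back-of-envelope check of the "over 12 gigatonnes since 1990" figure.
start_year, end_year = 1990, 2007
start_mt, end_mt = 592.0, 747.0   # Canada's annual emissions, in megatonnes

# Linear interpolation between the two endpoints, 1990-2007 inclusive
annual = [start_mt + (end_mt - start_mt) * (y - start_year) / (end_year - start_year)
          for y in range(start_year, end_year + 1)]

total_gt = sum(annual) / 1000.0   # megatonnes -> gigatonnes
print(f"roughly {total_gt:.1f} Gt emitted between 1990 and 2007")
print(f"which is {100 * total_gt / 1000.0:.1f}% of a trillion tonnes")
```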

Disclaimer: I have to check whether the Hadley Centre’s target is 1 trillion tonnes of CO2-equivalent, or 1 trillion tonnes of Carbon (they are different!). The EnvCanada report numbers refer to the former.

Update: I checked with Chris, and as I feared, I got the wrong units – it’s a trillion tonnes of carbon. The conversion factor is about 3.66, so that gives us about 3.66 trillion tonnes of carbon dioxide to play with. [Note: Emissions targets are usually phrased in terms of “Carbon dioxide equivalent”, which is a bit hard to calculate as different greenhouse gases have both different molecular weights and different warming factors].
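For the record, that factor is just the ratio of the molecular weight of CO2 to the atomic weight of carbon:

```python
# Quick check on the carbon -> CO2 conversion factor.
atomic_mass_C = 12.011
molecular_mass_CO2 = 12.011 + 2 * 15.999   # one carbon plus two oxygens, about 44.01

factor = molecular_mass_CO2 / atomic_mass_C
print(f"1 tonne of carbon corresponds to about {factor:.2f} tonnes of CO2")
print(f"so a 1 trillion tonne carbon budget is about {factor:.2f} trillion tonnes of CO2")
```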

So my revised figures are that Canada burnt through only about 0.33% of the world’s total budget in the last 17 years. Which looks a little better, until you consider the following (the back-of-envelope arithmetic is sketched after the list):

  • by population, that’s 2/3 of Canada’s entire share. 
  • Using the cumulative totals from 1900-2002, plus the figures for the more recent years from the Environment Canada report (and assuming 2008 was similar to 2007), we’ve emitted 27 gigatonnes of CO2 since 1900. Which is about 0.73% of the world’s budget, or about 147% of our fair share per head.
  • By population, our fair share of the world’s budget is about 18 gigatonnes CO2 (=5 gigatonnes Carbon). We’d burnt through that by 1997. Everything since then is someone else’s share.
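Here’s the arithmetic behind those bullets, using only the figures quoted above; any small differences from my numbers are just rounding:

```python
# Back-of-envelope check of the revised figures.
budget_c_gt = 1000.0                      # the trillion-tonne budget, in gigatonnes of carbon
c_to_co2 = 44.01 / 12.01                  # molecular weight ratio, about 3.66
budget_co2_gt = budget_c_gt * c_to_co2    # roughly 3,660 Gt of CO2 "to play with"

canada_pop_share = 0.005                  # Canada has about 0.5% of the world's population
fair_share_co2_gt = canada_pop_share * budget_co2_gt

emitted_since_1990_gt = 12.0              # from the Environment Canada figures, 1990-2007
emitted_since_1900_gt = 27.0              # cumulative 1900-2002 totals plus the recent years

print(f"fair share by population: ~{fair_share_co2_gt:.0f} Gt CO2"
      f" (= ~{fair_share_co2_gt / c_to_co2:.0f} Gt carbon)")
print(f"emitted since 1990: {100 * emitted_since_1990_gt / budget_co2_gt:.2f}% of the world budget")
print(f"emitted since 1900: {100 * emitted_since_1900_gt / budget_co2_gt:.2f}% of the world budget,"
      f" or {100 * emitted_since_1900_gt / fair_share_co2_gt:.0f}% of our per-head share")
```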