I have a couple of PhD theses that I have to read this week, so I probably won’t get to read the new Synthesis Report from the Copenhagen Conference for a few days. But it looks like it will be worth the wait. According to RealClimate, it’s a thorough look at how much the science has changed since the IPCC AR4 report in 2007, along with some serious discussion on what needs to be done to hold the world below the threshold of 2°C of warming.
The US government has launched a very swanky new inter-agency website, www.globalchange.gov, intended to act as a central point for news and reports on climate change. And the centre-piece of the launch is a major new report on Climate Change Impacts in the US. It draws on the IPCC reports for much of the science, but it adds to that a lot of detail on how the impacts of climate change are already being seen across the US, and how much worse it’s going to get. Definitely worth reading, especially the parts on drought and heat waves. (The brochures and factsheets are good too)
Add to that a recent report from CNA on Risks to (US) National Security from energy dependence and climate change. It includes some fascinating snippets from retired senior US admirals and generals, including, for example, the threat to the US's military capacity posed by sea level rise (e.g. loss of coastal installations) and disruption to energy supplies. They also argue that the biggest cause of conflict in the 21st century is likely to be mass migrations of entire populations seeking water and food as supplies dwindle (but if you've read Climate Wars, you already knew this).
Well, I had a fabulous week at the EGU. I tried to take in many different aspects of climate research, but inevitably ended up at lots of sessions on earth systems informatics (to satisfy my techie streak), and sessions looking at current cutting edge research on earth systems models, such as integrating weather forecast and climate models, model ensembles, and probabilistic predictions. Lots of interesting things going on in this space.
Here’s what I would regard as the major themes of the conference from my perspective:
- Ocean Acidification. It's pretty easy to predict because it scales directly with the concentration of CO2 in the atmosphere – i.e. there's very little uncertainty. When we kill off life in the seas, we also lose a major carbon sink.
- Feedbacks. I learned at least nine different definitions of the word feedback, and also that there are a huge number of feedbacks that we might want to put into an earth system model, so someone’s got to work out which ones are most likely to be important.
- Abrupt Climate Change. I learned that the paleoclimate record tells us the earth is quite likely to be twitchy, and that we still don't know anywhere near enough about the triggers. Oh, and lots of climate scientists think we've already hit some of those triggers.
- Probabilistic forecasting. I learned a lot about the use of model ensembles (both multi-models, and perturbed physics experiments with single models) to quantify our uncertainties. There’s a strong move in the climate community to replace single predictions of climate change with probabilistic forecasts. The simplest exposition of this idea is MIT’s wheels of fortune.
- Simpler targets for policy makers. I'm very taken with the analysis from Chris Jones and colleagues showing that if we want to stay below the 2°C temperature rise, we have a total budget of One Trillion Tonnes of Carbon to emit, and since the dawn of industrialization, we've already used up more than half of it.
- Geo-Engineering. Suddenly it’s okay for climate scientists to start talking about geo-engineering. For years, this has been anathema, on the basis that even just talking about this possibility can undermine the efforts to reduce carbon emissions (which is always the most sensible way to tackle the problem). But now it appears that many scientists have concluded that it’s too late anyway to do the right thing, and now we have to start thinking the unthinkable.
Plus some things that I missed that I wish I’d seen (based on what others told me afterwards):
- Stefan Rahmstorf's views on the risks of an abrupt shift in the ocean circulation. Still, I consoled myself with this webcast from one of his talks last year: A greater risk of sea level rise? (Stefan writes great stuff: this primer from 2004 on the risk of abrupt change is fabulous).
- The session on The Cryosphere – for how much longer?, particularly fellow Canadian Luke Copland's talk on the accelerated collapse of the arctic ice shelves; I'm told this was both spectacular and scary stuff.
Okay, slow day today, as I took some time out to get my talk ready for this afternoon. In the meantime, it gives me a chance to reflect on a few ideas. For example, I’ve seen many talks this week on data sharing, and had some very juicy discussion over dinner last night with Bryan. Two key challenges seem to stand out in this work: (1) the lack of shared ontology between scientific sub-communities who want to share datasets, and (2) the inevitable separation of data from commentaries on that data (which includes problems of knowledge provenance, and meta-data management). The latter problem is endemic because the data sources and the commentary sources are in different communities, operate asynchronously, and often have very different goals.
There seem to be plenty of people considering the technical aspects of these problems, exploring the use of ontology description languages (e.g. OWL and its relatives) and markup languages (e.g. the many application schemas of GML). But very little emphasis has been placed on the human side of this problem. First, the sociological problem of ontological drift has been ignored – the problem that objects gradually change their meaning as they pass between different communities. (Chalmers gives an overview of various responses to this problem).
A key problem here is that scientific communities have long collaborated reasonably effectively through the use of boundary objects (first characterized by Leigh Star). Boundary Objects are…
… both plastic enough to adapt to local needs and constraints of the several parties employing them, yet robust enough to maintain a common identity across sites. They are weakly structured in common use, and become strongly structured in individual-site use. They may be abstract or concrete. They have different meanings in different social worlds but their structure is common enough to more than one world to make them recognizable means of translation. [Wikipedia defn]
Examples include taxonomies, maps, scientific methods, etc. I talked a little with Leigh about boundary objects in the early 90's, and I observed that boundary objects are very effective because they represent a minimal shared understanding – just enough so that different communities have some common frames of reference, but not so much that anyone has to work hard on ensuring they have the same mental models.
The problem is that boundary objects fool us into thinking that we share much more meaning than we really do. When we try to embed our boundary objects into computer systems (thinking that we have a shared semantics), we bind the boundary objects to a particular interpretation, and hence they lose their plasticity. As soon as we do this, they are no longer boundary objects. They are brittle reflections of the real boundary objects. Whereas before, the objects themselves could be adapted to each community's needs, now the communities themselves must do the adapting, to fit the fixed definitions of these frozen objects. No wonder everyone finds this hard!
So what do we do? I don't really know, but I have some ideas. Lots of small local ontologies, with loose, flexible mappings between them, created on the fly by the communities that use them, social-tagging style? Or heavier-weight tools from psychology for teasing out mappings between terminologies and concepts used by different expert communities, such as repertory grids (e.g. Shaw and Gaines developed a technique applying Rep Grids to explore conflicting terminology)? Or perhaps more flexible ontology languages built over paraconsistent logics? I've played with all these ideas in the past, and think they all have some value. How to exploit them in practical data sharing systems remains a big open question.
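To make the first of those ideas concrete, here's a minimal sketch of what small local ontologies with loose mappings might look like. All the names and confidence numbers are hypothetical; the point is only that the mappings stay plastic and revisable rather than frozen:

```python
from dataclasses import dataclass, field

@dataclass
class Mapping:
    source_term: str
    target_term: str
    confidence: float     # contributed and revised by users, tagging-style
    contributed_by: str

@dataclass
class LocalOntology:
    community: str
    terms: set = field(default_factory=set)

# Each community keeps its own small vocabulary...
oceanography = LocalOntology("oceanography", {"sea_surface_temperature"})
meteorology = LocalOntology("meteorology", {"SST", "skin_temperature"})

# ...and mappings between terms are loose, weighted assertions, not axioms.
mappings = [
    Mapping("sea_surface_temperature", "SST", 0.9, "user42"),
    Mapping("sea_surface_temperature", "skin_temperature", 0.4, "user7"),
]

def candidate_translations(term, mappings, min_confidence=0.5):
    """Suggest plausible translations, leaving the final interpretation
    to a human rather than silently committing to one reading."""
    return [m for m in mappings
            if m.source_term == term and m.confidence >= min_confidence]

print(candidate_translations("sea_surface_temperature", mappings))
```

The low-confidence entries stay visible and revisable, so the mapping never hardens into the kind of frozen boundary object described above.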
8:30: This morning I'm at a session on education. Just my luck – I got here towards the end of the first talk, which looked very interesting. It was Zoe Robinson, from Keele, giving a talk entitled Effective and responsible teaching of climate change in Earth Science-related disciplines. She talked about many of the dilemmas faced in teaching the earth sciences – e.g. student attitudes such as climate skepticism and feelings of saturation coverage, coupled with flawed knowledge. At the same time, accurate coverage of the complexity of the earth systems and areas of uncertainty can undermine the sense of urgency and responsibility appropriate for dealing with the challenges of climate change. Zoe talked about non-traditional techniques to use in the classroom to deal with these issues (but I missed most of it!)
8:45: Ilona Pojtok-Tari, Geography Netquipment (project webpage)
9:00: Marina Manea on Online Geodynamics. Talked about how they teach computational geodynamics using various online modeling tools. The animations of convection in the earth's mantle look pretty impressive, although the server is pretty slow.
The next two talks picked up on a common theme: how increased time pressure and reduced classroom time are eroding competence in students. Both talked about using other forms of instruction to combat this.
9:15: Stefan Winkler, talking about Geographic-didactical games as interactive tools. They noticed an emerging problem with lack of basic knowledge in physical geography, caused by reducing the time available for practice exercises, and a change in the timetable that separated lecture material from practical classes. So they started experimenting with various forms of "edutainment" in evening sessions, or during fieldtrips and weekend seminars. These include quizzes and a memory game. The students liked the games, and they clearly improved both motivation and knowledge.
9:30: Michael Mayer, talking about Using competence-based and project-related approaches to support students individually. He’s especially concerned about competencies such as critical thinking, work organization, teamworking, and so on. Aha! His pedagogical approach is based on constructivism. The students have to develop personal portfolios, and get regular email feedback on their individual learning style and selection of topics (plus regular grading to provide extrinsic motivation). Of course, it’s very time-consuming for the teacher. The seminars are based on development of mindmaps.
9:45: Varyl Thorndycraft, on the use of Google Earth and virtual fieldwork in teaching physical geography. There's plenty of evidence on the value of fieldwork in teaching physical geography, but limitations (cost, safety, time, etc.) on how much is possible. Varyl suggests that virtual fieldwork can supplement (but not replace) real fieldwork. Google Earth provides the ability to measure distance and elevation, to visualize spatial scales by zooming, and to visualize landscapes via the tilt. But some of the height measurements are inaccurate, and there's no control over when images were taken (and they may change from time to time, which can affect planning for virtual fieldwork). He demonstrated some uses of Google Earth to cover a wide range of topics where real fieldwork would not be possible.
After some coffee (and a complaint from the person sitting next to me during that session that my keyboard is too loud!), we have this:
10:30: Thomas Stocker, from the University of Bern, and co-chair of IPCC Working Group I, receiving the Hans Oeschger Medal. His talk picks up on the idea of the earth being "twitchy" that I mentioned yesterday. It's entitled From Abrupt Change to the Future. Hmm – big audience for this one. Good job I got here early, and that my battery is fully charged, as it's that room with no power outlets again. Okay, here we go. First a little retrospective on the work of Hans Oeschger, from his early work on ice cores to early models of CO2 exchange in the atmosphere. In 1978, long before Kyoto, Hans was publishing analysis of the stabilization of CO2, including the finding that CO2 emissions needed to peak shortly after the year 2000, and then drop dramatically, in order to stabilize concentrations in the atmosphere. The essential knowledge about what the challenge was and what we needed to do was available back in 1978, and (looking at the Keeling curve) there's been no progress since then on addressing the challenge. In 1984, from Greenland ice cores, they demonstrated that there was a consistent signal of abrupt climate changes from different locations.
In 1984, Broecker developed a famous visualization of the ocean conveyor belt. The key idea is that the North Atlantic cools the southern hemisphere. Through the 1990s there was a debate because the paleoclimatic record did not support the theory that the Northern and Southern hemispheres were linked in this way. Then in 2003, after reanalysis of the data, the connection between Greenland and Antarctic warming became clear, and a new very simple model of these processes was developed. This led to a much better understanding of abrupt climate change. Lessons from this story: be bold (but not too often), be persistent, and be open and critical – sometimes you have to re-think the models, and sometimes you have to re-analyze the data to make sense of (apparently) conflicting evidence.
In 1987, Broecker (again) published a paper in Nature on "unpleasant surprises in the greenhouse". This led in the 1990s to an analysis of thresholds in the more complex 3D climate models. But even in 2007, the models are all over the place in analyzing these thresholds. So how do we know what kinds of abrupt changes are in store, and whether these are irreversible? For example, plenty of people think we have already passed a threshold for arctic sea ice melting. Many regions of the world are likely to experience irreversible drought (from Solomon et al 2009).
So, to finish, some open questions: Where is the geographical and physical origin of abrupt climate change in the past? What are the triggers? What are the rates of change? What are the impacts of these changes on the water cycle, etc.? Are there thresholds in the climate system, and which findings in this space are robust? (If we're too bold here it will hurt our credibility.) What controls potential instabilities in the ice sheets?
11:55: Change of sessions again, to the one on Climate Prediction: Models, Diagnostics, and Uncertainty Analysis. I missed this morning's session on model diagnostics, but arrived just in time to catch Hermann Held talking about the Effects of climate uncertainties on welfare optimal investment streams into mitigation technologies. The problem: society has to invest in mitigation strategies, but under uncertain conditions. This makes the economic assessment of welfare optimization complicated. Economists tend to ask what it would cost to constrain global temperature rise to 2°C, and discount the future costs, under the assumption that mitigation strategies impact economic growth by constraining energy use. Uncertainty over climate sensitivity has a big impact on the economically optimal emissions path, which means you can generate almost any answer to the economics question by choosing a different figure for sensitivity. The study design is interesting – here's the paper.
13:30: After lunch, the topic of the session switches to uncertainty in projections and impacts. First up: Ronald Prinn, from MIT, talking on Probabilistic Forecast for 21st Century Climate Based on Uncertainties in Emissions. Based on an interesting model (IGSM) that couples climate simulation with economics, and performs uncertainty analysis on the economic aspects, especially the growth of emissions under different climate policies. Lots of interesting probability analyses – too much for me to summarize, but here's the paper. Oh, and he showed the "wheel of fortune" I mentioned in my last post.
13:45: Andrei Sokolov, also from MIT, talking on Sensitivity of climate change projections to uncertainties in the estimates of observed changes in deep-ocean heat content. Too much detail in this talk for me to follow…
14:00: Ana Lopez, from the London School of Economics. From climate model ensembles to climate change impacts. The Environment Agency in the south-west of England is interested in exploring the added value of ensembles of climate models in their everyday decision making, especially around water resource management. She used a perturbed physics ensemble from climateprediction.net (which is based on HadCM3), and a 21-model ensemble from CMIP3. Some downscaling is needed to get useful results to feed into an impact model for making decisions about adaptation. (Here's an early paper from this work.)
14:20: Next up is Stefan Fronzek's talk on Probabilistic projections of climate change effects on sub-arctic palsa mires, in which I learnt that palsa mires are permanently frozen peat hummocks, which are sensitive to climate change, and which, if they melt, become a major source of methane. Again, using perturbed physics ensembles from the Hadley Centre, the idea is to get probability density functions from the models to use for impact assessment. Sample result: the risk of total loss of palsa mires is estimated to be 80% by 2100 under scenario A1B.
14:35: Roberto Ferrise, talking on Climate Change and Projected Impacts in Agriculture: an Example on Mediterranean Crops. Starts by setting risk thresholds for minimum crop yield. Then takes climate change probability functions (again, using the Hadley Centre's joint probabilities), and uses these as input to a crop growth model. Here, you have to take into account the effects of increased CO2 (which stimulates crop growth), decreased precipitation (drought!), and increased temperature (both of which inhibit growth). The impact is very different for different crop types and different regions across southern Europe, and also changes over time, depending on which of the various factors dominates. For example, for durum wheat, we see increased yield for the next few decades, followed by loss of almost all productivity by mid-century. For grapevines, the risk of lower productivity drops everywhere except in northern Italy and northern Greece, where the risk continually rises. One of the questioners asked about uncertainty in the crop models themselves, as opposed to uncertainty in the climate. This is an interesting question, because the analysis of uncertainty in climate models looks like a far more mature field than the analysis of uncertainty in other types of model.
14:50: Last talk in this session is by Nathan Urban on Probabilistic hindcasts and projections of the coupled climate, carbon cycle, and Atlantic meridional overturning circulation systems. Uses some simple box models (of MOC, Climate, and Carbon cycle) coupled together, to assess the probability of experiencing an MOC collapse within the next 200 years vs. triggering a collapse (i.e. setting up conditions such that a collapse will occur by 2300). The good news is that the probability of experiencing a collapse within the next 100 years remains less than 10% (consistent with the IPCC), but rises rapidly after 2100. However the probability of triggering (committing to) a collapse is about 25% in 2100 and 75% by 2200. Of course the results depend on emissions scenario used, and this is only a relatively simple climate model. (Here’s the paper).
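To give a flavour of how such a probability gets estimated (a toy illustration only: this is emphatically not Urban's model, and every number in it is invented), here's a Monte Carlo over one uncertain parameter in a fake one-line "box model":

```python
import random

def moc_strength(sensitivity, years, initial=20.0):
    """Crude stand-in for a box model: overturning strength (in Sv)
    declines with time at an uncertain warming-driven rate;
    strength <= 0 counts as a collapse."""
    return initial - sensitivity * 0.1 * years

def collapse_probability(horizon_years, n=100_000):
    hits = 0
    for _ in range(n):
        sensitivity = random.lognormvariate(0.0, 0.5)  # uncertain parameter
        if moc_strength(sensitivity, horizon_years) <= 0:
            hits += 1
    return hits / n

print(collapse_probability(100))  # ~0.08 with these made-up numbers
print(collapse_probability(200))  # ~0.5
```

The real analysis couples three box models and constrains them against observations (hence "probabilistic hindcasts"); the toy only shows why the estimated probability can grow so quickly as the time horizon extends.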
Quick chat to the MIT folks in the coffee break to understand their development processes, especially how they cope with such a multidisciplinary team contributing to the models. Key to success (just like at Hadley) appears to be to get everyone in the same building. They do have one additional challenge which is a collaboration with the folks at Woods Hole, but as that’s only 60 miles, the Woods Hole folks make it up to weekly coordination meetings at MIT.
15:30: Okay, downstairs, to the session on Information and services technologies for Earth and Space Sciences. First speaker: Bryan Lawrence, of BADC, talking about the MOLES project (Metadata Objects for Linking the Environmental Sciences). He started with a quick history of the NERC DataGrid (which includes lots of cute acronyms: COWS, MILK, MOLES). MOLES: the core goal is to support users throughout the process, where they start by trying to find data, browse if they don't know what they're looking for, and then manipulate the data they find in different ways. Characterize the different types of metadata involved. (Quote: "we would never try to integrate discipline-specific metadata".) Teleporting & orienteering – jump to something that's close to the kind of data you want, and then move around in the local space to find it. Define the metadata that's common across disciplines (he walked us through some example schemas). Using a GML application schema is clearly not a sufficient condition for interoperability, and only maybe a necessary one, but there's plenty of evidence that it helps. (Lots more, but he talks faster than I can type… phew.)
15:45: John Laxton, talking about GeoSciML v2: an interchange and mark-up language for geologic information. GeoSciML is a GML application schema for geology information. As Bryan said, GML doesn't give you everything: to genuinely exchange information you need to look at the semantics. V2.0 was released at the AGU fall meeting in 2008. He's walking through some example schemas, e.g. for Geological Feature, Geological Unit, and Geologic Structure. (And here's a paper.) Oh, and I forgot to mention that this supports the OneGeology project I mentioned the other day.
16:00: The next few papers were also on various markup languages: Marc Löwner talked about GeoGML, including the challenges in developing the UML models for geological morphology, capturing both structural and process relationships between concepts. Ilya Zaslavsky talked about the CUAHSI Water Markup Language (WaterML); and Todd King talked about the SPASE Data Model.
Okay, I finally got some of the webstreaming working (I needed an updated plugin). I managed to watch some of the medal award lectures after the fact from the EGU webstream page. Turns out the medal award lectures are not so interesting (although Lennart Bengtsson's lecture on extra-tropical cyclones is worth it for his observations about the current state of the art in modeling cyclones).
However, the press conferences are far more interesting:
- The press conference on uncertainties in climate change is definitely worth it to hear scientists separate what we know from what we don’t know, along with a basic introduction to the principles of climate modeling. Make sure you watch at least until the “wheel of fortune” bit (about halfway through). Bottom line: quantifying uncertainty is crucial. Over dinner this evening we were wishing that other fields (e.g. economics) would be anywhere near this willing to quantify their uncertainties…
- The press conference on improving outreach and education in the cryosphere is great for lots of facts and figures about the frightening rate at which glaciers and sea ice are melting, and the wide ranging implications (it’s a little slow to get going, but worth it once the panelists start).
- The press conference on ocean acidification packs a powerful punch. It starts with the screening of a film about how absorption of CO2 by the oceans leads to dramatic change, told from the perspective of how it will affect the generation of kids growing up today.
I missed out on liveblogging the last session on Tuesday, on Seamless approaches in weather and climate, because the room had no power outlets at all, and my battery died. Which is a shame, as it was very interesting. The aim of seamless assessment is to be able to move back and forth between weather forecast models and climate models. David Randall identified some of the key challenges:
- Understanding and modeling organized tropical convection and its global impacts. This is a key problem in predictability of weather beyond about a week, and a major factor in the regional differences in climate variations within the overall climate change trends.
- Predicting weather and climate extremes in a changing climate (e.g. tropical cyclones, floods, droughts, coastal inundation, etc)
- Integrating earth system models and observations. Or: how to build a scientifically-based, internally consistent record of how the earth system is evolving over time.
Randall also identified an opportunity to provide better information for energy and climate policy, for example to assess the likely unintended consequences of major new energy projects, geo-engineering proposals, etc.
David Williamson from NCAR described the Transpose-AMIP project, which takes a set of models (both numerical weather prediction (NWP) and climate models) and runs them against a benchmark of two months' worth of real weather observations. The aim is to analyze the primary errors; it's especially useful for comparing parameterization schemes with the field data, to track down which parameterizations cause which forecast errors. The NWP models did dramatically better than the climate models, but probably because they are highly tuned to give high quality forecasts of things like precipitation.
Keith Williams from the UK Met Office Hadley Centre talked about progress on a new initiative there on seamless assessment. The aim is to get all of the Met Office models to use the same physics schemes, from the 2-day weather forecast model all the way to the centennial and regional climate models (except in cases where it is scientifically justifiable to use an alternative scheme). Work by Rodwell and Palmer paved the way for this. One of the big challenges is to predict extreme events (e.g. heavy storms and flash floods) under climate change. Keith demonstrated why this is hard, with an example of a particular flood in North Cornwall, which is only predicted by high resolution weather forecast models on a 1.5km grid, and not by climate models working on a 20km grid. The problem is we can't say anything about the frequency of such events under future climate change scenarios if the models don't capture them.
Frank Selten gave a talk on EC-Earth, a project aimed at extending the ECMWF weather forecast model, currently rated as the best in the world for medium range weather forecasts, and creating a longer range climate model. Interestingly, the plan is to synchronize this effort with the ECMWF’s seasonal model, rather than forking the code. [Note: In conversations after the talk, we speculated on what software engineering problems they might encounter with this, given that the two will be developed at different sites in different countries. My work at the Met Office suggested that a major factor in their success at keeping the weather and climate models integrated is that everything is done in a single building at a single site. EC-Earth might make a good case study for me]. Oh, and they’ll be using the climate prediction index introduced by Murphy et al to assess progress.
Finally, Prashant Sardeshmukh blew my mind with his description of the twentieth century reanalysis project. The aim of this project is to recreate an entire record of 6-hourly estimates of near-surface and tropospheric temperatures, extending back to the start of the 20th century, using all the available observational data and a 56-member model ensemble. Once they've done that, they plan to go all the way back to 1871. They do this by iteratively improving the estimates until the models and the available field data converge. I amused myself by speculating whether it would be easier to invent a time machine and send a satellite back in time to take the measurements instead…
Not much to report from this morning, but here’s a few interesting talks from this afternoon:
15:30: Dick Schaap, speaking about SeaDataNet. Another big European project: 49 partners and 40 data centres. Most of the effort focusses on establishing standard data formats and metadata descriptions. The aim is to collect all the data providers into a federated system, with a single portal, a shopping basket for users to search for the data they need, and secure access through single sign-on. Oh, and they use Ocean Data View (ODV) for interactive exploration and visualization.
15:45: Roy Lowry, of the British Oceanographic Data Centre, whose talk is A RESTful way to manage ontologies. He covered some of the recent history of the NERC DataGrid, and some of the current challenges: 100,000 concepts, organised into about 100 collections. The key idea was to give each concept its own URN throughout the data and metadata, with a resolving service to get URLs from URNs; the URLs serve SKOS documents. (A minimal sketch of the pattern follows the list below.) Key issues:
- Versioning – if you embed version numbers in the URNs, you have many URNs per concept. So the lesson is to define the URN syntax so that it doesn’t include anything that varies over time.
- Deprecation – you can deprecate concepts by moving the collection, so that the URN now refers to the replacement. But that means the URN of the deprecated concept changes. Lesson: implement deprecation as a change of status, rather than a change of address.
- WSDL structure – RDF triples are implemented as complex types in WSDL. So adding new relationships requires a change in the WSDL, and changing the WSDL during operation breaks the system.
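Here's that minimal sketch of the pattern as I understood it (hypothetical URNs and data structures; this is not the actual NERC vocabulary service): version-free URNs, a resolver that maps them to URLs serving SKOS documents, and deprecation as a status change rather than a change of address.

```python
registry = {
    # URN syntax deliberately excludes anything that varies over time.
    "urn:example:vocab:P01:TEMPPR01": {
        "url": "https://vocab.example.org/P01/TEMPPR01",  # serves a SKOS doc
        "status": "valid",
        "replaced_by": None,
    },
    "urn:example:vocab:P01:OLDTEMP": {
        "url": "https://vocab.example.org/P01/OLDTEMP",
        "status": "deprecated",   # a status change; the URN never moves
        "replaced_by": "urn:example:vocab:P01:TEMPPR01",
    },
}

def resolve(urn):
    """Return the URL of the SKOS document for a concept, following
    deprecation pointers to the replacement concept where they exist."""
    entry = registry[urn]
    if entry["status"] == "deprecated" and entry["replaced_by"]:
        return resolve(entry["replaced_by"])
    return entry["url"]

print(resolve("urn:example:vocab:P01:OLDTEMP"))  # resolves to the replacement
```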
16:45: Oh, darn, I’ve missed Fred Spilhaus’s lecture on Boundless Science. Fred was executive director of the AGU for 39 years, until he retired last year. I’m in the wrong room, and tried the webstreaming, but of course it didn’t work. Curse this technology…
17:30: Now for something completely different: Geoengineering. Jason Blackstock, talking on Climate Engineering Responses to Climate Emergencies. Given that climate emergencies are possible, we need to know as much as we can about potential "plan B"s. Jason's talk is about the outcome of a workshop last year, to investigate what research would be needed to understand the effects of geoengineering. They ignored the basic "should we" question, along with questions on whether consideration of geoengineering approaches might undercut efforts to reduce GHG emissions.
Here's the premise: we cannot rule out the possibility that the planet is "twitchy", and might respond suddenly and irreversibly to tipping points. In which case we might need some emergency responses to cool the planet again. Two basic categories of geoengineering: remove CO2 (which is likely to be very slow), or increase the albedo of the earth just a little bit (which could be very fast). The latter options are the most plausible. The most realistic of these are cloud whitening and stratospheric aerosols, so that's what the workshop focussed on. We know aerosols can work fast because of the data from the eruption of Mt Pinatubo. Ken Caldeira and Lowell Wood did some initial modeling that demonstrated how geoengineering through aerosols might work.
But there are major uncertainties: transient vs. equilibrium response; controllability and reversibility; ocean acidification continues unaffected; plus we don't know about regional effects, or effects on weather systems. Cost is not really an issue: $10B – $100B per year. But how do we minimize the potential for unanticipated consequences?
- Engineering questions: which aerosols? Most likely sulphates. How and where to deploy them? Lots of options.
- Climate science questions: What climate parameters will be affected by the intervention? What would we need to monitor? We need a ‘red team’ of scientists on hand to calculate the effects, and assess different options.
- Climate monitoring: what do we need to measure, and with what precision, coverage, and duration, to keep track of how the deployment is proceeding?
If we need to be ready with good answers in a ten-year timeframe, what research needs to be done to get there? Phase I: non-intervention research. Big issue: it's hard to learn much without intervention. Phase II: field experiments. Big issues: you can't learn much from a small 'poke'; we need to understand scaling. Phase III: monitored deployment.
Non-technical issues: What are sensible trigger conditions? Who should decide whether to even undertake this research? Ethics of field tests? Dealing with winners and losers from deployment. And of course the risk of ‘rogue’ geoengineering efforts.
Take-home messages: research into geoengineering responses is no longer "all or nothing" – there are incremental efforts that can be undertaken now. Development of an 'on the shelf' plan B option requires a comprehensive and integrated research program – this is a 10-year research program at least.
Some questions: How would this affect acid rain? Not much, because we're talking about something of the order of 1% of our global output of sulphurous aerosols, and the problems of acid rain are steadily diminishing anyway. A more worrying concern would be the effect on tropospheric ozone.
Who decides? Some scientists are already saying we've reached a climate emergency. If the aim is to avoid dangerous tipping points (e.g. melting of the poles, destruction of the rainforests), at what point do we pull the trigger? No good answer to this one.
Read more: Journal special issue on geo-engineering.
Chris Jones, from the UK Met Office Hadley Centre, presented a paper at EGU 2009 yesterday on The Trillionth Tonne. The analysis shows that the key driver of temperature change is the total cumulative amount of carbon emissions. To keep below the 2°C global average temperature rise generally regarded as the threshold for preventing dangerous warming, we need to keep total cumulative emissions below a trillion tonnes. And the world is already halfway there.
Which is why the latest news about Canada's carbon emissions is so embarrassing. Canada is now top among the G8 nations for emissions growth. Let's look at the numbers: 747 megatonnes in 2007, up from 592 megatonnes in 1990. Using the figures in the Environment Canada report, I calculated that Canada has emitted over 12 gigatonnes since 1990. That's 12 billion tonnes. So, in 17 years we burnt through more than 1.2% of the entire world's total budget of carbon emissions. A total budget that has to last from the dawn of industrialization to the point at which the whole world becomes carbon-neutral. Oh, and Canada has 0.5% of the world's population.
Disclaimer: I have to check whether the Hadley Centre’s target is 1 trillion tonnes of CO2-equivalent, or 1 trillion tonnes of Carbon (they are different!). The EnvCanada report numbers refer to the former.
Update: I checked with Chris, and as I feared, I got the wrong units – it’s a trillion tonnes of carbon. The conversion factor is about 3.66, so that gives us about 3.66 trillion tonnes of carbon dioxide to play with. [Note: Emissions targets are usually phrased in terms of “Carbon dioxide equivalent”, which is a bit hard to calculate as different greenhouse gases have both different molecular weights and different warming factors].
So my revised figures are that Canada burnt through only about 0.33% of the world's total budget in the last 17 years (the arithmetic is sketched in the script below, after this list). Which looks a little better, until you consider:
- By population, that's 2/3 of Canada's entire share.
- Using the cumulative totals from 1900-2002, plus the figures for the more recent years from the Environment Canada report (and assuming 2008 was similar to 2007), we've emitted 27 gigatonnes of CO2 since 1900. Which is about 0.73% of the world's budget, or about 147% of our fair share per head.
- By population, our fair share of the world’s budget is about 18 gigatonnes CO2 (=5 gigatonnes Carbon). We’d burnt through that by 1997. Everything since then is someone else’s share.
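For anyone who wants to check my sums, here's the arithmetic as a quick script (the budget, conversion factor, and emissions figures are the ones quoted above; everything else follows from them):

```python
BUDGET_C = 1000e9          # total budget: 1 trillion tonnes of carbon
CO2_PER_C = 3.66           # tonnes of CO2 per tonne of carbon
CANADA_POP_SHARE = 0.005   # ~0.5% of world population

since_1990_co2 = 12e9      # tonnes CO2 emitted 1990-2007
since_1900_co2 = 27e9      # tonnes CO2 emitted 1900-2008 (2008 assumed ~2007)

fair_share_c = BUDGET_C * CANADA_POP_SHARE        # 5 Gt carbon
print(fair_share_c * CO2_PER_C / 1e9)             # ~18.3 Gt CO2

print(since_1990_co2 / CO2_PER_C / BUDGET_C)      # ~0.0033 -> 0.33% of budget
print(since_1990_co2 / CO2_PER_C / fair_share_c)  # ~0.66  -> 2/3 of our share
print(since_1900_co2 / CO2_PER_C / fair_share_c)  # ~1.47  -> 147% of our share
```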
Google tells me I’m not the only one blogging from the EGU meeting this week:
- Bente Lilja Bye is blogging about the sessions on Tsunami Risk;
- Dave Petley is blogging about landslides;
- Quirin Schiermeier is blogging on mountain hydrology, in particular the effect of climate change on glaciers in the Alps, and comments on Weigel's talk on seasonal forecasting;
And some others who might blog this week:
- Bryan Lawrence appears to be too busy giving talks to blog much this week (update: he did)
- Robert Huber tells me there is a biodiversity informatics session (found it);
But by and large, Google also seems to be telling me that this community doesn’t blog very much.
A leisurely breakfast this morning, chatting with Tim, so we didn’t make it to the conference until the coffee break.
10:30: From climate predictability to end user applications: on the route to more reliable seasonal ensemble forecasts, by Andreas Weigel, who is also a Young Scientist award winner. Most of the analysis is based on the ECMWF model. He uses probabilistic forecasts for seasonal predictions – e.g. 41 runs with perturbed physics, with probability density functions used to create the forecasts. He uses the RPSS (Ranked Probabilistic Skill Score) to compare model predictions with observations. Interesting point: if you take lots of random models and do ensemble forecasts with them, they approach 0 on this skill scale. This kind of analysis helps to identify bias in the skill score metric, so that the bias can be removed. Multi-model ensembles have been shown to outperform individual models, but this is a bit of a paradox, because the multi-model ensembles include less skillful models. Which implies you can improve your forecasts by adding lower skill models to the ensemble. The answer has to do with reducing overconfidence in the forecasts. Last topic for the talk: how can prediction skill be communicated to the public? Introduce an intuitive skill score that makes sense to the public. Rather than just adding up the accuracy of a series of forecasts, look at two different specific observations, and test whether the forecasts correctly distinguish them (this is known as 2AFC). Then add up the skill as the sum of these tests (okay, I'm not sure I've got my head around how this works – I'll need to read the paper…). Oh, and I like the cartoon on the second slide of this talk.
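For my own benefit, here is a minimal sketch of how the RPSS works as I understand it. This is the standard textbook formulation, not Weigel's code, and the example numbers are invented:

```python
import numpy as np

def rps(forecast_probs, obs_category):
    """Ranked Probability Score for one forecast over ordered categories.
    forecast_probs sums to 1; obs_category is the observed category index.
    Penalizes the squared distance between the cumulative forecast and the
    cumulative (step-function) observation, so near-misses score better
    than distant misses."""
    cum_forecast = np.cumsum(forecast_probs)
    observed = np.zeros(len(forecast_probs))
    observed[obs_category] = 1.0
    cum_observed = np.cumsum(observed)
    return np.sum((cum_forecast - cum_observed) ** 2)

def rpss(forecasts, observations, climatology):
    """Skill relative to a climatological reference: 1 is perfect, 0 is
    no better than climatology, negative is worse than climatology."""
    rps_model = np.mean([rps(f, o) for f, o in zip(forecasts, observations)])
    rps_clim = np.mean([rps(climatology, o) for o in observations])
    return 1.0 - rps_model / rps_clim

# Three categories (below / near / above normal), two forecasts:
forecasts = [np.array([0.6, 0.3, 0.1]), np.array([0.1, 0.2, 0.7])]
observations = [0, 2]                    # both forecasts leaned the right way
climatology = np.array([1/3, 1/3, 1/3])
print(rpss(forecasts, observations, climatology))  # ~0.76, positive skill
```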
11:15: Change of sessions, and I've come in partway through Chris Jones' talk – "Impact of cumulative emissions of carbon dioxide: the trillionth tonne" (Chris is from the UK Met Office, and I had lots of interesting discussions with him last summer). He's talking about modeling experiments to determine what reduction in emissions is needed to stay below the 2°C target. Here's an interesting emergent result from the modeling: peak warming is strongly related to the total cumulative emissions, rather than to the specific pathway (i.e. when the emissions occur). This leads to the observation that you should set a total emissions budget as a policy, without constraining when those emissions occur. Best answer: total emissions should be no more than 1 trillion tonnes of carbon. We're halfway there right now! So over the next 40 years or so, we mustn't emit more than 1/2 trillion tonnes. But the longer we leave it before peak emissions, the more dramatic the cuts after that will have to be. The bottom line is that this analysis greatly simplifies climate negotiations, because it makes the target very clear.
11:30: "Marine oxygen holes as a consequence of oceanic acidification", presented by Matthias Hofmann. It's well known that higher CO2 levels lead to ocean acidification, which reduces the ability of shellfish and coral to grow, because it inhibits calcification. But how quickly does this occur under different emissions scenarios? There is one bit of good news: there's a negative feedback – reduced biogenic calcification has a negative effect on atmospheric CO2. But there are also some other effects that are more worrying: the massive growth of oxygen holes, caused by oxidation of organic matter in shallow water. This has very worrying implications for marine life. (Here's the paper).
11:45: Last talk before lunch: Quantifying DMS-cloud-climate interactions using the ECHAM5-HAMMOZ model. The CLAW hypothesis suggests there is a negative feedback loop between the ocean and atmosphere: warmer oceans enhance the growth of phytoplankton, which leads to increased SO2 and hence more clouds (here's a nice diagram that explains the feedback). Not sure I can summarize the results of the study presented here, except that they showed the effect is seasonal in nature.
Note to self: get to the sessions earlier and find a seat near a power outlet.
Lunch: I managed to visit the exhibition and pick up a couple of books:
- Avoiding Dangerous Climate Change, a collection of papers resulting from a workshop held at the Met Office in 2005.
- A History of the Science and Politics of Climate Change, written by Bert Bolin, the first chair of the IPCC.
13:30: Ray Bates, giving a talk entitled Climate Feedbacks: Some Conceptual and Physical Issues. Ray is receiving the Vilhelm Bjerknes Medal, and this is the lecture associated with the medal. Standing room only (but I got here first and nabbed one of the only power outlets). Ray started off by giving a little retrospective on his career, starting with his PhD with Charney. He likes the idea of being an Irishman studying tropical dynamics!
Here's the key idea: most dynamical systems are characterized by negative feedbacks, which keep the system stable. Climate scientists appear to be an exception – they assume climate systems are subject to positive feedbacks that lead to runaway warming. So scientists outside of climate science are often skeptical. To understand this, you first have to understand what the zero-feedback case is, and then figure out what we mean by positive/negative feedback. Ray presents four different definitions of "feedback": F1 from control theory, F2 from electronics, and then two from climate science – F3, a stability-altering feedback, and F4, a sensitivity-altering feedback. Ray then points out that any pair of these can give opposite signs when applied in a particular way to the same system. He then goes on to give several more definitions of different types of feedback in the climate literature. (Here's the paper.) Bottom line: there's an urgent need for a common definition (or set of definitions), so that readers of the climate literature know what we're talking about.
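To make the zero-feedback baseline concrete, here is the sensitivity-altering (F4-style) convention as it is usually presented in textbooks. This is my own illustration, not something from Ray's talk, and the numbers are the commonly quoted approximate values:

```python
def equilibrium_warming(delta_t0, f):
    """delta_t0: the zero-feedback (Planck) response to a forcing, in C.
    f: the net feedback factor. In this convention, 0 < f < 1 amplifies
    the response but does not run away; f >= 1 would be a true runaway,
    which is not what climate models predict."""
    if f >= 1.0:
        raise ValueError("f >= 1 implies a runaway response")
    return delta_t0 / (1.0 - f)

# A zero-feedback response of roughly 1.2 C for doubled CO2, amplified by
# a net positive feedback factor of about 0.6, gives roughly 3 C:
print(equilibrium_warming(1.2, 0.6))  # -> 3.0
```

This is part of why the terminology confuses outsiders: in this convention, "positive feedback" just means f > 0 (amplification), not the runaway instability that the control-theory definition would suggest.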
Ray then gives a long account of Lindzen's BAMS paper on cloud feedback effects – the paper that leads Lindzen to argue that climate scientists are being alarmist about global warming, because their model (the LCH model) gives a much lower figure for climate sensitivity. There are several problems with the LCH model: e.g. it doesn't include explicit heat transport between the tropics and extra-tropics. Adding these in explicitly gives a very different set of dynamics. With an extended LCH model (with these heat transports) it's possible to choose parameters that give the opposite feedback effects than when those same parameters are used in the original LCH model. (Alright, this is a gross simplification of the analysis…) Bottom line: unless we're much clearer about what we mean by feedback, a lot of the confusion will remain.
14:15: Martin Claussen, giving a talk entitled Is the Sahara a Tipping Element? This work looked at periods in prehistory when the Sahara region was green – covered with grassland. From both the models and the marine sediment cores, it appears that the Sahara flips readily between a 'green' state and the desert state, and it only takes a small increase in rainfall to reach this tipping point. As the general circulation models suggest such an increase in rainfall as a result of global warming, it's possible that the Sahara could change dramatically in the coming decades. However, it's not clear whether it's a single tipping point, or multiple swings (e.g. different swings for the Eastern vs. Western Sahara). Here's a summary of the work.
14:30: Peter Brockhaus, giving a talk on soil-moisture feedback effects. Here’s another dilemma about feedbacks. Two different runs of a model (the CCLM) at different resolutions (2.2km and 25km) give soil-moisture feedback effects that are opposite in sign. (Here’s the paper)
14:45: Hezi Gildor, on lightning-biota feedback effects. This one is fascinating: increased temperature leads to increased incidence of lightning, which generates nitrogen compounds that stimulate plant growth. It also makes the grass greener! The analysis indicates this feedback effect is small, but not necessarily insignificant, so it might need to be investigated in earth system models. [My thought: this raises the question of how many of these different feedback effects we need to track down and incorporate into the general models. Each new effect that we add increases the complexity of the model, and increases the complexity of the coupling…] Now it gets complicated: one of the questioners points out that lightning also causes forest fires, which burn vegetation (in the short term) but also stimulate more forest growth (in the long term). More feedback effects to account for!
Time for a break, and some ice cream in the hot Austrian sun.
15:30: Larry Hinzman, talking about Hydrological Changes in the Polar Regions: An Analysis of Linkages and Feedbacks. It's already getting noticeably drier in many polar regions (many lakes are shrinking), but as the permafrost melts, the ground generally subsides, which significantly increases groundwater and makes these regions wetter. The connections between the different processes here are complex, and Larry indicated they are making progress on sorting them out and quantifying them. [He mentioned a new paper (in submission) that has some nice graphics indicating the linkages.] I did find this recent paper, which summarizes many of the changes in Arctic hydrology that have already been observed.
16:45: Emma Stone: Could vegetation feedbacks determine whether the Greenland ice sheet regrows after deglaciation? This is a long-term question – if we lose the Greenland ice sheet, will it eventually re-grow once greenhouse gas concentrations stabilize? Two previous studies offer conflicting answers: Lunt's work suggested it might regrow in 20,000 years, while Toniazzo's study indicated that it would not happen at all. Emma is running a series of experiments using HadCM3 from the UK Met Office to investigate. She initializes the model with bare soil (for one treatment) and needle-leaf vegetation (for another treatment), tested under a return to pre-industrial CO2 concentrations. She found that in some runs, some glaciation reappears on Greenland's eastern coast, but it depends on the assumptions about vegetation. In other words, vegetation feedback effects are critical for answering the question. Of course, this all pre-supposes that we ever do return to pre-industrial CO2 concentrations…
Okay, here’s my first attempt to do liveblogging from the EGU General Assembly in Vienna. I’ve arrived and registered, but now I face my first problem: the sheer scale of the thing. The printed program runs to 140 pages, and it doesn’t even tell you the titles and authors of any of the papers – it mainly consists of table after table mapping session titles to rooms. Yikes!
And the next problem is that the wireless internet is swamped as 10,000 geoscientists all try to check their email at once. There's a neat tool on the conference website that allows you to click on any talk or session while you're browsing the online program, to add it to your personal program. But as I can't get the website to load right now, I can't see what I put on my personal program. Ho hum. Ah, it's loaded.
Looks like I got here too late for Stefan Rahmstorf's review of the stability of the ocean circulation under climate change scenarios. I wanted to see it because Stefan has been consistently warning of higher sea level rise than the numbers in the IPCC reports.
Okay, the next interesting paper is in the Earth Systems Informatics session, entitled "Semantic metadata application for information resources systematization in water spectroscopy". Let's see if I can find the room before the talk is over…
14:56: Found the room, but talk is nearly over. Next up is more promising anyway: Peter Fox, talking about Semantic Provenance Management for Large Scale Scientific Datasets…
15:06: Peter Fox is up. I should point out that this is the last talk in the session (which is on semantic interoperability, knowledge and ontologies). He's showing some data flow analysis of the current way in which scientific images get passed around – processed images (e.g. gifs) get forwarded without their metadata. He characterizes the problem as one of combining information from two different streams: the raw images from the instruments (which then get processed in various ways), and comments from observers (both humans and tools) who are adding information about the images. Another use case: two different graphs from the same data – a daily time series and a monthly time series (for Aerosol Optical Thickness) – but the two averagings are produced by different tools, and one tool inverts the Y axis. The scientist working with the two graphs spent two days trying to figure out why the two series seemed to contradict each other! The key idea in this work seems to be the use of a semantic markup language, PML, for capturing additional information on datasets.
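To illustrate the underlying idea, here's a toy reconstruction of my own (hypothetical field names; PML itself is an RDF-based markup language, not a Python dict): keep the processing lineage and the observers' commentary attached to the derived product, so they can't get separated from the image as it is passed around.

```python
# Toy provenance record for a derived image (all field names hypothetical).
derived_image = {
    "product": "aot_monthly_mean.gif",
    "derived_from": "aot_daily_timeseries.nc",
    "processing": [
        {"step": "monthly_average", "tool": "averaging_tool"},
        {"step": "render_plot", "tool": "plot_tool", "y_axis": "inverted"},
    ],
    "comments": [
        {"author": "observer1",
         "note": "y axis inverted relative to the daily plot"},
    ],
}

# A recipient can now see at a glance why this graph appears to contradict
# the daily series, instead of puzzling over it for two days.
for step in derived_image["processing"]:
    print(step)
for comment in derived_image["comments"]:
    print(comment["note"])
```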
15:30: Next session is on International Informatics Collaborations, and the first speaker is Bryan Lawrence (who, incidentally, I've been meaning to get in contact with since he commented on Jon's blog back in December about our brainstorming sessions). He's talking about the European Contribution to a Global Solution For Access to Climate Model Simulations. Much of the data is available at WDCC, the World Data Centre for Climate. From this summer, they are expecting to collect a petabyte of data per year. Main task right now: CMIP5, the collection of model runs ready for comparison and analysis for the next IPCC assessment. Overall impression from Bryan's talk: lots of technology problems just shipping large streams of data around, authentication, and not enough bandwidth! Oh, and there's a new European Framework 7 project just starting up, IS-ENES, which is supposed to increase collaboration around earth system modeling tools and frameworks.
15:45: Talk on EuroGEOSS, another big European project, just about to kick off next month. It’s hard to get my head around these big interoperability projects: to the outsider, it’s hard to tell what the project will actually focus on.
16:00: I like the title of the next one: Herding Cats: The Challenges of Collaboratively Defining the Geology of Europe, presented by Kristine Asch. Here’s her motivation: Rich data exists, it’s critical for society, but it’s hard to find, access, share, understand. There is an EU directive on this, called INSPIRE: Infrastructure for Spatial Information in Europe. Looks like the directive involves lots of working groups. Kristine is talking about OneGeology, a project to create an interoperable geological map of Europe. The challenge is that everyone currently maps their geological survey data in different (incompatible) ways. Here’s an example of how deep the problem goes: 20 people sitting around a table trying to reach agreement on what terms like “bedrock” and “surface geology” mean. Lots of cultural challenges: 27 countries (each with their own geological survey organisation), 27 different standards, national pride, and everyone speaks different variants of English.
Question Session: here's a good question – have they thought of bringing in any social scientists or anthropologists to study the communities involved in this, to ensure we learn appropriate lessons from the experience? Okay, I'm off on a tangent now, because this question reminded me of an old colleague, Susan Leigh Star, and I just discovered she has a book out that I somehow missed: Sorting Things Out: Classification and Its Consequences.
16:30: A talk on Siberia. Actually, on the Siberian Integrated Regional Study, important because of the role of melting permafrost in Siberia and its effect as a climate feedback. The disappointing thing about this talk is that he’s talking a lot about creating web portals, and not much about the real challenges.
17:30: Okay, change of scenery. There's a session on Education, Computational Methods and Complex Systems in Nonlinear Processes in Geophysics. The first talk, by Jeffrey Johnson (prof of complexity science), is about eToile, a Social Intelligent ICT-System for very large scale education in complex systems. Jeff was instrumental a few years ago in setting up the Complex Systems Society. The idea behind eToile (see also the related project, Assyst) is to change how we design a curriculum. Normally: define the curriculum & learning outcomes, create course materials, deliver them, examine the students, mark the exam scripts, and pass/fail/grade the students. This is enormously expensive, especially the bits that are proportional to the number of students (e.g. marking is linear in the number of students). You can set up a web portal to provide curriculum and assessment, with semi-automated marking. But how do we provide appropriate learning resources for very large numbers of students? Solution: create a "resource ecology", initially populated with some junk URLs. When students submit work, they also have to submit the web resources they used; these become part of the ecology, and links that many students find useful rise in the ecology. Also, stuff that gets outdated falls in value as fewer students use it; if the curriculum changes, you get the same effect. Jeff argues that this is a much cheaper, more scalable way of providing education. In the question session, he clarified that it's only intended for grad students (who are sufficiently capable of self-study), and probably wouldn't work for undergrads. Reminds me of some of the experiments we've played with on my grad courses, with the students contributing web resources to our growing collection.
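Here's a minimal sketch of how I imagine the resource ecology working (my own reconstruction, not the eToile implementation; the decay constant and scoring rule are invented): resources gain value whenever a student cites them in submitted work, and decay otherwise, so junk and outdated material sinks.

```python
DECAY = 0.95   # applied each assessment round; stale resources sink

ecology = {"http://example.org/intro": 1.0,
           "http://example.org/junk": 1.0}   # seeded, junk included

def assessment_round(ecology, submissions):
    """submissions: one list of cited URLs per student submission."""
    for url in ecology:
        ecology[url] *= DECAY                      # everything decays...
    for cited in submissions:
        for url in cited:                          # ...useful links rise
            ecology[url] = ecology.get(url, 1.0) + 1.0
    return sorted(ecology, key=ecology.get, reverse=True)

ranking = assessment_round(ecology, [["http://example.org/intro"],
                                     ["http://example.org/intro"]])
print(ranking)  # the cited resource rises; the junk URL sinks
```

The appeal is that this kind of curation costs almost no staff time, so it scales with student numbers in a way that manually maintained reading lists don't.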
17:45: Next up: a talk on education around flood management. Big paradigm change: from fighting floods, where engineers are the dominant stakeholder, to "living with floods", where it becomes much more of a shared issue among many different kinds of stakeholder. They set up displays, with lots of cylinders of water, to help people visualize different flood levels. Lots of hands-on learning to get stakeholders to take ownership of the problem.
Okay, jetlag kicking in seriously. Off to get some air…
Had an interesting conversation this afternoon with Brad Bass. Brad is a prof in the Centre for Environment at U of T, and was one of the pioneers of the use of models to explore adaptations to climate change. His agent based simulations explore how systems react to environmental change, e.g. exploring population balance among animals, insects, the growth of vector-borne diseases, and even entire cities. One of his models is Cobweb, an open-source platform for agent-based simulations.
He’s also involved in the Canadian Climate Change Scenarios Network, which takes outputs from the major climate simulation models around the world, and extracts information on the regional effects on Canada, particularly relevant for scientists who want to know about variability and extremes on a regional scale.
We also talked a lot about educating kids, and kicked around some ideas for how you could give kids simplified simulation models to play with (along the lines that Jon was exploring as a possible project), to get them doing hands-on experimentation with the effects of climate change. We might get one of our summer students to explore this idea, and Brad has promised to come talk to them in May once they start with us.
Oh, and Brad is also an expert on green roofs, and will be demonstrating them to grade 5 kids at the Kids World of Energy Festival.
As my son (grade 4) has started a module at school on climate and global change, I thought I’d look into books on climate change for kids. Here’s what I have for them at the moment:
Weird Weather by Kate Evans. This is the kids' favourite at the moment, probably because of its comicbook format. The narrative works well: it involves the interplay between three characters – a businessman (playing the role of a denier), a scientist (who shows us the evidence), and an idealistic teenager, who gets increasingly frustrated that the businessman won't listen.
The Down-to-Earth Guide to Global Warming, by Laurie David and Cambria Gordon. Visually very appealing, lots of interesting factoids for the kids, and particular attention to the kinds of questions kids like to ask (e.g. to do with methane from cow farts).
How We Know What We Know About Our Changing Climate by Lynne Cherry and Gary Braasch. Beautiful book (fabulous photos!), mainly focusing on sources of evidence (ice cores, tree rings, etc), and how they were discovered. Really encourages the kids to do hands on data collection. Oh, and there’s a teacher’s guide as well, which I haven’t looked at yet.
Global Warming for Dummies by Elizabeth May and Zoe Caron. Just what we’d expect from a “Dummies Guide…” book. I bought it because I was on my way to a bookstore on April 1, when I heard an interview on the CBC with Elizabeth May (leader of the Canadian Green Party) talking about how they were planning to reduce the carbon footprint of their next election campaign, by hitchhiking all over Canada. My first reaction was incredulity, but then I remembered the date, and giggled uncontrollably all the way into the bookstore. So I just had to buy the book.
First, to get this out of the way, the latest ad from Cato has been thoroughly debunked by RealClimate, including a critical look at whether the papers that Cato cites offer any support for Cato’s position (hint: they don’t), and a quick tour through related literature. So I won’t waste my time repeating their analysis.
The Cato folks attempted to answer back, but their response is largely a series of red herrings. However, one point from this article jumped out at me:
“The fact that a scientist does not undertake original research on subject x does not have any bearing on whether that scientist can intelligently assess the scientific evidence forwarded in a debate on subject x”.
The thrust of this argument is an attempt to bury the idea of expertise, so that the opinions of the Cato institute’s miscellaneous collection of people with PhDs can somehow be equated with those of actual experts. Now, of course it is true that a (good) scientist in another field ought to be able to understand the basics of climate science, and know how to judge the quality of the research, the methods used, and the strength of the evidence, at least at some level. But unfortunately, real expertise requires a great deal of time and effort to acquire, no matter how smart you are.
If you want to publish in a field, you have to submit yourself to the peer-review process. The process is not perfect (incorrect results often do get published, and, on occasion, fabricated results too). But one thing it does do very well is to check whether authors are keeping up to date with the literature. That means that anyone who regularly publishes in good quality journals has to keep up to date with all the latest evidence. They cannot cherry pick.
Those who don’t publish in a particular field (either because they work in an unrelated field, or because they’re not active scientists at all) don’t have this obligation. Which means when they form opinions on a field other than their own, they are likely to be based on a very patchy reading of the field, and mixed up with a lot of personal preconceptions. They can cherry pick. Unfortunately, the more respected the scientist, the worse the problem. The most venerated (e.g. prize winners) enter a world in which so many people stroke their egos, they lose touch with the boundaries of their ignorance. I know this first hand, because some members of my own department have fallen into this trap: they allow their brilliance in one field to fool them into thinking they know a lot about other fields.
Hence, given two scientists who disagree with one another, it’s a useful rule of thumb to trust the one who is publishing regularly on the topic. More importantly, if there are thousands of scientists publishing regularly in a particular field and not one of them supports a particular statement about that field, you can be damn sure it’s wrong. Which is why the IPCC reviews of the literature are right, and Cato’s adverts are bullshit.
Disclaimer: I don’t publish in the climate science literature either (it’s not my field). I’ve spent enough time hanging out with climate scientists to have a good feel for the science, but I’ll also get it wrong occasionally. If in doubt, check with a real expert.