8:30: This morning I’m at a session on education. Just my luck – I got here towards the end of the first talk, which looked very interesting. It was Zoe Robinson, from Keele, giving a talk entitled Effective and responsible teaching of climate change in Earth Science-related disciplines. She talked about many of the dilemmas faced in teaching the earth sciences – e.g. student attitudes such as climate skepticism and feelings of saturation coverage, coupled with flawed knowledge. At the same time, accurate coverage of the complexity of earth systems and the areas of uncertainty can undermine the sense of urgency and responsibility appropriate for dealing with the challenges of climate change. Zoe talked about non-traditional techniques to use in the classroom to deal with these issues (but I missed most of it!)
8:45: Ilona Pojtok-Tari, Geography Netquipment (project webpage)
9:00: Marina Manea on Online Geodynamics. Talked about how they teach computational geodynamics using various online modeling tools. The animations of convection in the earth’s mantle look pretty impressive, although the server is rather slow.
The next two talks picked up on a common theme: how increased time pressure and reduced classroom time are eroding competence in students. Both talked about using other forms of instruction to combat this.
9:15: Stefan Winkler, talking about Geographic-didactical games as interactive tools. They noticed an emerging problem with lack of basic knowledge in physical geography, caused by a reduction in the time available for practice exercises, and a change in the timetable that separated lecture material from practical classes. So they started experimenting with various forms of “edutainment” in evening sessions, or during fieldtrips and weekend seminars. These include quizzes and a memory game. The students liked the games, and they clearly improved both motivation and knowledge.
9:30: Michael Mayer, talking about Using competence-based and project-related approaches to support students individually. He’s especially concerned about competencies such as critical thinking, work organization, teamworking, and so on. Aha! His pedagogical approach is based on constructivism. The students have to develop personal portfolios, and get regular email feedback on their individual learning style and selection of topics (plus regular grading to provide extrinsic motivation). Of course, it’s very time-consuming for the teacher. The seminars are based on development of mindmaps.
9:45: Varyl Thorndycraft, on use of Google Earth and virtual fieldwork in teaching physical geography. There’s plenty of evidence on the value of fieldwork in teaching physical geography, but limitations (cost, safety, time, etc.) on how much is possible. Varyl suggests that virtual fieldwork can supplement (but not replace) real fieldwork. Google Earth provides the ability to measure distance and elevation, to visualize spatial scales by zooming, and to visualize landscapes via the tilt. But some of the height measurements are inaccurate, and there’s no control over when images were taken (and they may change from time to time, which can affect planning for virtual fieldwork). He demonstrated some uses of Google Earth to cover a wide range of topics where real fieldwork would not be possible.
After some coffee (and a complaint from the person sitting next to me during that session that my keyboard is too loud!), we have this:
10:30: Thomas Stocker, from the University of Bern, and co-chair of the IPCC working group I, receiving the Hans Oeschger Medal. His talk picks up on the idea of the earth being “twitchy” that I mentioned yesterday. It’s entitled From Abrupt Change to the Future. Hmm – big audience for this one. Good job I got here early, and that my battery is fully charged, as it’s that room with no power outlets again. Okay, here we go. First a little retrospective on the work of Hans Oeschger, from his early work on ice cores to early models of CO2 exchange in the atmosphere. In 1978, long before Kyoto, Hans was publishing analyses of the stabilization of CO2, showing that CO2 emissions needed to peak shortly after the year 2000, and then drop dramatically, in order to stabilize concentrations in the atmosphere. The essential knowledge was available back in 1978 about what the challenge was and what we needed to do, and (looking at the Keeling curve) there’s been no progress since then on addressing the challenge. In 1984, from Greenland ice cores, they demonstrated that there was a consistent signal of abrupt climate changes from different locations.
In 1984, Broecker developed a famous visualization of the ocean conveyor belt. The key idea is that the North Atlantic cools the southern hemisphere. Through the 1990s there was a debate, because the paleoclimatic record did not support the theory that the Northern and Southern hemispheres were linked in this way. Then in 2003, after reanalysis of the data, the connection between Greenland and Antarctic warming became clear, and a new, very simple model of these processes was developed. This led to a much better understanding of abrupt climate change. Lessons from this story: be bold (but not too often), be persistent, and be open and critical – sometimes you have to re-think the models, and sometimes you have to re-analyze the data to make sense of (apparently) conflicting evidence.
In 1987, Broecker (again) published a paper in Nature on “unpleasant surprises in the greenhouse“. This led in the 1990s to an analysis of thresholds in the more complex 3D climate models. But even in 2007, the models are all over the place in analyzing these thresholds. So how do we know what kinds of abrupt changes are in store, and whether they are irreversible? For example, plenty of people think we have already passed a threshold for arctic sea ice melting. Many regions of the world are likely to experience irreversible drought (from Solomon et al 2009).
So, to finish, some open questions: where is the geographical and physical origin of abrupt climate change in the past? What are the triggers? What are the rates of change? What are the impacts of these changes on the water cycle, etc.? Are there thresholds in the climate system, and which findings in this space are robust? (If we’re too bold here it will hurt our credibility.) What controls potential instabilities in the ice sheets?
11:55: Change of sessions again, to the one on Climate Prediction: Models, Diagnostics, and Uncertainty Analysis. I missed this morning’s session on model diagnostics, but arrived just in time to catch Hermann Held talking about the Effects of climate uncertainties on welfare optimal investment streams into mitigation technologies. The problem: society has to invest in mitigation strategies, but under uncertain conditions. This makes the economic assessment of welfare optimization complicated. Economists tend to ask what it would cost to constrain global temperature rise to 2°C, and discount the future costs, under the assumption that mitigation strategies impact economic growth by constraining energy use. Uncertainty over climate sensitivity has a big impact on the economically optimal emissions path, which means you can generate almost any answer to the economics question by choosing a different figure for sensitivity. The study design is interesting – here’s the paper.
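To see why the choice of sensitivity matters so much, here’s some back-of-the-envelope arithmetic of my own (a toy sketch, not Held’s model), using the standard approximation that equilibrium warming is roughly S × log2(C/C0):

```python
# Toy arithmetic (not Held's model): the CO2 concentration ceiling
# consistent with a 2°C target, assuming equilibrium warming
# = S * log2(C / C0). The ceiling swings widely with the assumed S.
C0 = 280.0  # pre-industrial CO2 concentration (ppm)

for S in [2.0, 3.0, 4.5]:  # assumed climate sensitivity (°C per CO2 doubling)
    ceiling = C0 * 2 ** (2.0 / S)  # solve 2.0 = S * log2(C / C0) for C
    print(f"sensitivity {S:.1f} °C/doubling -> ceiling ~{ceiling:.0f} ppm")
```

A lower ceiling means deeper (and costlier) cuts, so the “optimal” investment stream shifts dramatically with the sensitivity figure you pick.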
13:30: After lunch, the topic of the session switches to uncertainty in projections and impacts. First up: Ronald Prinn, from MIT, talking on Probabilistic Forecast for 21st Century Climate Based on Uncertainties in Emissions. Based on an interesting model (IGSM) that couples climate simulation with economics, and performs uncertainty analysis on the economic aspects, especially the growth of emissions under different climate policies. Lots of interesting probability analyses – too much for me to summarize, but here’s the paper. Oh, and he showed the “wheel of fortune” I mentioned in my last post.
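The general recipe, in miniature (my own reconstruction of the idea; the actual IGSM is a full coupled model, not this toy), is Monte Carlo: sample the uncertain emissions parameters, push each sample through the model, and read probabilities off the resulting distribution:

```python
# Miniature Monte Carlo sketch of the approach (not the MIT IGSM):
# sample an uncertain emissions growth rate, push each sample through
# a deliberately trivial climate response, and summarize the spread.
import random

random.seed(1)
N = 10000
outcomes = []
for _ in range(N):
    growth = random.gauss(0.015, 0.005)  # uncertain annual emissions growth
    cumulative = sum((1 + growth) ** year for year in range(90))
    outcomes.append(0.004 * cumulative)  # made-up linear warming response

outcomes.sort()
print(f"median: {outcomes[N // 2]:.2f} (toy warming index)")
print(f"5-95% range: {outcomes[N // 20]:.2f} to {outcomes[-N // 20]:.2f}")
```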
13:45: Andrei Sokolov, also from MIT, talking on Sensitivity of climate change projections to uncertainties in the estimates of observed changes in deep-ocean heat content. Too much detail in this talk for me to follow…
14:00: Ana Lopez, from the London School of Economics. From climate model ensembles to climate change impacts. The Environment Agency in the south-west of England is interested in exploring the added value of ensembles of climate models in their everyday decision making, especially around water resource management. They used a perturbed physics ensemble from climateprediction.net (which is based on HadCM3), and a 21-model ensemble from CMIP3. They need to do some downscaling to get useful results to input to an impact model, to make decisions about adaptation. (Here’s an early paper from this work.)
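I don’t know which downscaling method they used, but the simplest version of the idea (a “delta change” sketch of my own, with made-up numbers) is just to perturb a local observed baseline by each ensemble member’s simulated change, giving an ensemble of local inputs for the impact model:

```python
# "Delta change" downscaling in miniature (illustrative only, and not
# necessarily the method Lopez's group used): scale a local observed
# baseline by each ensemble member's simulated fractional change.
observed_summer_rain = 180.0  # mm; hypothetical local baseline

# hypothetical fractional rainfall changes from 21 ensemble members
member_deltas = [0.95, 0.88, 1.02, 0.79, 0.91, 0.85, 0.97, 0.83,
                 0.90, 0.76, 0.99, 0.87, 0.93, 0.81, 0.89, 1.01,
                 0.84, 0.92, 0.78, 0.86, 0.94]

local_scenarios = [observed_summer_rain * d for d in member_deltas]
print(f"downscaled range: {min(local_scenarios):.0f} to {max(local_scenarios):.0f} mm")
```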
14:20: Next up is Stefan Fronzek’s talk on Probabilistic projections of climate change effects on sub-arctic palsa mires, in which I learnt that palsa mires are permanently frozen peat hummocks, which are sensitive to climate change, and if they melt are a major source of methane. Again, using perturbed physics ensembles from the Hadley Centre, the idea is to get probability density functions from the models to use for impact assessment. Sample result: risk of total loss of palsa mires estimated to be 80% by 2100 under scenario A1B.
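Mechanically, the risk estimate is just a threshold count over the ensemble – something like this sketch (my reconstruction of the general idea, with made-up numbers chosen to reproduce the 80% headline figure):

```python
# Sketch of turning an ensemble into a risk figure (not Fronzek's code):
# the "risk of total loss" is the fraction of ensemble members whose
# projected palsa area falls to zero. The numbers here are invented.
projected_area_2100 = [0.0, 0.0, 12.5, 0.0, 0.0, 3.1, 0.0, 0.0, 0.0, 0.0]  # km^2

risk = sum(1 for a in projected_area_2100 if a == 0.0) / len(projected_area_2100)
print(f"risk of total loss: {risk:.0%}")  # -> 80% with these numbers
```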
14:35: Roberto Ferrise, talking on Climate Change and Projected Impacts in Agriculture: an Example on Mediterranean Crops. Starts by setting risk thresholds for minimum crop yield. Then takes climate change probability functions (again, using the Hadley Centre’s joint probabilities), and uses these as input to a crop growth model. Here, you have to take into account the effects of increased CO2 (which stimulates crop growth), decreased precipitation (drought!) and increased temperature (both of which inhibit growth). The impact is very different on different crop types, and different regions across southern Europe, and is also different over time, depending on which of the various factors dominate. For example, for durum wheat, we see increased yield for the next few decades, followed by loss of almost all productivity by mid-century. For grapevines, the risk of lower productivity drops everywhere except in northern Italy and northern Greece, where the risk continually rises. One of the questioners asked about uncertainty in the crop models themselves, as opposed to uncertainty in the climate. This is an interesting question, because the analysis of uncertainty in climate models looks like a far more mature field than the analysis of uncertainty in other types of model.
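To see how the factors can trade off over time, here’s a toy yield function of my own (the coefficients are invented; the real crop growth model is far more detailed):

```python
# Toy illustration (invented coefficients, not the actual crop model):
# CO2 fertilization boosts yield while heat and drought depress it, and
# which effect dominates shifts as the climate signal grows.
def relative_yield(co2_ppm, warming_c, rain_fraction):
    co2_boost = 1 + 0.001 * (co2_ppm - 380)  # fertilization effect
    heat_penalty = 1 - 0.05 * warming_c      # temperature stress
    water_penalty = rain_fraction ** 0.5     # drought response
    return co2_boost * heat_penalty * water_penalty

print(relative_yield(450, 0.8, 1.00))  # next few decades: CO2 boost wins (>1)
print(relative_yield(600, 3.0, 0.70))  # later decades: heat and drought win (<1)
```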
14:50: Last talk in this session is by Nathan Urban on Probabilistic hindcasts and projections of the coupled climate, carbon cycle, and Atlantic meridional overturning circulation systems. He uses some simple box models (of the MOC, climate, and carbon cycle), coupled together, to assess the probability of experiencing an MOC collapse within the next 200 years vs. triggering a collapse (i.e. setting up conditions such that a collapse will occur by 2300). The good news is that the probability of experiencing a collapse within the next 100 years remains less than 10% (consistent with the IPCC), but it rises rapidly after 2100. However, the probability of triggering (committing to) a collapse is about 25% in 2100 and 75% by 2200. Of course, the results depend on the emissions scenario used, and this is only a relatively simple climate model. (Here’s the paper.)
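For a feel for how a box model can produce an abrupt collapse, here’s a minimal Stommel-style two-box sketch (mine, not Urban’s model): the overturning is driven by the temperature contrast between the boxes and opposed by the salinity contrast, and freshwater forcing erodes the salinity contrast that the flow would otherwise export:

```python
# Minimal Stommel-style two-box sketch (illustrative only; Urban's
# coupled box models are more elaborate). The temperature contrast is
# held fixed; freshwater forcing weakens the salinity contrast, and
# past a threshold the circulation flips to a collapsed state.
def equilibrium_overturning(freshwater, steps=100000, dt=0.001):
    dT, dS = 1.0, 0.0            # start from a strong, thermally driven state
    for _ in range(steps):
        q = dT - dS              # overturning: thermal minus haline driving
        dS += dt * (freshwater - abs(q) * dS)
    return dT - dS

for F in [0.05, 0.20, 0.30]:
    print(f"freshwater forcing {F:.2f}: overturning {equilibrium_overturning(F):+.2f}")
```

Below the threshold the flow settles into a strong positive state; just above it, the equilibrium flips sign – a small change in forcing, a qualitatively different circulation.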
A quick chat with the MIT folks in the coffee break to understand their development processes, especially how they cope with such a multidisciplinary team contributing to the models. The key to success (just as at Hadley) appears to be to get everyone in the same building. They do have one additional challenge, which is a collaboration with the folks at Woods Hole, but as that’s only 60 miles away, the Woods Hole folks make it up to weekly coordination meetings at MIT.
15:30: Okay, downstairs, to the session on Information and services technologies for Earth and Space Sciences. First speaker: Bryan Lawrence, of BADC, talking about the MOLES project (Metadata Objects for Linking the Environmental Sciences). He started with a quick history of the NERC Data Grid (which includes lots of cute acronyms: COWS, MILK, MOLES). The core goal of MOLES is to support users throughout the process, where they start by trying to find data, browse if they don’t know what they’re looking for, and then manipulate the data they find in different ways. Characterize the different types of metadata involved. (Quote: “we would never try to integrate discipline-specific metadata”.) Teleporting & orienteering – jump to something that’s close to the kind of data you want, and then move around in the local space to find it. Define the metadata that’s common across disciplines (he walked us through some example schemas). Using a GML application schema is clearly not a sufficient condition for interoperability, and only maybe a necessary one, but there’s plenty of evidence that it helps. (Lots more, but he talks faster than I can type… phew.)
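The flavour of that last point, as I understood it (a hypothetical sketch of mine, not the actual MOLES schema): factor out a small, discipline-neutral discovery record, and carry the discipline-specific detail alongside as an opaque payload:

```python
# Hypothetical sketch of discipline-neutral discovery metadata (not the
# actual MOLES schema): a small common core for finding and browsing,
# with discipline-specific detail carried as an uninterpreted payload.
from dataclasses import dataclass

@dataclass
class DiscoveryRecord:
    title: str
    bounding_box: tuple      # (west, south, east, north), degrees
    time_range: tuple        # (start, end), ISO 8601 strings
    parameter: str           # what was measured or simulated
    access_url: str          # where to fetch or browse the data
    payload: dict            # discipline-specific metadata, uninterpreted

record = DiscoveryRecord(
    title="Example ocean temperature profiles",
    bounding_box=(-30.0, 40.0, 0.0, 60.0),
    time_range=("1990-01-01", "2000-12-31"),
    parameter="sea_water_temperature",
    access_url="http://example.org/data/123",
    payload={"instrument": "CTD"},
)
print(record.parameter)
```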
15:45: John Laxton, talking about GeoSciML v2: an interchange and mark-up language for geologic information. GeoSciML is a GML application schema for geology information. As Bryan said, GML doesn’t give you everything: to genuinely exchange information you need to look at the semantics. V2.0 was released at the AGU fall meeting in 2008. He’s walking through some example schemas, e.g. for Geologic Feature, Geologic Unit, and Geologic Structure (and here’s a paper). Oh, and I forgot that this supports the OneGeology project I mentioned the other day.
16:00: The next few papers were also on various markup languages: Marc Löwner talked about GeoGML, including the challenges in developing the UML models for geological morphology, capturing both structural and process relationships between concepts. Ilya Zaslavsky talked about the CUAHSI Water Markup Language (WaterML); and Todd King talked about the SPASE Data Model.