This week I’ve been attending a workshop at NCAR in Boulder, to provide input to the US National Academies committee on “A National Strategy for Advancing Climate Modeling”. The overall goal is to:

“…develop a strategy for improving the nation’s capability to accurately simulate climate and related Earth system changes on decadal to centennial timescales. The committee’s report is envisioned as a high level analysis, providing a strategic framework to guide progress in the nation’s climate modeling enterprise over the next 10-20 years”

The workshop has been fascinating, addressing many of the issues I encountered on my visits to various modelling labs last year – how labs are organised, what goes into the models, how they are tested and validated, etc. I now have a stack of notes from the talks and discussions, so I’ll divide this into several blog posts, corresponding to the main themes of the workshop.

The full agenda is available at the NRC website, and they’ll be posting meeting summaries soon.

The first session of the workshop kicked off with the big picture questions: what earth system processes should (and shouldn’t) be included in future earth system models, and what relationship should there be with the models used for weather forecasting? To get the discussion started, we had two talks, from Andrew Weaver (University of Victoria) and Tim Palmer (U Oxford and ECMWF).

Andrew Weaver’s talk drew on lessons learned from climate modelling in Canada to argue that we need flexibility to build different types of model for different kinds of questions. Central to his talk was a typology of modelling needs, based on the kinds of questions that need addressing:

  • Curiosity-driven research (mainly done in the universities). For example:
    • Paleo-climate (e.g. what are the effects of a Heinrich event on African climate, and how does this help our understanding of the migration of early humans from Africa?)
  • Polar climates (e.g. what is the future of the Greenland ice sheet?)
  • Policy and decision making questions (mainly done by the government agencies). These can be separated into:
    • Mitigation questions (e.g. how much can we emit and still stabilize at various temperature levels?)
    • Adaptation questions (e.g. infrastructure planning over things like water supplies, energy supply and demand)

These diverse types of question place different challenges on climate modelling. From the policy-making point of view, there is a need for higher resolution and downscaling for regional assessments, along with challenges of modelling sub-grid scale processes (Dave Randall talked more about this in a later session). On the other hand, for paleo-climate studies, we need to bring additional things into the models, such as representing isotopes, ecosystems, and human activity (e.g. the effect on climate of the switch from hunter-gatherer to agrarian society).

This means we need a range of different models for different purposes. Andrew argues that we should design a model to respond to a specific research question, rather than having one big model that we apply to any problem. He made a strong case for a hierarchy of models, describing his work with EMICs like the UVic model, which uses a simple energy/moisture balance model for the atmosphere, coupled with components from other labs for sea ice, ocean, vegetation, etc. He raised a chuckle when he pointed out that neither EMICs nor GCMs are able to get the Greenland ice sheet right, and therefore EMICs are superior because they get the wrong answer faster. There is a serious point here – for some types of question, all models are poor approximations, and hence their role is to probe what we don’t know, rather than to accurately simulate what we do know.

Some of the problems faced by the current generation of models will never go away: clouds and aerosols, ocean mixing, precipitation. And there are some paths we don’t want to take, particularly the idea of coupling general equilibrium economic models with earth system models. The latter may be very sexy, but it’s not clear what we’ll learn. The uncertainties in the economic models are so large that you can get any answer you like, which means this will soak up resources for very little knowledge gain. We should also resist calls to consolidate modelling activities at just one or two international centres.

In summary, Andrew argued we are at a critical juncture for US and international modelling efforts. Currently the field is dominated by a race to build the biggest model with the most subcomponents for the SRES scenarios. But a future priority must be providing the information needed for adaptation planning – this has been neglected in the past in favour of mitigation studies.

Tim Palmer’s talk covered some of the ideas on probabilistic modeling that he presented at the AGU meeting in December. He began with an exploration of why better prediction capability is important, adding a third category to Andrew’s set of policy-related questions:

  • mitigation – while the risk of catastrophic climate change is unequivocal, we must reduce the uncertainties if we are ever to tackle the indifference to significant emissions cuts; otherwise humanity is heading for utter disaster.
  • adaptation – spending wisely on new infrastructure depends on our ability to do reliable regional prediction.
  • geo-engineering – could we ever take this seriously without reliable bias-free models?

Tim suggested two overriding goals for the climate modeling community. We should aspire to develop ab initio models (based on first principles) whose outputs do not differ significantly from observations; and we should aspire to improve probabilistic forecasts by reducing uncertainties to the absolute minimum, but without making the probabilistic forecasts over-confident.

The main barriers to progress are limited human and computational resources, and limited observational data. Earth System Models are complex – there is no more complex problem in computational science – and great amounts of ingenuity and creativity are needed. All of the directions we wish to pursue – greater complexity, higher resolution, bigger ensembles, longer paleo runs, data assimilation – demand more computational resources. But just as much, we’re hampered by an inability to get the maximum value from observations. So how do we use observations to inform modelling in a reliable way?

Are we constrained by our history? Ab initio climate models originally grew out of numerical weather prediction, but rapidly diverged. Few climate models today are viable for NWP. Models are developed at the institutional level, so we have diversity of models worldwide, and this is often seen as a virtue – the diversity permits the creation of model ensembles, and rivalry between labs ought to be good for progress. But do we want to maintain this? Are the benefits as great as they are often portrayed? This history, and the institutional politics of labs, universities and funding agencies provide an unspoken constraint on our thinking. We must be able to identify and separate these constraints if we’re going to give taxpayers best value for money.

Uncertainty is a key issue. While the diversity of models is often cited as a measure of confidence in ensemble forecasts, this is very ad hoc. Models are not designed to span the uncertainty in representation of physical processes in any systematic way, so the ensembles are ensembles of opportunity. So if the models agree, how can we be sure this is a measure of confidence? An obvious counter-example is in economics, where the financial crisis happened despite all the economic models agreeing.

So could we reduce uncertainty if we had better models? Tim argues that seamless prediction is important. He defines it as “bringing the constraints and insights of NWP into the climate arena”. The key idea is to be able to use the same modeling system to move seamlessly between forecasts at different scales, both temporally and spatially. In essence, it means unifying climate and weather prediction models. This unification brings a number of advantages:

  • It accelerates model development and reduces model biases, by exploring model skill on shorter, verifiable timescales.
  • It helps to bridge between observations and testing strategies, using techniques such as data assimilation.
  • It brings the weather and climate communities together.
  • It allows for cross-over of best practices.

Many key climate amplifiers are associated with fast timescale processes, which are best explored in NWP mode. This approach also allows us to test the reliability of probabilistic forecasts, on weekly, monthly and seasonal timescales. For example, reliability diagrams can be used to explore the reliability of a whole season of forecasts. This is done by subsampling the forecasts, taking, for example, all the forecasts that said it would rain with 40% probability, and checking that it did rain 40% of the time for this subsample.
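The subsampling procedure described above is simple enough to sketch in code. This is just an illustration of the idea, not anything from the talk: the function and the simulated forecast data below are invented for the example, and a real verification exercise would of course use archived forecasts and observations.

```python
import numpy as np

def reliability(forecast_probs, outcomes, bins=10):
    """For each probability bin, return (mean forecast probability,
    observed frequency of the event) -- the points of a reliability diagram.
    A perfectly reliable forecast system puts all points on the diagonal."""
    edges = np.linspace(0.0, 1.0, bins + 1)
    points = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        # Subsample: all forecasts that predicted rain with probability in [lo, hi)
        mask = (forecast_probs >= lo) & (forecast_probs < hi)
        if mask.any():
            points.append((forecast_probs[mask].mean(), outcomes[mask].mean()))
    return points

# Simulated example of a well-calibrated forecaster: rain actually occurs
# with roughly the frequency that was forecast, so observed frequencies
# track the forecast probabilities.
rng = np.random.default_rng(42)
probs = rng.uniform(0, 1, 10000)          # forecast rain probabilities
rained = rng.uniform(0, 1, 10000) < probs # simulated outcomes
for p, f in reliability(probs, rained, bins=5):
    print(f"forecast {p:.2f} -> observed {f:.2f}")
```

For the subsample of forecasts that said roughly 40% chance of rain, the observed frequency comes out near 40%; a biased or over-confident forecast system would show points falling systematically off the diagonal.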

Such a unification also allows a pooling of research activity in universities, labs, and NWP centres to make progress on important ideas such as stochastic parameterizations and data assimilation. Stochastic parameterizations are proving more skilful in NWP than multi-model ensembles on monthly timescales, although the ideas are still in their infancy. Data assimilation provides a bridge between the modelling and observational worlds (as an example, Rodwell and Palmer were able to rule out high climate sensitivities in the climateprediction.net data using assimilation with the 6-hour ECMWF observational data).

In contrast to Andrew’s argument to avoid consolidation, Tim argues that consolidation of effort at the continental (rather than national) level would bring many benefits. He cites the example of Airbus, where European countries pooled their efforts because aircraft design had become too complex for any individual nation to go it alone. The Airbus approach involves a consortium, allowing for specialization within each country.

Tim closed with a recap of an argument he made in a recent Physics World article. If a group of nations can come together to fund the Large Hadron Collider, then why can’t we come together to fund a computing infrastructure for climate computing? If climate is the number one problem facing the world, then why don’t we have the number one supercomputer, rather than one ranked around 50th in the TOP500?

Following these talks, we broke off into discussion groups to discuss a set of questions posed by the committee. I managed to capture some of the key points that emerged from these discussions – the committee report will present a much fuller account.

  • As we expand earth system models to include ever more processes, we should not lose sight of the old unsolved problems, such as clouds, ice sheets, etc.
  • We should be very cautious about introducing things into the models that are not tied back to basic physics. Many people in the discussions commented that they would resist human activities being incorporated into the models, as this breaks the ab initio approach. However, others argued that including social and ecosystem processes is inevitable, as some research questions demand it, and so we should focus on how it should be done. For example, we should start with the places where the 2-way interaction is largest, e.g. land use, energy feedbacks, renewable energy.
  • We should be able to develop an organized hierarchy of models, with traceable connections to one another.
  • Seamless prediction means working within the same framework/system, but not necessarily with the same model.
  • As models become more complex, we need to be able to disentangle different kinds of complexity – for example, the complexity that derives from emergent behaviour vs. the number of processes vs. the number of degrees of freedom.
  • The scale gap between models and observations is disappearing. Model resolution is increasing exponentially, while observational capability is increasing only polynomially. This will improve our ability to test models against observations.
  • There is a great ambiguity over what counts as an Earth System Model. Do they include regional models? What level of coupling defines an ESM?
  • There are huge challenges in bringing communities together to consolidate efforts, even on seemingly simple things like agreeing terminology.
  • There’s a complementary challenge to the improvement of ESMs: how can we improve the representation of climate processes in other kinds of model?
  • We should pay attention to user-generated questions. In particular, the general public doesn’t feel that climate modelling is relevant to them, which partly explains problems with lack of funding.

Next up: challenges arising from hardware, software and human resource limitations…

19 April 2011

For those outside Canada, in case you haven’t heard, we’re in the middle of a general election. Canada has a parliamentary system, modelled after the British one, with a first-past-the-post system for electing representatives (members of parliament), where the party with the most seats after the election is invited to form a government, and its leader to become Prime Minister. For the last few parliaments we’ve had minority governments, first Liberal, then Conservative.

Somewhere along the way, many people just stopped voting: from turnouts in the high 70s back in the 1960s, we’ve had 64.7% and then 58.8% turnout respectively in the last two elections – the last being the lowest turnout ever. There may be many different reasons for this lack of enthusiasm, although listening to the main parties whining about each other during this election, it’s not hard to see why so many people just don’t bother. But one thing is clear: young people are far less likely to vote than any other age group.

So it was great to see Rick Mercer last week with a brilliant call for young voters to use their votes to “scare the hell out of the people who run this country”:

And his message seems to have resonated. Students on campuses across the country have been using social networking to organise vote mobs, making videos along the way as they challenge others to do the same. But here’s the interesting thing. The young people of this country have a very different set of preferences to the general population:

Just look at how the projected composition of parliament would look if it were up to the youngsters: the Liberals and the Green Party virtually neck-and-neck for most votes, and instead of the Greens being shut out of parliament, they’d hold 43 seats! Of course, the projected seat count also throws into sharp focus what’s wrong with our current voting system: the Bloc, with the lowest share of the vote of any of the parties, would still hold 60 seats. And the Liberals, with just 2% more of the votes than the Greens, would still get more than twice as many seats. Nevertheless, I like this picture much more than the parliaments we’ve had in the last few elections.

So, if you’re eligible to vote, and you’re anywhere around half my age, make my day – help change our parliament for the better!

This is a guest post from Kaitlin Carroll, originally written as an assignment for the course PMU199S.

How does Canada measure up on climate change? We are of course an economically developed country, and what with all our snow, our beavers and our wilderness, one would think we’re doing pretty well, right? Wrong. Maybe there are a few other criteria in the fight against climate change, besides the beavers.

Taken from epi.yale.edu
EPI Ranking Map. Highest scores are yellow, and scores lower as the colour gets darker.

On Yale’s EPI (Environmental Performance Index), Canada ranks 46th out of 163 countries, a list that includes currently developing countries such as China and places recently hit by climate-change-spurred natural disasters like the Maldives and Haiti. Coming out on top are countries like Switzerland, which ranked second, and Sweden, which ranked fourth. So how does Canada compare to such highly ranked countries?

Based on Yale’s EPI website we can draw some basic conclusions about quality of life in Sweden, Switzerland and Canada. The average GDP per capita of the three countries is just upwards of $34,000. 100% of citizens in all three of the countries also have access to basic needs, such as sanitation and clean water. The countries are also very similar in terms of local climate, which also plays a key role in quality of life, and emissions in terms of heating and cooling. In other words, Sweden, Switzerland and Canada are all developed nations with a great quality of life.

So, if Canada has such a great quality of life, we shouldn’t have any problems setting up environmental reforms and becoming leaders in the fight against climate change! How, then, are we ranked 46th?

The problem seems to lie in policy and practice. David Richard Boyd, a well-known Canadian environmental lawyer, compares Sweden and Canada on ten different environmental criteria in a recent article. The reason why Canada falls behind is directly linked to its policies. While Sweden has used “innovative economic policies to reduce pressure on the environment” and produced “a bold national strategy to achieve sustainability within a generation”, Canada has taken a different approach. Canada is described in the article as a country which puts emphasis on voluntary contributions to fighting climate change, instead of enforcing policies. It also favours environmental education, which I agree is necessary, but education without action and the support of local government officials will have little effect.

Boyd also cites a lot of the same key issues with Canada’s policy as The Conference Board of Canada. The Conference Board of Canada is an independent and non-profit organization that provides research studies on a wide range of Canada’s economic and political policies. In the board’s article on the environment, Canada is ranked 15th out of 17 of its peer countries, and given a C grade. Below is a list of some of the important issues mentioned by both Boyd and The Conference Board of Canada:

Key Issues

Population density also plays a big role in climate change, according to a study by Christopher Kennedy, Professor at the University of Toronto. This is largely due to the fact that ground transportation (such as cars, trucks, and buses) contributes a lot to the GHGs we send into the atmosphere. The chart below compares the ten different cities in Kennedy’s study in terms of their population density and GHG output.

Population Density vs GHGs

Unlike Switzerland and Sweden, Canada is huge! We have a mix of rural and urban areas, but even within our urban centers, such as Toronto, we’re nowhere near as densely populated as most European cities. This means we have to travel a lot more from the suburbs to the downtown core. Toronto’s public transit has a system length of 70km and 69 stations; Montreal’s Metro is nearly the same, with a length of 69km and 68 stations. It’s not only the distance, but also the service area of the transit that affects us. In both Toronto and Montreal there are only four different subway lines, which means the service area is very restricted. We can compare this to Sweden’s public transit, the Stockholm Metro, which has 7 different lines, and although it covers about the same distance (105km), it serves 100 different stations. Public transit is then more accessible for those living within the city. In essence, because of our lack of population density and inadequate public transportation, Canada has become a car-oriented country.

The good news is, there’s still room for improvement! Canada can make changes to its policies to make climate change an issue for its government and its citizens as a whole, not just a select few. We can take examples from those countries ranked 1st through 45th, and build upon and improve our society in order to ease the pressure on the earth. The goal should be to bring every country to the number one spot, and there’s no reason why a country which has the means and the will cannot change.

Today Jonathan Lung released a new version of our open, shareable, web-based calculator, Inflo. We have a new screencast to explain what it is:

You can play with Inflo yourself (just say yes to accept the site certificate; you’ll need to register a new username if you want to save your calculations on the server). Or go see some of the calculations we’ve already built with it:

Or there’s always the tutorial for Inflo, in Inflo itself…