One of the things that came up in our weekly brainstorming session today was the question of whether climate models can be made more modular, to permit distributed development, and distributed execution. Carolyn has already blogged about some of these ideas. Here’s a little bit of history for this topic.

First, a very old (well, 1989) paper by Kalnay et al. on Data Interchange Formats, in which they float the idea of “plug compatibility” for climate model components. For a long time, this idea seems to have been accepted as the long-term goal for the architecture of climate models. But no one appears to have come close. In 1996, David Randall wrote an interesting introspective piece on how university teams can (or can’t) participate in climate model building, in which he speculates that plug compatibility might not be achievable in practice, because of the complexity of the physical processes being simulated, and the complex interactions between them. He also points out that all climate models (up to that point) had each been developed at a single site, and he talks a bit about why this appears to be necessarily so.

Fast forward to a paper by Dickinson et al. in 2002, which summarizes the results of a series of workshops on how to develop a better software infrastructure for model sharing, and talks about some prototype software frameworks. Then, a paper by Larson et al. in 2004, introducing a common component architecture for earth system models, and a bit about the Earth System Modeling Framework being developed at NCAR. And finally, Drake et al.’s Overview of the Community Climate System Model (CCSM), which appears to use these frameworks very successfully.

Now, admittedly I haven’t looked closely at the CCSM. But I have looked closely at the Met Office’s Unified Model and the Canadian CCCma model, and neither of them gets anywhere close to the ideal of modularity. In both cases, the developers have to invest months of effort to ‘naturalize’ code contributed from other labs, in the manner described in Randall’s paper.

So, here’s the mystery. Has the CCSM really achieved the modularity that others are only dreaming of? And if so how? The key test would be how much effort it takes to ‘plug in’ a module developed elsewhere…
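To make the idea of plug compatibility concrete, here’s a minimal sketch of what a common component interface might look like. This is purely illustrative – the class and method names are my own invention, and real frameworks like ESMF define much richer (Fortran) interfaces – but it captures the core idea: every component honours the same lifecycle, and exchanges data only through named fields.

```python
# A purely illustrative sketch of "plug compatibility" for climate model
# components. Names are hypothetical, not the actual ESMF API.
from abc import ABC, abstractmethod

class ModelComponent(ABC):
    """Common lifecycle that every pluggable component would honour."""

    @abstractmethod
    def initialize(self, config: dict) -> None: ...

    @abstractmethod
    def run(self, fields: dict, dt: float) -> dict:
        """Advance one coupling step; read input fields, return updated fields."""

    @abstractmethod
    def finalize(self) -> None: ...

class Coupler:
    """Drives any mix of components that honour the interface."""

    def __init__(self, components: list):
        self.components = components

    def step(self, fields: dict, dt: float) -> dict:
        for component in self.components:
            fields.update(component.run(fields, dt))
        return fields
```

The catch, as Randall’s paper suggests, is that the calling convention is the easy part; the months of ‘naturalization’ effort go into the physics hidden behind those field names.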

Well, here’s an interesting analysis of the lifecycle emissions of a computer. Turns out that computers require something like ten times their weight in fossil fuels to manufacture. That’s an order of magnitude more than other durable goods, like cars and fridges, which require only about their own weight in fuel to manufacture. Oh, and the flat screen display accounts for the majority of it.

Okay, so I’ll concede that my computer is an order of magnitude more useful to me than a fridge (which, after all, only does one thing). But it does mean that if we focus only on power consumption during use, we might be missing the biggest savings opportunities.

The analysis is from the book Computers and the Environment, edited by Kuehr and Williams, and here’s a brief review.
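For a sense of scale, here’s the back-of-the-envelope version of that comparison. The product weights below are my own rough guesses, not figures from the book; only the fuel-to-weight ratios come from the claim above.

```python
# Rough comparison using the fuel-to-weight ratios quoted above.
# Product weights are illustrative guesses, not data from the book.
goods = {
    "desktop computer": (25.0, 10.0),    # (weight in kg, fuel-to-weight ratio)
    "refrigerator":     (120.0, 1.0),
    "car":              (1500.0, 1.0),
}
for name, (weight_kg, ratio) in goods.items():
    print(f"{name}: ~{weight_kg * ratio:,.0f} kg of fossil fuel to manufacture")
```

The car still consumes more fuel in absolute terms, of course; the striking part is the ratio.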

Well, this is a little off topic, but we (Janice, Dana, Peggy and I) have been invited to run this year’s International Advanced School of Empirical Software Engineering, in Florida in October. We’ve planned the day around the content of our book chapter on Selecting Empirical Research Methods for Software Engineering Research, which appeared in the book Guide to Advanced Empirical Software Engineering. It’s going to be a lot of fun!

There is no silver bullet for climate change, just as there’s no silver bullet for software engineering. To understand why this is, you need to understand the magnitude of the problem.

Firstly, there’s the question of what a “safe” temperature rise would be. There’s a broad consensus among climate scientists that a rise of around 2°C (above pre-industrial levels) is a sensible upper limit. I’ve asked a number of climate scientists why this particular threshold, and the answer is that above this level, scary feedback effects start to kick in, and then we’re in serious trouble. If you look at the assessments from the IPCC, the lowest stabilization level they consider is 450 ppm (parts per million), but it’s clear from their figures that even at this level, we would overshoot the 2°C threshold. Since that report, some scientists have argued this is way too high, and that 350 ppm would be a better target. Worryingly, the last IPCC assessment was based on climate models that did not include feedback effects.
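A rough way to see why 450 ppm overshoots the threshold: assume the standard logarithmic relationship between CO2 concentration and equilibrium temperature, with a climate sensitivity of about 3°C per doubling of CO2 (a commonly quoted mid-range value). This back-of-the-envelope version ignores non-CO2 gases and all the uncertainty ranges, so treat it as a sketch, not an IPCC result.

```python
import math

# Equilibrium warming estimate: dT = S * log2(C / C0), where S is the
# assumed climate sensitivity (deg C per CO2 doubling) and C0 is the
# pre-industrial CO2 concentration. A toy calculation, not an IPCC result.
S = 3.0       # assumed sensitivity, deg C per doubling
C0 = 280.0    # pre-industrial CO2, ppm

for C in (350, 450):
    dT = S * math.log2(C / C0)
    print(f"{C} ppm -> roughly {dT:.1f} deg C above pre-industrial")
# 350 ppm -> roughly 1.0 deg C; 450 ppm -> roughly 2.1 deg C
```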

Then, there’s the question of how to get there. Stabilizing at 350-450 ppm requires a reduction in greenhouse gas emissions of around 80% in industrialized nations by the year 2050. Monbiot argues that if you think in terms of a reduction per capita, you have to allow for population growth. So that really means a reduction of more like 90% per person. And again, due to our uncertainty about feedback effects, the emissions targets may need to be even lower.
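Monbiot’s arithmetic is easy to reproduce. The population figures below are round numbers of my own choosing, not his exact ones:

```python
# An 80% cut in total emissions, spread over a larger 2050 population,
# implies a deeper cut per person. Population figures are rough round
# numbers for illustration only.
total_cut = 0.80                     # required cut in total emissions by 2050
pop_now, pop_2050 = 6.5e9, 9.0e9     # assumed population figures

remaining_total = 1.0 - total_cut    # fraction of today's total emissions
per_person_ratio = (remaining_total / pop_2050) / (1.0 / pop_now)
print(f"required cut per person: {1.0 - per_person_ratio:.0%}")   # ~86%
```

With less generous assumptions about population growth, or a deeper required total cut, the per-person figure creeps towards 90% and beyond.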

How do we reduce emissions by 90% per person? The problem is that our emissions of greenhouse gases come from everything we do, and no one activity or industry dominates. I was looking for a good graphic for my ICSE talk, to illustrate this point, when I came across this chart of sources of emissions:

 

[Chart: World Greenhouse Gas Emissions by Sector]

I think that’s enough on its own to show there is not likely to be a silver bullet. The only way to solve the problem is through a systemic analysis of the sources of emissions, taking into account a huge number of different options. If you want more detail on the figures, Jon Rynn at Grist has started to put together some spreadsheets to add up all the sources of emissions, and some specific contributors.
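If you want to play with the numbers yourself, here’s the shape of that argument in code. The sector shares below are rough illustrative values in the spirit of the chart, not its exact figures; see the chart and Jon Rynn’s spreadsheets for the real ones.

```python
# Approximate world GHG emission shares by sector (illustrative values
# only, not the exact figures from the chart above).
sectors = {
    "electricity & heat": 25, "industry": 14, "transport": 14,
    "agriculture": 14, "land use change": 12, "other energy": 10,
    "buildings": 8, "waste": 3,
}
largest = max(sectors, key=sectors.get)
print(f"largest sector: {largest} at ~{sectors[largest]}% of the total")
# Even the largest sector is only about a quarter of the total:
# eliminate it entirely, and three quarters of the problem remains.
```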

BTW, the IPCC’s frequently asked questions is a great primer for anyone new to the physics of climate change.

Having ranted yesterday about how discussions about our personal carbon footprints are a distraction, I now feel obliged to raise an exception. If we’re talking about how to use our software expertise to support personal footprint reduction on a grander scale, I’m all for it. Along those lines, here’s a list of ten green Internet startups – companies looking to leverage Internet technology to support ‘greening’ initiatives. These ten were presented yesterday at a conference in SF called Green:Net09, and a panel of judges selected a winner – presumably the company most likely to succeed. The judges picked WattBot, while the audience favourite was FarmsReach.

In many discussions about the climate crisis that I’ve had with professional colleagues, the conversation inevitably turns to how we (as individuals) can make a difference by reducing our personal carbon emissions. So sure, our personal choices matter. And we shouldn’t stop thinking about them. And there is plenty of advice out there on how to green your home, and how to make good shopping decisions, and so on. Actually, there is way too much advice out there on how to live a greener life. It’s overwhelming. And plenty of it is contradictory. Which leads to two unfortunate messages: (1) we’re supposed to fix global warming through our individual personal choices and (2) this is incredibly hard because there is so much information to process to do it right.

The climate crisis is huge, and systemic. It cannot be solved through voluntary personal lifestyle choices; it needs systemic changes throughout society as a whole. As Bill McKibben says:

“the number one thing is to organize politically; number two, do some political organizing; number three, get together with your neighbors and organize; and then if you have energy left over from all of that, change the light bulb.”

Now, part of getting politically organized is getting educated. Another part is connecting with people. We computer scientists are generally not very good at political action, but we are remarkably good at inventing tools that allow people to get connected. And we’re good at inventing tools for managing, searching and visualizing information, which helps with the ‘getting educated’ part and the ‘persuading others’ part.

So, I don’t want to have more conversations about reducing our personal carbon footprints. I want to have conversations about how we can apply our expertise as computer scientists and software engineers in new and creative ways. Instead of thinking about your footprint, think about your delta (okay, I might need a better name for it): what expertise and skills do you have that most others don’t, and how can they be applied to good effect to help?

In honour of Ada Lovelace day, I decided to write a post today about Prof Julia Slingo, the new chief scientist at the UK Met Office. News of Julia’s appointment came out in the summer last year during my visit to the Met Office, coincidentally on the same day that I met her, at a workshop on the HiGEM project (where, incidentally, I saw some very cool simulations of ocean temperatures). Julia’s role at the meeting was to represent the sponsor (NERC – the UK equivalent of Canada’s NSERC), but what impressed me about her talk was both her detailed knowledge of the project, and the way she nurtured it – she’ll make a great chief scientist.

Julia’s research has focussed on tropical variability, particularly improving our understanding of the monsoons, but she’s also played a key role in earth system modeling, and especially in the exploration of high resolution models. But best of all, she’s just published a very readable account of the challenges in developing the next generation of climate models. Highly recommended for a good introduction to the state of the art in climate modeling.

First, a couple of local ones, in May:

Then, this one looks interesting: The World Climate Conference, in Geneva at the end of August. It looks like most of the program will be invited, but they will be accepting abstracts for a poster session. Given that the theme is to do with how climate information is generated and used, it sounds very appropriate.

Followed almost immediately by EnviroInfo2009, in Berlin, in September. I guess the field I want to name “Climate Informatics” would be a subfield of environmental informatics. Paper deadline is April 6.

Finally, there’s the biggy in Copenhagen in December, where, hopefully, the successor to the Kyoto agreement will be negotiated.

Here’s a challenge for the requirements modelling experts. I’ve phrased it as an exam question for my graduate course on requirements engineering (the course is on hiatus, which is lucky, because it would be a long exam…):

Q: The governments of all the nations on a small blue planet want to fix a problem with the way their reliance on fossil fuels is altering the planet’s climate. Draw a goal model (using any appropriate goal modeling notation) showing the key stakeholders, their interdependencies, and their goals. Be sure to show how the solutions they are considering contribute to satisfying their goals. The attached documents may be useful in answering this question: (a) An outline of the top level goals; (b) A description of the available solutions, characterized as a set of Stabilization Wedges; (c) A domain expert’s view of the feasibility of the solutions.

Update: Someone’s done the initial identification of actors already.
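For anyone who wants to sketch an answer, here’s one hypothetical fragment of such a model, written as plain data. The actors, goals and contribution links are my own illustrations, not taken from the attached documents; a real answer would use a proper notation (i*, KAOS, …) with softgoals and decomposition.

```python
# A tiny, hypothetical fragment of the goal model. All names are
# illustrative inventions, not content from the attached documents.
actors = {
    "Governments": ["stabilize GHG concentrations", "maintain economic growth"],
    "Industry":    ["minimize compliance costs"],
    "Citizens":    ["affordable energy", "habitable climate"],
}

# Contribution links from candidate solutions (stabilization wedges)
# to goals: "+" helps, "-" hurts.
contributions = [
    ("scale up wind power",     "stabilize GHG concentrations", "+"),
    ("scale up wind power",     "affordable energy",            "-"),
    ("vehicle fuel efficiency", "stabilize GHG concentrations", "+"),
    ("vehicle fuel efficiency", "minimize compliance costs",    "+"),
]

print(f"{len(actors)} actors, {len(contributions)} contribution links")
```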

A group of us at the lab, led by Jon Pipitone, has been meeting every Tuesday lunchtime (well almost every Tuesday) for a few months, to brainstorm ideas for how software engineers can contribute to addressing the climate crisis. Jon has been blogging some of our sessions (here, here and here).

This week we attempted to create a matrix, where the rows are “challenge problems” related to the climate crisis, and the columns are the various research areas of software engineering (e.g. requirements analysis, formal methods, testing, etc…). One reason to do this is to figure out how to run a structured brainstorming session with a bigger set of SE researchers (e.g. at ICSE). Having sketched out the matrix, we then attempted to populate one row with ideas for research projects. I thought the exercise went remarkably well. One thing I took away from it was that it was pretty easy to think up research projects to populate many of the cells in the matrix (I had initially thought the matrix might be rather sparse by the time we were done).
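The matrix itself is simple enough to capture as data. Here’s a sketch of the structure we had in mind; the row, column and cell entries below are invented placeholders, not our actual notes.

```python
# The brainstorming matrix as a dict-of-dicts: rows are challenge
# problems, columns are SE research areas, cells hold project ideas.
# All entries here are invented placeholders.
challenges = ["support climate scientists", "educate kids", "inform policymakers"]
se_areas = ["requirements", "formal methods", "testing", "empirical studies"]

matrix = {c: {a: [] for a in se_areas} for c in challenges}
matrix["support climate scientists"]["testing"].append(
    "regression test selection for climate model configurations")

for challenge, row in matrix.items():
    filled = sum(1 for ideas in row.values() if ideas)
    print(f"{challenge}: {filled}/{len(se_areas)} cells populated")
```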

We also decided that it would be helpful to characterize each of the rows a little more, so that SE researchers who are unfamiliar with some of the challenges can understand each one well enough to stimulate some interesting discussions. So, here is an initial list of challenges (I added some links where I could). Note that I’ve grouped them according to who the immediate audience is for any tools, techniques, and practices.

  1. Help the climate scientists to develop a better understanding of climate processes.
  2. Help the educators to teach kids about climate science – how the science is done, and how we know what we know about climate change.
    • Support hands-on computational science (e.g. an online climate lab with building blocks to support construction of simple simulation models; see the sketch after this list)
    • Global warming games
  3. Help the journalists & science writers to raise awareness of the issues around climate change for a broader audience.
    • Better public understanding of climate processes
    • Better public understanding of how climate science works
    • Visualizations of complex earth systems
    • Connect data generators (e.g. scientists) with potential users (e.g. bloggers)
  4. Help the policymakers to design, implement and adjust a comprehensive set of policies for reducing greenhouse gas emissions.
  5. Help the political activists who put pressure on governments to change their policies, or to get better leaders elected when the current ones don’t act.
    • Social networking tools for activists
    • Tools for persuasion (e.g. visualizations) and community building (e.g. Essence)
  6. Help individuals and communities to lower their carbon footprints.
  7. Help the engineers who are developing new technologies for renewable energy and energy efficiency systems.
    • green IT
    • Smart energy grids
    • waste reduction
    • renewable energy
    • town planning
    • green buildings/architecture
    • transportation systems (better public transit, electric cars, etc)
    • etc
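To make the ‘building blocks’ idea in item 2 concrete, here’s the kind of minimal component such an online climate lab might offer: a zero-dimensional energy balance model. The physics is standard textbook material; the code is a deliberately toy sketch.

```python
# A toy "building block": a zero-dimensional energy balance model of
# the Earth, using standard textbook constants.
SOLAR = 1361.0      # W/m^2, solar irradiance at the top of the atmosphere
ALBEDO = 0.3        # fraction of sunlight reflected back to space
SIGMA = 5.67e-8     # W/m^2/K^4, Stefan-Boltzmann constant

def equilibrium_temperature(albedo: float = ALBEDO) -> float:
    """Temperature at which absorbed sunlight balances emitted radiation."""
    absorbed = SOLAR * (1.0 - albedo) / 4.0   # /4: a sphere intercepts a disc
    return (absorbed / SIGMA) ** 0.25

print(f"{equilibrium_temperature():.0f} K")   # ~255 K; the greenhouse effect
# supplies the ~33 K difference to the observed ~288 K surface temperature.
```

Students could then swap in different albedos, or add a crude greenhouse term, and watch what happens – exactly the kind of experimentation a climate lab should invite.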

Two related items, each occupying an interesting spot on my map of the “climate informatics” research space:

via Grist, a wonderful five-part primer on Smart Energy Grids. Think of it as “the internet for energy transactions”. And think of all the interesting software engineering challenges in making it work.

via Steve Fickas, an interesting post on green clouds. And to think that I previously thought that cloud computing was deadly dull. Add an Energy Star rating to every application you run, and see what happens…
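As a thought experiment, the crudest possible version of that Energy Star idea might look something like this. The power figure is a made-up placeholder; a real rating would need measured, per-device data and would have to account for far more than CPU time.

```python
# Crude sketch: estimate an application's energy use from its CPU time
# and an assumed average power draw. Unix-only (uses the resource
# module); the 20 W figure is a made-up placeholder, not a measurement.
import resource

AVG_CPU_POWER_WATTS = 20.0   # assumed average draw while the CPU is busy

def cpu_energy_joules() -> float:
    usage = resource.getrusage(resource.RUSAGE_SELF)
    cpu_seconds = usage.ru_utime + usage.ru_stime
    return cpu_seconds * AVG_CPU_POWER_WATTS

# ... run the workload under test, then:
print(f"~{cpu_energy_joules():.1f} J of CPU energy (very rough)")
```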

Here’s an updated description of the ICSE session I kicked off this blog with. Looks like we’re now scheduled for the second afternoon of the conference (Thurs May 21, 2pm), rather than the morning slot straight after the keynote.

Update: Slides and notes from the session now available.

Software Engineering for the Planet

This session is a call to action. What can we, as software engineers, do to help tackle the challenge of climate change (besides reducing our personal carbon footprints)? The session will review recent results from climate science, showing how big the challenge is. We will then identify ways in which software engineering tools and techniques can help. The goal is to build a research agenda and a community of software engineering researchers willing to pursue it.

The ICSE organisers have worked hard this year to make the conference “greener” – to reduce our impact on the environment. This is partly in response to the growing worldwide awareness that we need to take more care of the natural environment. But it is also driven by a deeper and more urgent concern.

During this century, we will have to face up to a crisis that will make the current economic turmoil look like a walk in the park. Climate change is accelerating, confirming the more pessimistic of the scenarios identified by climate scientists [1-4]. Its effects will touch everything, including the flooding of low-lying lands and coastal cities, the disruption of fresh water supplies for much of the world, the loss of agricultural lands, more frequent and severe extreme weather events, mass extinctions, and the destruction of entire ecosystems [5].

And there are no easy solutions. We need concerted systematic change in how we live, to reduce emissions so as to stabilize the concentration of greenhouse gases that drive climate change. Not to give up the conveniences of modern life, but to re-engineer them so that we no longer depend on fossil fuels to power our lives. The challenge is massive and urgent – a planetary emergency. The type of emergency that requires all hands on deck. Scientists, engineers, policymakers, professionals, no matter what their discipline, need to ask how their skills and experience can contribute.

We, as software engineering researchers and software practitioners, have many important roles to play. Our information systems help provide the data we need to support intelligent decision making, from individuals trying to reduce their energy consumption, to policymakers trying to design effective governmental policies. Our control systems allow us to make smarter use of the available power, and provide the adaptability and reliability to power our technological infrastructure in the face of a more diverse set of renewable energy sources.

The ICSE community in particular has many other contributions to make. We have developed practices and tools to analyze, build and evolve some of the most complex socio-technical systems ever created, and to coordinate the efforts of large teams of engineers. We have developed abstractions that help us to understand complex systems, to describe their structure and behaviour, and to understand the effects of change on those systems. These tools and practices are likely to be useful in our struggle to address the climate crisis, often in strange and surprising ways. For example, can we apply the principles of information hiding and modularity to our attempts to develop coordinated solutions to climate change? What is the appropriate architectural pattern for an integrated set of climate policies? How can we model the problem requirements so that the stakeholders can understand them? How do we debug the models on which policy decisions are based?

This conference session is intended to kick-start a discussion about the contributions that software engineering research can make to tackling the climate crisis. Our aim is to build a community of concerned professionals, and find new ways to apply our skills and experience to the problem. We will attempt to map out a set of ideas for action, and identify potential roadblocks. We will start to build a broad research agenda, to capture the potential contributions of software engineering research, and discuss strategies for researchers to refocus their work towards this agenda. The session will begin with a short summary of the latest lessons from climate science, and a concrete set of examples of existing software engineering research efforts applied to climate change. We will include an open discussion session, to map out an agenda for action. We invite everyone to come to the session, and take up this challenge.

References:

[1] http://www.csmonitor.com/2006/0324/p01s03-sten.html

[2] http://www.newscientist.com/article/dn11083

[3] http://news.bbc.co.uk/2/hi/uk_news/7053903.stm

[4] http://www.pnas.org/content/104/24/10288.abstract

[5] http://www.ipcc.ch/ipccreports/ar4-wg2.htm

Over the last two years, evidence has accumulated that the IPCC reports released just two years ago underestimate the pace of climate change. Nature provides this summary. See also this article in Science Daily, and there are plenty more like it.

Emissions from fossil fuels are growing faster than any of the scenarios included in the IPCC reports (news article; original paper here). And recent studies indicate the effects are irreversible, at least for the next 1000 years.

Arctic sea ice, which is probably the most obvious “canary in the coal mine”, is melting faster than the models predicted, and will likely never recover (story from IPY here).

Greenland and Antarctic ice sheets are melting 100 years ahead of schedule (news report; original papers here and here). Meanwhile, new studies show the effect on the coastlines will be worse than previously thought, especially in North America and around the Indian Ocean (press release here; original paper here).

Sea level rise is following the worst case scenario given in the IPCC reports (news report; original papers here and here).

The oceans are soaking up less CO2, and hence losing their role as a carbon sink (news report; original paper here).

And finally, there is emerging evidence of massive methane releases as the permafrost melts (news report; no peer-reviewed paper yet).