I’m attending a workshop this week in which some of the initial results from the Fifth Coupled Model Intercomparison Project (CMIP5) will be presented. CMIP5 will form a key part of the next IPCC assessment report – it’s a coordinated set of experiments on the global climate models built by labs around the world. The experiments include hindcasts to compare model skill on pre-industrial and 20th Century climate, projections into the future for 100 and 300 years, shorter term decadal projections, paleoclimate studies, plus lots of other experiments that probe specific processes in the models. (For more explanation, see the post I wrote on the design of the experiments for CMIP5 back in September).

I’ve been looking at some of the data for the past CMIP exercises. CMIP1 originally consisted of one experiment – a control run with fixed forcings. The idea was to compare how each of the models simulates a stable climate. CMIP2 included two experiments, a control run like CMIP1, and a climate change scenario in which CO2 levels were increased by 1% per year. CMIP3 then built on these projects with a much broader set of experiments, and formed a key input to the IPCC Fourth Assessment Report.

There was no CMIP4, as the numbers were resynchronised to match the IPCC report numbers (also there was a thing called the Coupled Carbon Cycle Climate Model Intercomparison Project, which was nicknamed C4MIP, so it’s probably just as well!), so CMIP5 will feed into the fifth assessment report.

So here’s what I have found so far on the vital statistics of each project. Feel free to correct my numbers and help me to fill in the gaps!

| | CMIP1 (1996 onwards) | CMIP2 (1997 onwards) | CMIP3 (2005-2006) | CMIP5 (2010-2011) |
|---|---|---|---|---|
| Number of Experiments | 1 | 2 | 12 | 110 |
| Centres Participating | 16 | 18 | 15 | 24 |
| # of Distinct Models | 19 | 24 | 21 | 45 |
| # of Runs (Models × Expts) | 19 | 48 | 211 | 841 |
| Total Dataset Size | 1 gigabyte | 500 gigabytes | 36 terabytes | 3.3 petabytes |
| Total Downloads from archive | ?? | ?? | 1.2 petabytes | |
| Number of Papers Published | ?? | 47 | 595 | |
| Users | ?? | ?? | 6700 | |

[Update:] I’ve added a row for number of runs, i.e. the sum of the number of experiments run on each model (in CMIP3 and CMIP5, centres were able to pick a subset of the experiments to run, so you can’t just multiply models and experiments to get the number of runs). Also, I ought to calculate the total number of simulated years that represents (If a centre did all the CMIP5 experiments, I figure it would result in at least 12,000 simulated years).
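As a quick sanity check on the table, here's a sketch in Python (using only the numbers from the table above) of how sparsely the model × experiment matrix is filled in the later projects, and of how much faster the data volume has grown than the number of runs:

```python
# Vital statistics from the table above (dataset sizes in gigabytes).
cmip = {
    "CMIP1": {"experiments": 1,   "models": 19, "runs": 19,  "size_gb": 1},
    "CMIP2": {"experiments": 2,   "models": 24, "runs": 48,  "size_gb": 500},
    "CMIP3": {"experiments": 12,  "models": 21, "runs": 211, "size_gb": 36_000},
    "CMIP5": {"experiments": 110, "models": 45, "runs": 841, "size_gb": 3_300_000},
}

for name, s in cmip.items():
    # Centres pick a subset of experiments, so runs < models * experiments.
    coverage = s["runs"] / (s["models"] * s["experiments"])
    print(f"{name}: {coverage:.0%} of the model x experiment matrix filled")

# From CMIP3 to CMIP5, the number of runs grew about 4x...
growth_runs = cmip["CMIP5"]["runs"] / cmip["CMIP3"]["runs"]
# ...but the data volume grew about 90x, as resolution and output grew too.
growth_size = cmip["CMIP5"]["size_gb"] / cmip["CMIP3"]["size_gb"]
```

The interesting part is that data volume is growing more than an order of magnitude faster than the number of runs, which is why archive sizes jump from terabytes to petabytes between projects.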

Oh, one more datapoint from this week. We came up with an estimate that by 2020, each individual experiment will generate an Exabyte of data. I’ll explain how we got this number once we’ve given the calculations a bit more of a thorough checking over.

As today is the deadline for proposing sessions for the AGU fall meeting in December, we’ve submitted a proposal for a session to explore open climate modeling and software quality. If we get the go ahead for the session, we’ll be soliciting abstracts over the summer. I’m hoping we’ll get a lively session going with lots of different perspectives.

I especially want to cover the difficulties of openness as well as the benefits, as we often hear a lot of idealistic talk about how open science would make everything so much better. While I think we should always strive to be more open, it’s not a panacea. There’s evidence that open source software isn’t necessarily better quality, and of course, there are plenty of people using lack of openness as a political weapon, without acknowledging just how many hard technical problems there are to solve along the way, not least because there’s a lack of consensus over the meaning of openness among its advocates.

Anyway, here’s our session proposal:

TITLE: Climate modeling in an open, transparent world

AUTHORS (FIRST NAME INITIAL LAST NAME): D. A. Randall1, S. M. Easterbrook4, V. Balaji2, M. Vertenstein3

INSTITUTIONS (ALL): 1. Atmospheric Science, Colorado State University, Fort Collins, CO, United States. 2. Geophysical Fluid Dynamics Laboratory, Princeton, NJ, United States. 3. National Center for Atmospheric Research, Boulder, CO, United States. 4. Computer Science, University of Toronto, Toronto, ON, Canada.

Description: This session deals with climate-model software quality and transparent publication of model descriptions, software, and results. The models are based on physical theories but implemented as software systems that must be kept bug-free, readable, and efficient as they evolve with climate science. How do open source and community-based development affect software quality? What are the roles of publication and peer review of the scientific and computational designs in journals or other curated online venues? Should codes and datasets be linked to journal articles? What changes in journal submission standards and infrastructure are needed to support this? We invite submissions including experience reports, case studies, and visions of the future.

This week, I’m featuring some of the best blog posts written by the students on my first year undergraduate course, PMU199 Climate Change: Software, Science and Society. This post is by Harry, and it first appeared on the course blog on January 29.

Projections from global climate models indicate that continued 21st-century increases in greenhouse gas emissions will raise the temperature of the globe by a few degrees. A global change of just a few degrees could have a huge impact on our planet. While a few degrees of global cooling could lead to another ice age, a few degrees of global warming means the world will witness more of one of nature’s most terrifying phenomena.

According to Anthony D. Del Genio, the surface of the earth heats up from sunlight and other thermal radiation, and the energy it accumulates must be offset to maintain a stable temperature. Our planet does this by evaporating water, which condenses and rises with buoyant warm air, carrying excess heat from the surface to higher altitudes. In powerful updrafts, water droplets are carried upward and supercooled to temperatures between -10 and -40°C. Collisions between these supercooled water droplets and soft ice crystals form dense ice pellets called graupel. The densities of graupel and ice crystals, and the electrical charges their collisions induce, are two essential factors in producing what people see as lightning.

Differences in updrafts over ocean and land also affect lightning frequency. Over the course of the day, the oceans absorb heat but their surfaces hardly warm up. Land surfaces, on the other hand, store little heat, so they warm significantly from the beginning of the day. The air above land is therefore warmer and more buoyant than the air over the oceans, creating strong convective storms as the warm air rises. The powerful updrafts in these convective storms are more likely to generate lightning.

In experiments with the general circulation model from the Goddard Institute for Space Studies, a 4.2°C global warming produced an increase of about 30% in global lightning activity, while a 5.9°C global cooling produced a 24% decrease in global lightning frequency. Together, the two experiments suggest a 5-6% change in global lightning frequency for every 1°C of global warming or cooling.
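The quoted sensitivity follows directly from the two experiment results; here's a quick sketch of the arithmetic (the values are exactly those reported above):

```python
# GISS GCM experiments described above:
# (global temperature change in °C, fractional change in lightning activity)
warming_run = (4.2, +0.30)   # 4.2°C warming -> 30% more lightning
cooling_run = (-5.9, -0.24)  # 5.9°C cooling -> 24% less lightning

# Change in lightning activity per degree of temperature change:
sensitivities = [dl / dt for dt, dl in (warming_run, cooling_run)]
# ~7.1% per °C from the warming run, ~4.1% per °C from the cooling run
average = sum(sensitivities) / len(sensitivities)
print(f"average sensitivity: {average:.1%} per °C")
```

The two runs bracket the 5-6% per °C figure, with the average landing at about 5.6%.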

If the 21st-century projections of carbon dioxide and other greenhouse gas emissions hold true, the earth will continue to warm and the oceans will evaporate more water. The land will warm even more, largely because drier land surfaces cannot evaporate water to the same extent as the oceans. This should create stronger convective storms and produce more lightning.

Greater lightning frequency can in turn contribute to a warmer earth. Lightning provides an abundant source of nitrogen oxides, which are precursors for ozone production in the troposphere. Ozone in the upper troposphere acts as a greenhouse gas, absorbing some of the infrared energy emitted by the earth. Because tropospheric ozone traps some of this escaping heat, the earth warms further and lightning becomes even more frequent: lightning thus creates a positive feedback in our climate system. On a per-molecule basis, ozone’s impact on the climate is much stronger than that of carbon dioxide, with a radiative forcing effect approximately 1,000 times as powerful. Luckily, ozone is far less prevalent in the troposphere than carbon dioxide, and its atmospheric lifetime averages only 22 days.

"Climate simulations, which were generated from four Global General Circulation Models (GCM), were used to project forest fire danger levels with relation to global warming."

Although lightning may occur more frequently around the world, its effects are felt on a very local scale, and it is this local effect that has the most impact on people. In a thunderstorm, an increase in lightning frequency puts densely forested areas at high risk of forest fire. In Canada, the west-central and north-western woodland areas are major targets for ignition by lightning; in fact, lightning accounted for 85% of the total area burned between 1959 and 1999. To preserve habitats for animals, and forests for their function as a carbon sink, sustained pressure must be put on governments to minimize forest fires in these regions. With 21st-century temperatures projected to rise, that figure of 85% of area burned could increase dramatically, burning ever larger areas of forest, as rising temperatures dry out surfaces and produce more “fuel” for the fires.

Although lightning has negative effects on our climate system and on people, it also has positive effects on earth and on life. The ozone layer, located in the upper atmosphere, prevents ultraviolet light from reaching earth’s surface. Lightning also drives a natural process known as nitrogen fixation. This process has a fundamental role for life, because fixed nitrogen is required to construct the basic building blocks of life (e.g. nucleotides for DNA and amino acids for proteins).

Lightning is an amazing natural occurrence in our skies. Whether it’s a sight to behold or something to be feared, we’ll see more of it as our earth becomes warmer.

This week, I’m featuring some of the best blog posts written by the students on my first year undergraduate course, PMU199 Climate Change: Software, Science and Society. The first is by Terry, and it first appeared on the course blog on January 28.

A couple of weeks ago, Professor Steve was talking about the extra energy that we are adding to the earth system during one of our sessions (and on his blog). He showed us this chart from the last IPCC report in 2007 that summarizes the various radiative forcings from different sources:

Notice how aerosols account for most of the negative radiative forcing. But what are aerosols? What is their direct effect, their contribution in the cloud albedo effect, and do they have any other impact?

More »

Ever since I wrote about peak oil last year, I’ve been collecting references to “Peak X”. Of course, the key idea, Hubbert’s Peak applies to any resource extraction, where the resource is finite. So it’s not surprising that wikipedia now has entries on:

And here’s a sighting of a mention of Peak Gold.

Unlike peak oil, some of these curves can be dampened by the appropriate recycling. But what of stuff we normally think of as endlessly renewable:

  • Peak Water – it turns out that we haven’t been managing the world’s aquifers and lakes sustainably, despite the fact that that’s where our freshwater supplies come from (See Peter Gleick’s 2010 paper for a diagnosis and possible solutions)
  • Peak Food – similarly, global agriculture appears to be unsustainable, partly because food policy and speculation have wrecked local sustainable farming practices, but also because of population growth (See Jonathan Foley’s 2011 paper for a diagnosis and possible solutions).
  • Peak Fish – although overfishing is probably old news to everyone now.
  • Peak Biodiversity (although here it’s referred to as Peak Nature, which I think is sloppy terminology)

Which also leads to pressure on specific things we really care about, such as:

Then there is a category of things that really do need to peak:

And just in case there’s too much doom and gloom in the above, there are some more humorous contributions:

And those middle two, by the wonderful Randall Munroe, make me wonder what he was doing here:

I can’t decide whether that last one is just making fun of the singularity folks, or whether it’s a clever ruse to get people to realize Hubbert’s Peak must kick in somewhere!

I was talking with Eric Yu yesterday about a project to use goal modeling as a way of organizing the available knowledge on how to solve a specific problem, and we thought that geo-engineering would make an interesting candidate to try this out on: It’s controversial, a number of approaches have been proposed, there are many competing claims made for them, and it’s hard to sort through such claims.

So, I thought I would gather together the various resources I have on geo-engineering:

Introductory Overviews:

Short commentaries:

Books:

Specific studies / papers:

Sometime in May, I’ll be running a new graduate course, DGC 2003 Systems Thinking for Global Problems. The course will be part of the Dynamics of Global Change graduate program, a cross-disciplinary program run by the Munk School of Global Affairs.

Here’s my draft description of the course:

The dynamics of global change are complex, and demand new ways of conceptualizing and analyzing inter-relationships between multiple global systems. In this course, we will explore the role of systems thinking as a conceptual toolkit for studying the inter-relationships between problems such as globalization, climate change, energy, health & wellbeing, and food security. The course will explore the roots of systems thinking, for example in General Systems Theory, developed by Ludwig von Bertalanffy to study biological systems, and in Cybernetics, developed by Norbert Wiener to explore feedback and control in living organisms, machines, and organizations. We will trace this intellectual history to recent efforts to understand planetary boundaries, tipping points in the behaviour of global dynamics, and societal resilience. We will explore the philosophical roots of systems thinking as a counterpoint to the reductionism used widely across the natural sciences, and look at how well it supports multiple perspectives, trans-disciplinary synthesis, and computational modeling of global dynamics. Throughout the course, we will use global climate change as a central case study, and apply systems thinking to study how climate change interacts with many other pressing global challenges.

I’m planning to get the students to think about issues such as the principle of complementarity, and second-order cybernetics, and of course, how to understand the dynamics of non-linear systems, and the idea of leverage points. We’ll take a quick look at how earth system models work, but not in any detail, because it’s not intended to be a physics or computing course; I’m expecting most of the students to be from political science, education, etc.

The hard part will be picking a good core text. I’m leaning towards Donella Meadows’s book, Thinking in Systems, although I just received my copy of the awesome book Systems Thinkers, by Magnus Ramage and Karen Shipp (I’m proud to report that Magnus was once a student of mine!).

Anyway, suggestions for material to cover, books & papers to include, etc are most welcome.

Our paper on defect density analysis of climate models is now out for review at the journal Geoscientific Model Development (GMD). GMD is an open review / open access journal, which means the review process is publicly available (anyone can see the submitted paper, the reviews it receives during the process, and the authors’ response). If the paper is eventually accepted, the final version will also be freely available.

The way this works at GMD is that the paper is first published to Geoscientific Model Development Discussions (GMDD) as an un-reviewed manuscript. The interactive discussion is then open for a fixed period (in this case, 2 months). At that point the editors will make a final accept/reject decision, and, if accepted, the paper is then published to GMD itself. During the interactive discussion period, anyone can post comments on the paper, although in practice, discussion papers often only get comments from the expert reviewers commissioned by the editors.

One of the things I enjoy about the peer-review process is that a good, careful review can help improve the final paper immensely. As I’ve never submitted before to a journal that uses an open review process, I’m curious to see how the open reviewing will help – I suspect (and hope!) it will tend to make reviewers more constructive.

Anyway, here’s the paper. As it’s open review, anyone can read it and make comments (click the title to get to the review site):

Assessing climate model software quality: a defect density analysis of three models

J. Pipitone and S. Easterbrook
Department of Computer Science, University of Toronto, Canada

Abstract. A climate model is an executable theory of the climate; the model encapsulates climatological theories in software so that they can be simulated and their implications investigated. Thus, in order to trust a climate model one must trust that the software it is built from is built correctly. Our study explores the nature of software quality in the context of climate modelling. We performed an analysis of defect reports and defect fixes in several versions of leading global climate models by collecting defect data from bug tracking systems and version control repository comments. We found that the climate models all have very low defect densities compared to well-known, similarly sized open-source projects. We discuss the implications of our findings for the assessment of climate model software trustworthiness.

I was talking to the folks at our campus sustainability office recently, and they were extolling the virtues of Green Revolving Funds. The idea is ridiculously simple, but turns out to be an important weapon in making sure that the savings from energy efficiency don’t just disappear back into the black hole of University operational budgets. Once the fund is set up, it provides money for the capital costs of energy efficiency projects, so that they don’t have to compete with other kinds of projects for scarce capital. The money saved from reduced utility bills is then ploughed back into the fund to support more such projects. And the beauty of the arrangement is that you don’t then have to go through endless bureaucracy to get new projects going. According to wikipedia, this arrangement is increasingly common across university campuses in the US and Canada.
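To see why the revolving structure matters, here's a toy simulation in Python (all the numbers are hypothetical, purely for illustration): the fund pays the capital cost of each efficiency project, and each completed project's annual utility savings flow back into the fund, where they finance further projects.

```python
def simulate_revolving_fund(fund, project_cost, annual_saving, years):
    """Finance a project whenever the fund can afford one; each completed
    project returns its annual utility savings to the fund every year."""
    projects = 0
    for _ in range(years):
        while fund >= project_cost:   # capital available: start another project
            fund -= project_cost
            projects += 1
        fund += projects * annual_saving  # savings replenish the fund
    return projects, fund

# Hypothetical numbers: a $1M fund, $400k per retrofit,
# each retrofit saving $80k/year in utility bills:
projects, balance = simulate_revolving_fund(1_000_000, 400_000, 80_000, years=10)
print(f"{projects} projects financed over 10 years")
```

With these made-up numbers, the same $1M that would have paid for two retrofits up front ends up financing ten over a decade, and the fund is back at its original balance at the end.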

So I’m delighted to see the Toronto District School Board (TDSB) is proposing a revolving fund for this too. Here’s the motion to be put to the TDSB’s Operations and Facilities Management Committee next week. Note the numbers in there about savings already realised:

Motion – Environmental Legacy Fund

Whereas in February 2010, the Board approved the Go Green: Climate Change Action Plan that includes the establishment of an Environmental Advisory Committee and the Environmental Legacy Fund;

Whereas the current balance of the Environmental Legacy Fund includes revenues from the sale of carbon credits accrued through energy efficiency projects and from Feed-in-Tariff payments accruing from nine Ministry-funded solar electricity projects;

Whereas energy efficiency retrofit projects completed since 1990/91 have resulted in an estimated 33.9% reduction in greenhouse gas emissions to date and lowered the TDSB’s annual operating costs significantly, saving the Board $22.43 million in 2010/11 alone; and

Whereas significant energy efficiency and renewable energy opportunities remain available to the TDSB which can provide robust annual operational savings, new revenue streams as well as other benefits including increasing the comfort and health of the Board’s learning spaces;

Therefore, the Environmental Advisory Committee recommends that:

  1. The Environmental Legacy Fund be utilized in a manner that advances projects that directly and indirectly reduce the Board’s greenhouse gas (GHG) emissions and lower the TDSB’s long-term operating costs;
  2. The Environmental Legacy Fund be structured and operated as a revolving fund;
  3. The Environmental Legacy Fund be replenished and augmented from energy cost savings achieved, incentives and grant revenues secured for energy retrofit projects and renewable energy projects, and an appropriate level of renewable energy revenue as determined by the Board;
  4. The TDSB establish criteria for how and how much of the Environmental Legacy Fund can be used to advance environmental initiatives that have demonstrated GHG reduction benefits but may not provide a short-term financial return and opportunity for replenishing the Fund.
  5. To ensure transparency and document success, the Board issue an annual financial statement, on the Environmental Legacy Fund along with a report on the energy and GHG savings attributable to projects financed by the Fund.

The 12th Annual Weblog (Bloggies) awards shortlists are out. This year, they have merged the old categories of “Best Science Weblog” and “Best Computer or Technology Weblog” into a single category, “Best Science or Technology Weblog”. And the five candidates on the shortlist? Four technology blogs and one rabid anti-science blog.

Not that this award ever had any track record for being able to distinguish science from pseudo-science; the award is legendary for vote-stuffing. But this year it has really stooped to new depths.

This term, I’m running my first year seminar course, “Climate Change: Software, Science and Society” again. The outline has changed a little since last year, but the overall goals of the course are the same: to take a small, cross-disciplinary group of first year undergrads through some of the key ideas in climate modeling.

As last year, we’re running a course blog, and the first assignment is to write a blog post for it. Please feel free to comment on the students’ posts, but remember to keep your comments constructive!

Update (Aug 15, 2013): After a discussion on Twitter with Gavin Schmidt, I realised I did the calculation wrong. The reason is interesting: I’d confused radiative forcing with the current energy imbalance at the top of the atmosphere. A rookie mistake, but it shows that climate science can be tricky to understand, and it *really* helps to be able to talk to experts when you’re learning it… [I’ve marked the edits in green]

I’ve been meaning to do this calculation for ages, and finally had an excuse today, as I need it for the first year course I’m teaching on climate change. The question is: how much energy are we currently adding to the earth system due to all those greenhouse gases we’ve added to the atmosphere?

In the literature, the key concept is anthropogenic forcing, by which is meant the extent to which human activities are affecting the energy balance of the earth. When the Earth’s climate is stable, it’s because the planet is in radiative balance, meaning the incoming radiation from the sun and the outgoing radiation from the earth back into space are equal. A planet that’s in radiative balance will generally stay at the same (average) temperature because it’s not gaining or losing energy. If we force it out of balance, then the global average temperature will change.

Physicists express radiative forcing in watts per square meter (W/m2), meaning the number of extra watts of power that the earth is receiving, for each square meter of the earth’s surface. Figure 2.4 from the last IPCC report summarizes the various radiative forcings from different sources. The numbers show best estimates of the overall change from 1750 to 2005 (note the whiskers, which express uncertainty – some of these values are known much better than others):

If you add up the radiative forcing from greenhouse gases, you get a little over 2.5 W/m2. Of course, you also have to subtract the negative forcings from clouds and aerosols (tiny particles of pollution, such as sulphur dioxide), as these have a cooling effect because they block some of the incoming radiation from the sun. So we can look at the forcing that’s just due to greenhouse gases (about 2.5 W/m2), or we can look at the total net anthropogenic forcing that takes into account all the different effects (which is about 1.6 W/m2).

Over the period covered by the chart, 1750-2005, the earth warmed somewhat in response to this radiative forcing. The total incoming energy has increased by about +1.6 W/m2, but the total outgoing energy lost to space has also risen – a warmer planet loses energy faster. The current imbalance between incoming and outgoing energy at the top of the atmosphere is therefore smaller than the total change in forcing over time. Hansen et al. give an estimate of the energy imbalance of 0.58 ± 0.15 W/m2 for the period from 2005-2010.

The problem I have with these numbers is that they don’t mean much to most people. Some people try to explain it by asking people to imagine adding a 2 watt light bulb (the kind you get in Christmas lights) over each square meter of the planet, which is on continuously day and night. But I don’t think this really helps much, as most people (including me) do not have a good intuition for how many square meters the surface of Earth has, and anyway, we tend to think of a Christmas tree light bulb as using a trivially small amount of power. According to wikipedia, the Earth’s surface is 510 million square kilometers, which is 510 trillion square meters.

So, doing the maths, that gives us a change in incoming energy of about 1,200 trillion watts (1.2 petawatts) for just the anthropogenic greenhouse gases, or about 0.8 petawatts overall when we subtract the cooling effect of changes in clouds and aerosols. But some of this extra energy is being lost back into space. From the current energy imbalance, the planet is gaining 0.3 petawatts at the moment.

But how big is a petawatt? A petawatt is 10^15 watts. Wikipedia tells us that the average total global power consumption of the human world in 2010 was about 16 terawatts (1 petawatt = 1000 terawatts). So, human energy consumption is dwarfed by the extra energy currently being absorbed by the planet due to climate change: the planet is currently gaining about 18 watts of extra power for each 1 watt of power humans actually use.

Note: Before anyone complains, I’ve deliberately conflated energy and power above, because the difference doesn’t really matter for my main point. Power is work per unit of time, and is measured in watts; Energy is better expressed in joules, calories, or kilowatt hours (kWh). To be technically correct, I should say that the earth is getting about 300 terawatt hours of energy per hour due to anthropogenic climate change, and humans use about 16 terawatt hours of energy per hour. The ratio is still approximately 18.

Out of interest, you can also convert it to calories. 1 kWh is about 0.8 million calories. So, we’re force-feeding the earth about 2 x 10^17 (200,000,000,000,000,000) calories every hour. Yikes.
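For anyone who wants to check my arithmetic, here's the whole calculation as a Python sketch (the forcing and imbalance values are the ones quoted above; everything else is basic unit conversion):

```python
AREA = 510e12            # Earth's surface area in m^2 (510 trillion square meters)

ghg_forcing = 2.5        # W/m^2, greenhouse gases alone, 1750-2005
net_forcing = 1.6        # W/m^2, net anthropogenic forcing (incl. clouds/aerosols)
imbalance = 0.58         # W/m^2, current energy imbalance (Hansen et al.)
human_power = 16e12      # W, total human power consumption in 2010

ghg_watts = ghg_forcing * AREA    # ~1.3e15 W, i.e. about 1.2-1.3 petawatts
net_watts = net_forcing * AREA    # ~0.8e15 W, about 0.8 petawatts
gain_watts = imbalance * AREA     # ~3e14 W, about 0.3 petawatts

ratio = gain_watts / human_power  # extra watts gained per watt humans use
print(f"planet gains about {ratio:.0f} watts per watt of human consumption")

# In calories: watts * seconds-per-hour = joules per hour; 4.184 J per calorie.
calories_per_hour = gain_watts * 3600 / 4.184   # roughly 2.5e17 calories/hour
```

The ratio comes out at a little over 18, matching the figure in the text, and the calorie total lands in the 2 x 10^17 ballpark.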

We’re running a new weekly lecture series this term to explore different disciplinary perspectives on climate change, entitled “Collaborative Challenges for the Climate Change Research Community“, sponsored by the department of Computer Science and the Centre for Environment. Our aim is to use this as an exploration of the range of research related to climate change across the University of Toronto, and to inspire new collaborations. A central theme of the series is the role of computational climate models: how researchers share models, verify models, create models, and share results. But we also want to explore beyond models, so we’ll be looking at ethics, policy, education, and community-based responses to climate change.

The lectures will be on Monday afternoons, at 3pm, starting on January 16th, in the Bahen Centre, 40 St George Street, Toronto, room BA1220. The lectures are public and free to attend.

The first four speakers have been announced (I’ll be giving the opening talk):

  • Jan 16th: Computing the Climate: the Evolution of Climate Models – Steve Easterbrook, Dept of Computer Science
  • Jan 23rd: Building Community Resilience: A Viable Response to Climate Change and Other Emerging Challenges to Health Equity? – Blake Poland, Dalla Lana School of Public Health
  • Jan 30th: Constraining fast and slow climate feedbacks with computer models – Danny Harvey, Dept of Geography
  • Feb 6th: Urban GHG Modelling Using Agent-Based Microsimulation – Eric Miller, Dept of Civil Engineering & Cities Centre

For more details, see the C4RC website.

One of the things that strikes me about discussions of climate change, especially from those who dismiss it as relatively harmless, is a widespread lack of understanding on how non-linear systems behave. Indeed, this seems to be one of the key characteristics that separate those who are alarmed at the prospect of a warming climate from those who are not.

At the AGU meeting this month, Kerry Emanuel presented a great example of this in his talk on “Hurricanes in a Warming Climate”. I only caught his talk by chance, as I was slipping out of the session in the next room, but I’m glad I did, because he made an important point about how we think about the impacts of climate change, and in particular, showed two graphs that illustrate the point beautifully.

Kerry’s talk was an overview of a new study that estimates changes in damage from tropical cyclones with climate change, using a new integrated assessment model. The results are reported in detail in a working paper at the World Bank. The report points out that the link between hurricanes and climate change remains controversial. So, while Atlantic hurricane power has more than doubled over the last 30 years, and model forecasts show an increase in the average intensity of hurricanes in a warmer world, there is still no clear statistical evidence of a trend in damages caused by these storms, and hence a great deal of uncertainty about future trends.

The analysis is complicated by several factors:

  • Increasing insurance claims from hurricane damage in the US have a lot to do with growing economic activity in vulnerable regions. Indeed, expected economic development in the regions subject to tropical storm damage means that there are certain to be big increases in damage even if there were no warming at all.
  • The damage is determined more by when and where each storm makes landfall than it is by the intensity of the storm.
  • There simply isn’t enough data to detect trends. More than half of the economic damage due to hurricanes in the US since 1870 was caused by just 8 storms.

The new study by Emanuel and colleagues overcomes some of these difficulties by simulating large numbers of storms. They took the outputs of four different Global Climate Models, using the A1B emissions scenario, and fed them into a cyclone generator model to simulate thousands of storms, comparing the characteristics of these storms with those that have caused damage in the US in the last few decades, and then adjusting the damage estimates according to anticipated changes in population and economic activity in the areas impacted (for details, see the report).

The first thing to note is that the models forecast only a small change in hurricanes, typically a slight decrease in medium-strength storms and a slight increase in more intense storms. For example, at first sight, the MIROC model indicates almost no difference:

Probability density for storm damage on the US East Coast, generated from the MIROC model for current vs. year 2100, under the A1B scenario, for which this model forecasts a global average temperature increase of around 4.5C. Note that the x-axis is a logarithmic scale: 8 means $100 million, 9 means $1 billion, 10 means $10 billion, and so on (source: Figure 9 in Mendelsohn et al, 2011)

Note particularly that at the peak of the graph, the model shows a very slight reduction in the number of storms (consistent with a slight decrease in the overall frequency of hurricanes), while on the upper tail, the model shows a very slight increase (consistent with a forecast that there’ll be more of the most intense storms). The other three models show slightly bigger changes by the year 2100, but overall, the graphs seem very comforting. It looks like we don’t have much to worry about (at least as far as hurricane damage from climate change is concerned). Right?

The problem is that the long tail is where all the action is. The good news is that there appears to be a fundamental limit on storm intensity, so the tail doesn't really get much longer. The bad news is that it only takes a few more of these very intense storms to make a big difference in the amount of damage caused. Here's what you get if you multiply the probability by the damage in the above graph:

Changing risk of hurricane damage due to climate change. Calculated as probability times impact. (Source: courtesy of K. Emanuel, from his AGU 2011 talk)

That tiny change in the long tail generates a massive change in the risk, because the system is non-linear. If most of the damage is done by a few very intense storms, then you only need a few more of them to greatly increase the damage. Note in particular, what happens at 12 on the damage scale – these are trillion dollar storms. [Update: Kerry points out that the total hurricane damage is proportional to the area under the curves of the second graph].
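A quick way to see why the tail dominates is to compute probability × impact for a toy damage distribution. The numbers below are made up for illustration (a Gaussian over log10(damage), not the actual Mendelsohn et al. model output); the point is that adding a mere 0.1% chance of a very intense storm barely changes the probability curve, but substantially changes the expected damage:

```python
import numpy as np

# Toy illustration only: made-up numbers, not the Mendelsohn et al. data.
# Model damage probability as a Gaussian over log10(damage in dollars),
# mirroring the log scale of the figure (8 = $100M, 9 = $1B, ...).
x = np.linspace(6, 13, 2000)            # log10(damage): $1M .. $10T
dx = x[1] - x[0]

def pdf(x, mu, sigma):
    """Gaussian density over log10(damage)."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

current = pdf(x, 8.5, 0.8)
# "Warmed" climate: identical, except for a 0.1% chance of a very
# intense (~$300-billion-class) storm added to the upper tail.
warmed = 0.999 * pdf(x, 8.5, 0.8) + 0.001 * pdf(x, 11.5, 0.5)

# Risk = probability x impact; expected damage integrates pdf * damage.
damage = 10.0 ** x
expected_current = np.sum(current * damage) * dx
expected_warmed = np.sum(warmed * damage) * dx
ratio = expected_warmed / expected_current

print(f"expected damage ratio: {ratio:.2f}")
```

With these arbitrary numbers the ratio comes out around 1.3: the two probability curves would look nearly identical plotted side by side, yet expected damage jumps by roughly a third. That is the same effect as multiplying the two curves in Emanuel's second graph.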

The key observation here is that the things that matter most to people (e.g. storm damage) do not change linearly as the climate changes. That’s why people who understand non-linear systems tend to worry much more about climate change than people who do not.

Here’s the call for papers for a workshop we’re organizing at ICSE next May:

The First International Workshop on Green and Sustainable Software (GREENS’2012)

(In conjunction with the 34th International Conference on Software Engineering (ICSE 2012), Zurich, Switzerland, June 2-9, 2012)

Important Dates:

  • 17th February 2012 – paper submission
  • 19th March 2012 – notification of acceptance
  • 29th March 2012 – camera-ready
  • 3rd June 2012 – workshop

Workshop theme and goals: The focus of the GREENS workshop is the engineering of green and sustainable software. Our goal is to bring together academics and practitioners to discuss research initiatives, challenges, ideas, and results in this critically important area of the software industry. To this end, GREENS will both discuss the state of the practice, especially at the industrial level, and define a roadmap, both for academic research and for technology transfer to industry. GREENS seeks contributions addressing, but not limited to, the following list of topics:

Concepts and foundations:

  • Definition of sustainability properties (e.g. energy and power consumption, green-house gases emissions, waste and pollutants production), their relationships, their units of measure, their measurement procedures in the context of software-intensive systems, their relationships with other properties (e.g. response time, latency, cost, maintainability);
  • Green architectural knowledge, green IT strategies and design patterns;

Greening domain-specific software systems:

  • Energy-awareness in mobile software development;
  • Mobile software systems scalability in low-power situations;
  • Energy-efficient techniques aimed at optimizing battery consumption;
  • Large and ultra-large scale green information systems design and development (including inter-organizational effects)

Greening of IT systems, data and web centers:

  • Methods and approaches to improve sustainability of existing software systems;
  • Customer co-creation strategies to motivate behavior changes;
  • Virtualization and offloading;
  • Green policies, green labels, green metrics, key indicators for sustainability and energy efficiency;
  • Data center and storage optimization;
  • Analysis, assessment, and refactoring of source code to improve energy efficiency;
  • Workload balancing;
  • Lifecycle extension

Greening the process:

  • Methods to design and develop greener software systems;
  • Managerial and technical risks for a sustainable modernization;
  • Quality & risk assessments, tradeoff analyses between energy efficiency, sustainability and traditional quality requirements;

Case studies, industry experience reports and empirical studies:

  • Empirical data and analysis about sustainability properties, at various granularity levels: complete infrastructure, or nodes of the infrastructure (PCs, servers, and mobile devices);
  • Studies to define technical and economic models of green aspects;
  • Return on investment of greening projects, reasoning about the triple bottom line of people, planet and profits;
  • Models of energy and power consumption, at various granularity levels;
  • Benchmarking of power consumption in software applications;

Guidelines for Submission: We are soliciting papers in two distinct categories:

  1. Research papers describing innovative and significant original research in the field (maximum 8 pages);
  2. Industrial papers describing industrial experience, case studies, challenges, problems and solutions (maximum 8 pages).

Please submit your paper online through EasyChair (see the GREENS website). Submissions should be original and unpublished work. Each submitted paper will undergo a rigorous review process by three members of the Program Committee. All types of papers must conform to the ICSE submission format and guidelines. All accepted papers will appear in the ACM Digital Library.

Workshop Organizers:

  • Patricia Lago (VU University Amsterdam, The Netherlands)
  • Rick Kazman (University of Hawaii, USA)
  • Niklaus Meyer (Green IT SIG, Swiss Informatics Society, Switzerland)
  • Maurizio Morisio (Politecnico di Torino, Italy)
  • Hausi A. Mueller (University of Victoria, Canada)
  • Frances Paulisch (Siemens Corporate Technology, Germany)
  • Giuseppe Scanniello (Università della Basilicata, Italy)
  • Olaf Zimmermann (IBM Research, Zurich, Switzerland)

Program committee:

  • Marco Aiello, University of Groningen, Netherlands
  • Luca Ardito, Politecnico di Torino, Italy
  • Ioannis Athanasiadis, Democritus Univ. of Thrace, Greece
  • Rami Bahsoon, University College London, UK
  • Ivica Crnkovic, Malardalen University, Sweden
  • Steve Easterbrook, University of Toronto, Canada
  • Hakan Erdogmus, Things Software
  • Anthony Finkelstein, University College London, UK
  • Matthias Galster, University of Groningen, Netherlands
  • Ian Gorton, Pacific Northwest National Laboratory, USA
  • Qing Gu, VU University Amsterdam, Netherlands
  • Wolfgang Lohmann, Informatics and Sustainability Research, Swiss Federal Laboratories for Materials Science and Technology, Switzerland
  • Lin Liu, School of Software, Tsinghua University, China
  • Alessandro Marchetto, Fondazione Bruno Kessler, Italy
  • Henry Muccini, University of L’Aquila, Italy
  • Stefan Naumann, Trier University of Applied Sciences, Environmental Campus, Germany
  • Cesare Pautasso, University of Lugano, Switzerland
  • Barbara Pernici, Politecnico di Milano, Italy
  • Giuseppe Procaccianti, Politecnico di Torino, Italy
  • Filippo Ricca, University of Genova, Italy
  • Antony Tang, Swinburne University of Tech., Australia
  • Antonio Vetro’, Fraunhofer IESE, USA
  • Joost Visser, Software Improvement Group and Knowledge Network Green Software, Netherlands
  • Andrea Zisman, City University London, UK