This week, I’m featuring some of the best blog posts written by the students on my first year undergraduate course, PMU199 Climate Change: Software, Science and Society. This post is by Harry, and it first appeared on the course blog on January 29.

Projections from global climate models indicate that continued 21st century increases in greenhouse gas emissions will raise the global average temperature by a few degrees. A global change of just a few degrees could have a huge impact on our planet. Just as a few degrees of global cooling could lead to another ice age, a few degrees of global warming means the world will witness more of one of nature’s most terrifying phenomena.

According to Anthony D. Del Genio, the surface of the earth heats up from sunlight and other thermal radiation, and the energy it accumulates must be offset to maintain a stable temperature. Our planet does this by evaporating water, which condenses and rises upwards with buoyant warm air, carrying excess heat from the surface to higher altitudes. Where updrafts are powerful, water droplets are carried high enough to become supercooled, reaching temperatures between -10 and -40°C. Collisions between these supercooled droplets and soft ice crystals form dense ice pellets called graupel. The densities of graupel and ice crystals, and the electrical charges induced when they collide, are two essential factors in producing what we see as lightning.

Differences in updrafts over ocean and land also affect lightning frequency. Over the course of the day, the oceans absorb heat but their surface temperature hardly rises. Land surfaces, on the other hand, store little heat and so warm up significantly from the start of the day. The air above land is therefore warmer and more buoyant than the air over the oceans, creating strong convective storms as the warm air rises. The powerful updrafts in these convective storms are more prone to generate lightning.

In two experiments using the general circulation model from the Goddard Institute for Space Studies, the first indicated that 4.2°C of global warming would produce a 30% increase in global lightning activity, while the second indicated that 5.9°C of global cooling would cause a 24% decrease in global lightning frequency. Taken together, the experiments suggest roughly a 5-6% change in global lightning frequency for every 1°C of global warming or cooling.
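As a quick sanity check on those figures (the percentages and temperature changes are the ones quoted above; the per-degree rates are just simple division):

```python
# Back-of-the-envelope check of the lightning sensitivity figures quoted above.
warming_dT, warming_change = 4.2, 30.0    # °C of warming, % increase in lightning
cooling_dT, cooling_change = 5.9, 24.0    # °C of cooling, % decrease in lightning

rate_warming = warming_change / warming_dT    # ~7.1% per °C
rate_cooling = cooling_change / cooling_dT    # ~4.1% per °C
average_rate = (rate_warming + rate_cooling) / 2

print(f"warming experiment: {rate_warming:.1f}% per °C")
print(f"cooling experiment: {rate_cooling:.1f}% per °C")
print(f"average:            {average_rate:.1f}% per °C")   # ~5.6%, i.e. roughly 5-6%
```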

If 21st century projections of carbon dioxide and other greenhouse gas emissions hold true, the earth will continue to warm and the oceans will evaporate more water. The drier land surface, unable to evaporate water to the same extent as the oceans, will warm even more. The result should be stronger convective storms and more frequent lightning.

Greater lightning frequency can, in turn, contribute to a warmer earth. Lightning is an abundant source of nitrogen oxides, which are precursors for ozone production in the troposphere. Ozone in the upper troposphere acts as a greenhouse gas, absorbing some of the infrared energy emitted by the earth. Because tropospheric ozone traps some of this escaping heat, the earth warms further and lightning becomes even more frequent: a positive feedback in the climate system. On a per-molecule basis, ozone’s impact on climate is much stronger than that of carbon dioxide, with a radiative forcing effect roughly 1,000 times as powerful. Luckily, tropospheric ozone is far less abundant than carbon dioxide, and its atmospheric lifetime averages only 22 days.

"Climate simulations, which were generated from four Global General Circulation Models (GCM), were used to project forest fire danger levels with relation to global warming."

Although lightning frequency is rising around the world, lightning strikes act on a very local scale, and it is this local effect that has the most impact on people. During a thunderstorm, an increase in lightning frequency puts areas with a high concentration of trees at high risk of forest fire. In Canada, the west-central and north-western woodland areas are major targets for ignition by lightning; in fact, lightning accounted for 85% of the total area burned from 1959-1999. To preserve habitats for animals, and forests for their function as a carbon sink, sustained pressure must be put on governments to minimize forest fires in these regions. Under 21st century projections of rising temperature, that 85% figure could increase dramatically, with much larger areas of forest burned, because surfaces dry out as temperatures rise, producing more “fuel” for the fires.

Although lightning has negative effects on our climate system and on people, it also has positive effects for the earth and for life. The ozone layer, located in the upper atmosphere, prevents ultraviolet light from reaching earth’s surface. Lightning also drives a natural process known as nitrogen fixation. This process is fundamental for life, because fixed nitrogen is required to construct the basic building blocks of life (e.g. nucleotides for DNA and amino acids for proteins).

Lightning is an amazing natural occurrence in our skies. Whether it’s a sight to behold or a thing to be feared, we’ll see more of it as our earth becomes warmer.

This week, I’m featuring some of the best blog posts written by the students on my first year undergraduate course, PMU199 Climate Change: Software, Science and Society. The first is by Terry, and it first appeared on the course blog on January 28.

A couple of weeks ago, Professor Steve was talking about the extra energy that we are adding to the earth system during one of our sessions (and on his blog). He showed us this chart from the last IPCC report in 2007, which summarizes the radiative forcings from different sources:

Notice how aerosols account for most of the negative radiative forcing. But what are aerosols? What is their direct effect, what is their contribution to the cloud albedo effect, and do they have any other impacts?


Ever since I wrote about peak oil last year, I’ve been collecting references to “Peak X”. Of course, the key idea, Hubbert’s Peak, applies to the extraction of any finite resource. So it’s not surprising that wikipedia now has entries on:

And here’s a sighting of a mention of Peak Gold.

Unlike peak oil, some of these curves can be dampened by appropriate recycling. But what of stuff we normally think of as endlessly renewable:

  • Peak Water – it turns out that we haven’t been managing the world’s aquifers and lakes sustainably, despite the fact that that’s where our freshwater supplies come from (See Peter Gleick’s 2010 paper for a diagnosis and possible solutions)
  • Peak Food – similarly, global agriculture appears to be unsustainable, partly because food policy and speculation have wrecked local sustainable farming practices, but also because of population growth (See Jonathan Foley’s 2011 paper for a diagnosis and possible solutions).
  • Peak Fish – although overfishing is probably old news to everyone now.
  • Peak Biodiversity (although here it’s referred to as Peak Nature, which I think is sloppy terminology)

Which also leads to pressure on specific things we really care about, such as:

Then there is a category of things that really do need to peak:

And just in case there’s too much doom and gloom in the above, there are some more humorous contributions:

And those middle two, by the wonderful Randall Munroe, make me wonder what he was doing here:

I can’t decide whether that last one is just making fun of the singularity folks, or whether it’s a clever ruse to get people to realize that Hubbert’s Peak must kick in somewhere!

I was talking with Eric Yu yesterday about a project to use goal modeling as a way of organizing the available knowledge on how to solve a specific problem, and we thought that geo-engineering would make an interesting candidate to try this out on: It’s controversial, a number of approaches have been proposed, there are many competing claims made for them, and it’s hard to sort through such claims.

So, I thought I would gather together the various resources I have on geo-engineering:

Introductory Overviews:

Short commentaries:

Books:

Specific studies / papers:

Sometime in May, I’ll be running a new graduate course, DGC 2003 Systems Thinking for Global Problems. The course will be part of the Dynamics of Global Change graduate program, a cross-disciplinary program run by the Munk School of Global Affairs.

Here’s my draft description of the course:

The dynamics of global change are complex, and demand new ways of conceptualizing and analyzing inter-relationships between multiple global systems. In this course, we will explore the role of systems thinking as a conceptual toolkit for studying the inter-relationships between problems such as globalization, climate change, energy, health & wellbeing, and food security. The course will explore the roots of systems thinking, for example in General Systems Theory, developed by Ludwig von Bertalanffy to study biological systems, and in Cybernetics, developed by Norbert Wiener to explore feedback and control in living organisms, machines, and organizations. We will trace this intellectual history to recent efforts to understand planetary boundaries, tipping points in the behaviour of global dynamics, and societal resilience. We will explore the philosophical roots of systems thinking as a counterpoint to the reductionism used widely across the natural sciences, and look at how well it supports multiple perspectives, trans-disciplinary synthesis, and computational modeling of global dynamics. Throughout the course, we will use global climate change as a central case study, and apply systems thinking to study how climate change interacts with many other pressing global challenges.

I’m planning to get the students to think about issues such as the principle of complementarity, and second-order cybernetics, and of course, how to understand the dynamics of non-linear systems, and the idea of leverage points. We’ll take a quick look at how earth system models work, but not in any detail, because it’s not intended to be a physics or computing course; I’m expecting most of the students to be from political science, education, etc.

The hard part will be picking a good core text. I’m leaning towards Donella Meadows’s book, Thinking in Systems, although I just received my copy of the awesome book Systems Thinkers, by Magnus Ramage and Karen Shipp (I’m proud to report that Magnus was once a student of mine!).

Anyway, suggestions for material to cover, books & papers to include, etc are most welcome.

Our paper on defect density analysis of climate models is now out for review at the journal Geoscientific Model Development (GMD). GMD is an open review / open access journal, which means the review process is publicly available (anyone can see the submitted paper, the reviews it receives during the process, and the authors’ response). If the paper is eventually accepted, the final version will also be freely available.

The way this works at GMD is that the paper is first published to Geoscientific Model Development Discussions (GMDD) as an un-reviewed manuscript. The interactive discussion is then open for a fixed period (in this case, 2 months). At that point the editors will make a final accept/reject decision, and, if accepted, the paper is then published to GMD itself. During the interactive discussion period, anyone can post comments on the paper, although in practice, discussion papers often only get comments from the expert reviewers commissioned by the editors.

One of the things I enjoy about the peer-review process is that a good, careful review can help improve the final paper immensely. As I’ve never submitted before to a journal that uses an open review process, I’m curious to see how the open reviewing will help – I suspect (and hope!) it will tend to make reviewers more constructive.

Anyway, here’s the paper. As it’s open review, anyone can read it and make comments (click the title to get to the review site):

Assessing climate model software quality: a defect density analysis of three models

J. Pipitone and S. Easterbrook
Department of Computer Science, University of Toronto, Canada

Abstract. A climate model is an executable theory of the climate; the model encapsulates climatological theories in software so that they can be simulated and their implications investigated. Thus, in order to trust a climate model one must trust that the software it is built from is built correctly. Our study explores the nature of software quality in the context of climate modelling. We performed an analysis of defect reports and defect fixes in several versions of leading global climate models by collecting defect data from bug tracking systems and version control repository comments. We found that the climate models all have very low defect densities compared to well-known, similarly sized open-source projects. We discuss the implications of our findings for the assessment of climate model software trustworthiness.
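For readers unfamiliar with the metric, here’s a minimal sketch of the kind of calculation a defect density analysis rests on. The numbers and project names below are invented for illustration; the actual data and methodology are in the paper:

```python
# Toy illustration of a defect density calculation (defects per thousand lines
# of code, KLOC). All numbers and names below are invented for the example;
# the real figures and methodology are in the paper.

projects = {
    # name: (defects reported over the study period, lines of code)
    "climate_model_A": (120, 400_000),
    "climate_model_B": (250, 800_000),
    "open_source_C":   (4000, 500_000),
}

for name, (defects, sloc) in projects.items():
    density = defects / (sloc / 1000)   # defects per KLOC
    print(f"{name}: {density:.2f} defects/KLOC")
```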

I was talking to the folks at our campus sustainability office recently, and they were extolling the virtues of Green Revolving Funds. The idea is ridiculously simple, but turns out to be an important weapon in making sure that the savings from energy efficiency don’t just disappear back into the black hole of University operational budgets. Once the fund is set up, it provides money for the capital costs of energy efficiency projects, so that they don’t have to compete with other kinds of projects for scarce capital. The money saved from reduced utility bills is then ploughed back into the fund to support more such projects. And the beauty of the arrangement is that you don’t then have to go through endless bureaucracy to get new projects going. According to wikipedia, this arrangement is increasingly common across university campuses in the US and Canada.
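To make the mechanism concrete, here’s a toy simulation of how such a fund might evolve. The numbers (initial capital, project costs, annual savings) are entirely made up for illustration:

```python
# Toy simulation of a green revolving fund. Each year the fund pays the capital
# cost of any projects it can afford; the resulting utility-bill savings flow
# back into the fund every year thereafter. All figures are invented.

initial_fund = 500_000          # starting capital ($)
projects = [                    # (capital cost $, annual savings $)
    (200_000, 60_000),
    (150_000, 40_000),
    (300_000, 90_000),
    (250_000, 70_000),
]

fund = initial_fund
annual_savings = 0
for year in range(1, 11):
    fund += annual_savings                 # savings replenish the fund
    remaining = []
    for cost, savings in projects:         # fund any projects we can now afford
        if cost <= fund:
            fund -= cost
            annual_savings += savings
        else:
            remaining.append((cost, savings))
    projects = remaining
    print(f"year {year}: fund balance ${fund:,.0f}, "
          f"recurring savings ${annual_savings:,.0f}/yr")
```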

So I’m delighted to see the Toronto District School Board (TDSB) is proposing a revolving fund for this too. Here’s the motion to be put to the TDSB’s Operations and Facilities Management Committee next week. Note the numbers in there about savings already realised:

Motion – Environmental Legacy Fund

Whereas in February 2010, the Board approved the Go Green: Climate Change Action Plan that includes the establishment of an Environmental Advisory Committee and the Environmental Legacy Fund;

Whereas the current balance of the Environmental Legacy Fund includes revenues from the sale of carbon credits accrued through energy efficiency projects and from Feed-in-Tariff payments accruing from nine Ministry-funded solar electricity projects;

Whereas energy efficiency retrofit projects completed since 1990/91 have resulted in an estimated 33.9% reduction in greenhouse gas emissions to date and lowered the TDSB’s annual operating costs significantly, saving the Board $22.43 million in 2010/11 alone; and

Whereas significant energy efficiency and renewable energy opportunities remain available to the TDSB which can provide robust annual operational savings, new revenue streams as well as other benefits including increasing the comfort and health of the Board’s learning spaces;

Therefore, the Environmental Advisory Committee recommends that:

  1. The Environmental Legacy Fund be utilized in a manner that advances projects that directly and indirectly reduce the Board’s greenhouse gas (GHG) emissions and lower the TDSB’s long-term operating costs;
  2. The Environmental Legacy Fund be structured and operated as a revolving fund;
  3. The Environmental Legacy Fund be replenished and augmented from energy cost savings achieved, incentives and grant revenues secured for energy retrofit projects and renewable energy projects, and an appropriate level of renewable energy revenue as determined by the Board;
  4. The TDSB establish criteria for how and how much of the Environmental Legacy Fund can be used to advance environmental initiatives that have demonstrated GHG reduction benefits but may not provide a short-term financial return and opportunity for replenishing the Fund.
  5. To ensure transparency and document success, the Board issue an annual financial statement, on the Environmental Legacy Fund along with a report on the energy and GHG savings attributable to projects financed by the Fund.

The 12th Annual Weblog (Bloggies) awards shortlists are out. This year, they have merged the old categories of “Best Science weblog” and “Best Computer or Technology Weblog” into a single category, “Best Science or Technology Weblog“. And the five candidates on the shortlist? Four technology blogs and one rabid anti-science blog.

Not that this award ever had any track record for being able to distinguish science from pseudo-science; the award is legendary for vote-stuffing. But this year it has really stooped to new depths.

This term, I’m running my first year seminar course, “Climate Change: Software, Science and Society” again. The outline has changed a little since last year, but the overall goals of the course are the same: to take a small, cross-disciplinary group of first year undergrads through some of the key ideas in climate modeling.

As last year, we’re running a course blog, and the first assignment is to write a blog post for it. Please feel free to comment on the students’ posts, but remember to keep your comments constructive!

Update (Aug 15, 2013): After a discussion on Twitter with Gavin Schmidt, I realised I did the calculation wrong. The reason is interesting: I’d confused radiative forcing with the current energy imbalance at the top of the atmosphere. A rookie mistake, but it shows that climate science can be tricky to understand, and it *really* helps to be able to talk to experts when you’re learning it… [I’ve marked the edits in green]

I’ve been meaning to do this calculation for ages, and finally had an excuse today, as I need it for the first year course I’m teaching on climate change. The question is: how much energy are we currently adding to the earth system due to all those greenhouse gases we’ve added to the atmosphere?

In the literature, the key concept is anthropogenic forcing, by which is meant the extent to which human activities are affecting the energy balance of the earth. When the Earth’s climate is stable, it’s because the planet is in radiative balance, meaning the incoming radiation from the sun and the outgoing radiation from the earth back into space are equal. A planet that’s in radiative balance will generally stay at the same (average) temperature because it’s not gaining or losing energy. If we force it out of balance, then the global average temperature will change.

Physicists express radiative forcing in watts per square meter (W/m2), meaning the number of extra watts of power that the earth is receiving, for each square meter of the earth’s surface. Figure 2.4 from the last IPCC report summarizes the various radiative forcings from different sources. The numbers show best estimates of the overall change from 1750 to 2005 (note the whiskers, which express uncertainty – some of these values are known much better than others):

If you add up the radiative forcing from greenhouse gases, you get a little over 2.5 W/m2. Of course, you also have to subtract the negative forcings from clouds and aerosols (tiny particles of pollution, such as sulphur dioxide), as these have a cooling effect because they block some of the incoming radiation from the sun. So we can look at the forcing that’s just due to greenhouse gases (about 2.5 W/m2), or we can look at the total net anthropogenic forcing that takes into account all the different effects (which is about 1.6 W/m2).

Over the period covered by the chart, 1750-2005, the earth warmed somewhat in response to this radiative forcing. The total incoming energy has increased by about +1.6 W/m2, but the total outgoing energy lost to space has also risen – a warmer planet loses energy faster. The current imbalance between incoming and outgoing energy at the top of the atmosphere is therefore smaller than the total change in forcing over time. Hansen et al. give an estimate of the energy imbalance of 0.58 ± 0.15 W/m2 for the period from 2005-2010.

The problem I have with these numbers is that they don’t mean much to most people. Some people try to explain it by asking people to imagine adding a 2 watt light bulb (the kind you get in Christmas lights) over each square meter of the planet, which is on continuously day and night. But I don’t think this really helps much, as most people (including me) do not have a good intuition for how many square meters the surface of Earth has, and anyway, we tend to think of a Christmas tree light bulb as using a trivially small amount of power. According to wikipedia, the Earth’s surface is 510 million square kilometers, which is 510 trillion square meters.

So, doing the maths, that gives us a change in incoming energy of about 1,200 trillion watts (1.2 petawatts) for just the anthropogenic greenhouse gases, or about 0.8 petawatts overall when we subtract the cooling effect of changes in clouds and aerosols. But some of this extra energy is being lost back into space. Based on the current energy imbalance, the planet is gaining about 0.3 petawatts at the moment.

But how big is a petawatt? A petawatt is 10¹⁵ watts. Wikipedia tells us that the average total global power consumption of the human world in 2010 was about 16 terawatts (1 petawatt = 1000 terawatts). So, human energy consumption is dwarfed by the extra energy currently being absorbed by the planet due to climate change: the planet is currently gaining about 18 watts of extra power for each 1 watt of power humans actually use.

Note: Before anyone complains, I’ve deliberately conflated energy and power above, because the difference doesn’t really matter for my main point. Power is work per unit of time, and is measured in watts; Energy is better expressed in joules, calories, or kilowatt hours (kWh). To be technically correct, I should say that the earth is getting about 300 terawatt hours of energy per hour due to anthropogenic climate change, and humans use about 16 terawatt hours of energy per hour. The ratio is still approximately 18.

Out of interest, you can also convert it to calories. 1 kWh is about 0.8 million calories. So, we’re force-feeding the earth about 2 × 10¹⁷ (200,000,000,000,000,000) calories every hour. Yikes.
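For anyone who wants to check the arithmetic, here’s the whole calculation as a short Python script. The input figures are the ones quoted above; everything else is just unit conversion:

```python
# The energy-imbalance arithmetic from this post, as a script. Input figures
# are those quoted above (IPCC AR4 forcings, the Hansen et al. imbalance
# estimate, and Wikipedia's surface area and human power consumption).

earth_surface_m2 = 510e12    # 510 million km^2 = 510 trillion m^2
ghg_forcing      = 2.5       # W/m^2, greenhouse gases only
net_forcing      = 1.6       # W/m^2, net anthropogenic forcing
energy_imbalance = 0.58      # W/m^2, current top-of-atmosphere imbalance
human_power      = 16e12     # W (about 16 terawatts in 2010)

ghg_power  = ghg_forcing * earth_surface_m2       # ~1.3e15 W (~1.2-1.3 PW)
net_power  = net_forcing * earth_surface_m2       # ~0.8e15 W (~0.8 PW)
gain_power = energy_imbalance * earth_surface_m2  # ~0.3e15 W (~0.3 PW)

print(f"GHG forcing:         {ghg_power / 1e15:.2f} PW")
print(f"Net forcing:         {net_power / 1e15:.2f} PW")
print(f"Current energy gain: {gain_power / 1e15:.2f} PW")
print(f"Ratio to human use:  {gain_power / human_power:.0f}x")   # ~18

# And the calorie version: energy gained per hour, in (small) calories.
joules_per_hour = gain_power * 3600        # watts * seconds in an hour
calories = joules_per_hour / 4.184         # ~2.5e17, i.e. roughly 2 x 10^17
print(f"Calories per hour:   {calories:.1e}")
```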

We’re running a new weekly lecture series this term to explore different disciplinary perspectives on climate change, entitled “Collaborative Challenges for the Climate Change Research Community”, sponsored by the department of Computer Science and the Centre for Environment. Our aim is to use this as an exploration of the range of research related to climate change across the University of Toronto, and to inspire new collaborations. A central theme of the series is the role of computational climate models: how researchers share models, verify models, create models, and share results. But we also want to explore beyond models, so we’ll be looking at ethics, policy, education, and community-based responses to climate change.

The lectures will be on Monday afternoons, at 3pm, starting on January 16th, in the Bahen Centre, 40 St George Street, Toronto, room BA1220. The lectures are public and free to attend.

The first four speakers have been announced (I’ll be giving the opening talk):

  • Jan 16th: Computing the Climate: the Evolution of Climate Models – Steve Easterbrook, Dept of Computer Science
  • Jan 23rd: Building Community Resilience: A Viable Response to Climate Change and Other Emerging Challenges to Health Equity? – Blake Poland, Dalla Lana School of Public Health
  • Jan 30th: Constraining fast and slow climate feedbacks with computer models – Danny Harvey, Dept of Geography
  • Feb 6th: Urban GHG Modelling Using Agent-Based Microsimulation – Eric Miller, Dept of Civil Engineering & Cities Centre

For more details, see the C4RC website.

One of the things that strikes me about discussions of climate change, especially from those who dismiss it as relatively harmless, is a widespread lack of understanding on how non-linear systems behave. Indeed, this seems to be one of the key characteristics that separate those who are alarmed at the prospect of a warming climate from those who are not.

At the AGU meeting this month, Kerry Emanuel presented a great example of this in his talk on “Hurricanes in a Warming Climate”. I only caught his talk by chance, as I was slipping out of the session in the next room, but I’m glad I did, because he made an important point about how we think about the impacts of climate change, and in particular, showed two graphs that illustrate the point beautifully.

Kerry’s talk was an overview of a new study that estimates changes in damage from tropical cyclones with climate change, using a new integrated assessment model. The results are reported in detail in a working paper at the World Bank. The report points out that the link between hurricanes and climate change remains controversial. So, while Atlantic hurricane power has more than doubled over the last 30 years, and model forecasts show an increase in the average intensity of hurricanes in a warmer world, there is still no clear statistical evidence of a trend in damages caused by these storms, and hence a great deal of uncertainty about future trends.

The analysis is complicated by several factors:

  • Increasing insurance claims from hurricane damage in the US have a lot to do with growing economic activity in vulnerable regions. Indeed, expected economic development in the regions subject to tropical storm damage means that there are certain to be big increases in damage even if there were no warming at all.
  • The damage is determined more by when and where each storm makes landfall than it is by the intensity of the storm.
  • There simply isn’t enough data to detect trends. More than half of the economic damage due to hurricanes in the US since 1870 was caused by just 8 storms.

The new study by Emanuel and colleagues overcomes some of these difficulties by simulating large numbers of storms. They took the outputs of four different Global Climate Models, using the A1B emissions scenario, and fed them into a cyclone generator model to simulate thousands of storms, comparing the characteristics of these storms with those that have caused damage in the US in the last few decades, and then adjusting the damage estimates according to anticipated changes in population and economic activity in the areas impacted (for details, see the report).

The first thing to note is that the models forecast only a small change in hurricanes, typically a slight decrease in medium-strength storms and a slight increase in more intense storms. For example, at first sight, the MIROC model indicates almost no difference:

Probability density for storm damage on the US East Coast, generated from the MIROC model for current conditions vs. the year 2100, under the A1B scenario, for which this model forecasts a global average temperature increase of around 4.5°C. Note that the x axis is a logarithmic scale: 8 means $100 million, 9 means $1 billion, 10 means $10 billion, etc. (source: Figure 9 in Mendelsohn et al, 2011)

Note particularly that at the peak of the graph, the model shows a very slight reduction in the number of storms (consistent with a slight decrease in the overall frequency of hurricanes), while on the upper tail, the model shows a very slight increase (consistent with a forecast that there’ll be more of the most intense storms). The other three models show slightly bigger changes by the year 2100, but overall, the graphs seem very comforting. It looks like we don’t have much to worry about (at least as far as hurricane damage from climate change is concerned). Right?

The problem is that the long tail is where all the action is. The good news is that there appears to be a fundamental limit on storm intensity, so the tail doesn’t really get much longer. But the problem is that it only takes a few more of these very intense storms to make a big difference in the amount of damage caused. Here’s what you get if you multiply the probability by the damage in the above graph:

Changing risk of hurricane damage due to climate change. Calculated as probability times impact. (Source: courtesy of K. Emanuel, from his AGU 2011 talk)

That tiny change in the long tail generates a massive change in the risk, because the system is non-linear. If most of the damage is done by a few very intense storms, then you only need a few more of them to greatly increase the damage. Note, in particular, what happens at 12 on the damage scale – these are trillion dollar storms. [Update: Kerry points out that the total hurricane damage is proportional to the area under the curves of the second graph].
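To see why the tail dominates, here’s a toy numerical illustration of the probability-times-impact calculation. The distributions below are invented purely to show the effect; they are not taken from the study:

```python
# Toy illustration of why slightly fattening the long tail has a huge effect
# on expected damage. The distributions are invented for illustration only.
import math

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# x is log10(damage in dollars), as in the figures above (8 = $100M, 12 = $1T).
xs = [8 + 0.01 * i for i in range(401)]
dx = 0.01

def expected_damage(pdf):
    # crude numerical integral of probability * damage over the grid
    return sum(pdf(x) * 10 ** x * dx for x in xs)

def baseline(x):
    # most storms cluster around $1bn of damage
    return normal_pdf(x, 9.0, 0.6)

def warmer(x):
    # same distribution, but 2% of the probability mass shifts to very intense storms
    return 0.98 * normal_pdf(x, 9.0, 0.6) + 0.02 * normal_pdf(x, 11.5, 0.3)

print(f"baseline expected damage:    ${expected_damage(baseline):,.0f}")
print(f"slightly fatter tail:        ${expected_damage(warmer):,.0f}")
print(f"ratio: {expected_damage(warmer) / expected_damage(baseline):.1f}x")
# The bulk of the distribution barely moves, but expected damage jumps several-fold.
```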

The key observation here is that the things that matter most to people (e.g. storm damage) do not change linearly as the climate changes. That’s why people who understand non-linear systems tend to worry much more about climate change than people who do not.

Here’s the call for papers for a workshop we’re organizing at ICSE next May:

The First International Workshop on Green and Sustainable Software (GREENS’2012)

In conjunction with the 34th International Conference on Software Engineering (ICSE 2012), Zurich, Switzerland, June 2-9, 2012

Important Dates:

  • 17th February 2012 – paper submission
  • 19th March 2012 – notification of acceptance
  • 29th March 2012 – camera-ready
  • 3rd June 2012 – workshop

Workshop theme and goals: The Focus of the GREENS workshop is the engineering of green and sustainable software. Our goal is to bring together academics and practitioners to discuss research initiatives, challenges, ideas, and results in this critically important area of the software industry. To this end GREENS will both discuss the state of the practice, especially at the industrial level, and define a roadmap, both for academic research and for technology transfer to industry. GREENS seeks contributions addressing, but not limited to, the following list of topics:

Concepts and foundations:

  • Definition of sustainability properties (e.g. energy and power consumption, greenhouse gas emissions, waste and pollutants production), their relationships, their units of measure, their measurement procedures in the context of software-intensive systems, their relationships with other properties (e.g. response time, latency, cost, maintainability);
  • Green architectural knowledge, green IT strategies and design patterns;

Greening domain-specific software systems:

  • Energy-awareness in mobile software development;
  • Mobile software systems scalability in low-power situations;
  • Energy-efficient techniques aimed at optimizing battery consumption;
  • Large and ultra-large scale green information systems design and development (including inter-organizational effects)

Greening of IT systems, data and web centers:

  • Methods and approaches to improve sustainability of existing software systems;
  • Customer co-creation strategies to motivate behavior changes;
  • Virtualization and offloading;
  • Green policies, green labels, green metrics, key indicators for sustainability and energy efficiency;
  • Data center and storage optimization;
  • Analysis, assessment, and refactoring of source code to improve energy efficiency;
  • Workload balancing;
  • Lifecycle Extension

Greening the process:

  • Methods to design and develop greener software systems;
  • Managerial and technical risks for a sustainable modernization;
  • Quality & risk assessments, tradeoff analyses between energy efficiency, sustainability and traditional quality requirements;

Case studies, industry experience reports and empirical studies:

  • Empirical data and analysis about sustainability properties, at various granularity levels: complete infrastructure, or nodes of the infrastructure (PCs, servers, and mobile devices);
  • Studies to define technical and economic models of green aspects;
  • Return on investment of greening projects, reasoning about the triple bottom line of people, planet and profits;
  • Models of energy and power consumption, at various granularity levels;
  • Benchmarking of power consumption in software applications;

Guidelines for Submission: We are soliciting papers in two distinct categories:

  1. Research papers describing innovative and significant original research in the field (maximum 8 pages);
  2. Industrial papers describing industrial experience, case studies, challenges, problems and solutions (maximum 8 pages).

Please submit your paper online through EasyChair (see the GREENS website). Submissions should be original and unpublished work. Each submitted paper will undergo a rigorous review process by three members of the Program Committee. All types of papers must conform to the ICSE submission format and guidelines. All accepted papers will appear in the ACM Digital Library.

Workshop Organizers:

  • Patricia Lago (VU University Amsterdam, The Netherlands)
  • Rick Kazman (University of Hawaii, USA)
  • Niklaus Meyer (Green IT SIG, Swiss Informatics Society, Switzerland)
  • Maurizio Morisio (Politecnico di Torino, Italy)
  • Hausi A. Mueller (University of Victoria, Canada)
  • Frances Paulisch (Siemens Corporate Technology, Germany)
  • Giuseppe Scanniello (Università della Basilicata, Italy)
  • Olaf Zimmermann (IBM Research, Zurich, Switzerland)

Program committee:

  • Marco Aiello, University of Groningen, Netherlands
  • Luca Ardito, Politecnico di Torino, Italy
  • Ioannis Athanasiadis, Democritus Univ. of Thrace, Greece
  • Rami Bahsoon, University College London, UK
  • Ivica Crnkovic, Malardalen University, Sweden
  • Steve Easterbrook, University of Toronto, Canada
  • Hakan Erdogmus, Things Software
  • Anthony Finkelstein, University College London, UK
  • Matthias Galster, University of Groningen, Netherlands
  • Ian Gorton, Pacific Northwest National Laboratory, USA
  • Qing Gu, VU University Amsterdam, Netherlands
  • Wolfgang Lohmann, Informatics and Sustainability Research, Swiss Federal Laboratories for Materials Science and Technology, Switzerland
  • Lin Liu, School of Software, Tsinghua University, China
  • Alessandro Marchetto, Fondazione Bruno Kessler, Italy
  • Henry Muccini, University of L’Aquila, Italy
  • Stefan Naumann, Trier University of Applied Sciences, Environmental Campus, Germany
  • Cesare Pautasso, University of Lugano, Switzerland
  • Barbara Pernici, Politecnico di Milano, Italy
  • Giuseppe Procaccianti, Politecnico di Torino, Italy
  • Filippo Ricca, University of Genova
  • Antony Tang, Swinburne University of Tech., Australia
  • Antonio Vetrò, Fraunhofer IESE, USA
  • Joost Visser, Software Improvement Group and Knowledge Network Green Software, Netherlands
  • Andrea Zisman, City University London, UK

A number of sessions at the AGU meeting this week discussed projects to improve climate literacy among different audiences:

  • The Climate Literacy and Energy Awareness Network (CLEANET), are developing concept maps for use in middle school and high school, along with a large set of pointers to educational resources on climate and energy for use in the classroom.
  • The Climate Literacy Zoo Education Network (CliZEN). Michael Mann (of Hockey Stick fame) talked about this project, which was a rather nice, uplifting change from hearing about his experiences with political attacks on his work. This is a pilot effort, currently involving ten zoos, mainly in the northeast US. So far, they have completed a visitor survey across a network of zoos, plus some aquaria, exploring the views of visitors on climate change, using the categories of the Six Americas report. The data they have collected show that zoo visitors tend to be more skewed towards the “alarmed” category compared to the general US population. [Incidentally, I’m impressed with their sample size: 3,558 responses. The original Six Americas study only had 981, and most surveys in my field have much smaller sample sizes]. The next steps in the project are to build on this audience analysis to put together targeted information and education material that links what we know about climate change with its impact on specific animals at the zoos (especially polar animals).
  • The Climate Interpreter Project. Bill Spitzer from the New England Aquarium talked about this project. Bill points out that aquaria (and museums, zoos, etc.) have an important role to play, because people come for the experience, which must be enjoyable, but they do expect to learn something, and they do trust museums and zoos to provide them with accurate information. This project focusses on the role of interpreters and volunteers, who are important because they tend to be more passionate, more knowledgeable, and come into contact with many people. But many interpreters are not yet comfortable talking about issues around climate change. They need help, and training. Interpretation isn’t just transmission of information. It’s about translating science in a way that’s meaningful and resonates with an audience. It requires a systems perspective. The strategy adopted by this project is to begin with audience research, to understand people’s interests and passions; connect this with the cognitive and social sciences on how people learn, and how they make sense of what they’re hearing; and finally to make use of strategic framing, which gets away from the ‘crisis’ frame that dominates most news reporting (on crime, disasters, fires), but which tends to leave people feeling overwhelmed, which leads them to treat it as someone else’s problem. Thinking explicitly about framing allows you to connect information about climate change with people’s values, with what they’re passionate about, and even with their sense of self-identity. The website climateinterpreter.org describes what they’ve learnt so far.
    (As an aside, Bill points out that it can’t just be about training the interpreters – you need institutional support and leadership, if they are to focus on a controversial issue. Which got me thinking about why science museums tend to avoid talking much about climate change – it’s easy for the boards of directors to avoid the issue, because of worries about whether it’s politically sensitive, and hence might affect fundraising.)
  • The WorldViews Network. Rachel Connolly from Nova/WGBH presented this collaboration between museums, scientists and TV networks. Partners include planetariums and groups interested in data visualization, GIS data and mapping, many from an astronomy background. Their 3-pronged approach, called TPACK, identifies three types of knowledge: technological, pedagogical, and content knowledge. The aim is to take people from seeing, to knowing, to doing. For example, they might start with a dome presentation, but bring into it live and interactive web resources, and then move on to community dialogues. Storylines use visualizations that move seamlessly across scales: cosmic, global, bioregional. They draw a lot on the ideas from Rockstrom’s planetary boundaries, of which they’re focussing on three: climate, biodiversity loss and ocean acidification. A recent example from Denver, in May, focussed on water. On the cosmic scale, they look at where water comes from as planets are formed. They eventually bring this down to the bioregional scale, looking at the watershed for Denver, and the pressures on the Colorado River. Good visual design is a crucial part of the project. (Rachel showed a neat example of a visualization of the volume of water on the planet, comparing all water with fresh water and frozen water. Another fascinating example was a satellite picture of the border of Egypt and Israel, where the different water management strategies on either side of the border produce a starkly visible difference. Rachel also recommended Sciencecafes.org and the Buckminster Fuller Challenge.)
  • ClimateCommunication.org. There was a lot of talk through the week about this project, led by Susan Hassol and Richard Somerville, especially their recent paper in Physics Today, which explores the use of jargon, and how it can mislead the general public. The paper went viral on the internet shortly after it was published, and they used an open google doc to collect many more examples. Scientists are often completely unaware that non-specialists attach different meanings to jargon terms, which can then become a barrier to communication. My favourite examples from Susan’s list are “aerosol”, which to the public means a spray can (leading to a quip by Glenn Beck, who had heard that aerosols cool the planet); ‘enhanced’, which the public understands as ‘made better’, so the ‘enhanced greenhouse effect’ sounds like a good thing; and ‘positive feedback’, which also sounds like a good thing, as it suggests a reward for doing something good.
  • Finally, slightly off topic, but I was amused by the Union of Concerned Scientists’ periodic table of political interferences in science.

On Thursday, Kaitlin presented her poster at the AGU meeting, which shows the results of the study she did with us in the summer. Her poster generated a lot of interest, especially the visualizations she has of the different model architectures. Click on thumbnail to see the full poster at the AGU site:

A few things to note when looking at the diagrams:

  • Each diagram shows the components of a model, scaled to their relative size by lines of code. However, the models are not to scale with one another, as the smallest, UVic’s, is only a tenth of the size of the biggest, CESM. Someone asked what accounts for that difference in size. Well, the UVic model is an EMIC rather than a GCM. It has a very simplified atmosphere model that does not include atmospheric dynamics, which makes it easier to run for very long simulations (e.g. to study paleoclimate). On the other hand, CESM is a community model, with a large number of contributors across the scientific community. (See Randall and Held’s point/counterpoint article in last month’s IEEE Software for a discussion of how these fit into different model development strategies).
  • The diagrams show the couplers (in grey), again sized according to number of lines of code. A coupler handles data re-gridding (when the scientific components use different grids), temporal aggregation (when the scientific components run on different time steps) along with other data handling. These are often invisible in diagrams the scientists create of their models, because they are part of the infrastructure code; however Kaitlin’s diagrams show how substantial they are in comparison with the scientific modules. The European models all use the same coupler, following a decade-long effort to develop this as a shared code resource.
  • Note that there are many different choices associated with the use of a coupler, as sometimes it’s easier to connect components directly rather than through the coupler, and the choice may be driven by performance impact, flexibility (e.g. ‘plug-and-play’ compatibility) and legacy code issues. Sea ice presents an interesting example, because its extent varies over the course of a model run. So somewhere there must be code that keeps track of which grid cells have ice, and then routes the fluxes from ocean and atmosphere to the sea ice component for those grid cells (there’s a toy sketch of this kind of routing after this list). This could be done in the coupler, or in any of the three scientific modules. In the GFDL model, sea ice is treated as an interface to the ocean, so all atmosphere-ocean fluxes pass through it, whether there’s ice in a particular cell or not.
  • The relative size of the scientific components is a reasonable proxy for functionality (or, if you like, scientific complexity/maturity). Hence, the diagrams give clues about where each lab has placed its emphasis in terms of scientific development, whether by deliberate choice, or because of availability (or unavailability) of different areas of expertise. The differences between the models from different labs show some strikingly different choices here, for example between models that are clearly atmosphere-centric, versus models that have a more balanced set of earth system components.
  • One comment we received in discussions around the poster was about the places where we have shown sub-components in some of the models. Some modeling groups are more explicit about naming the sub-components, and indicating them in the code. Hence, our ability to identify these might be more dependent on naming practices rather than any fundamental architectural differences.
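As a footnote to the coupling point above, here’s a toy sketch of what routing fluxes through a sea ice component might look like. It is entirely invented for illustration (the grid, variable names and masking rule are all assumptions), and is not code from any of the models Kaitlin studied:

```python
# Toy sketch of routing atmosphere-ocean fluxes through a sea ice component
# only where ice is present. Entirely invented for illustration; real couplers
# also handle re-gridding, time-step aggregation, conservation checks, etc.
import numpy as np

n = 4                                       # a tiny 4x4 grid for illustration
atmos_flux = np.random.rand(n, n)           # heat flux from the atmosphere (arbitrary units)
ice_fraction = np.array([[0.0, 0.0, 0.3, 0.9],
                         [0.0, 0.1, 0.6, 1.0],
                         [0.0, 0.0, 0.2, 0.8],
                         [0.0, 0.0, 0.0, 0.5]])

ice_mask = ice_fraction > 0.0

# Flux routed to the sea ice component (only in ice-covered cells),
# with the remainder passed straight to the open ocean.
flux_to_ice   = np.where(ice_mask, atmos_flux * ice_fraction, 0.0)
flux_to_ocean = atmos_flux - flux_to_ice

print("cells handled by the sea ice component:", int(ice_mask.sum()))
print("total flux conserved:", np.allclose(atmos_flux, flux_to_ice + flux_to_ocean))
```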

I’m sure Kaitlin will blog more of her reflections on the poster (and AGU in general) once she’s back home.