As today is the deadline for proposing sessions for the AGU fall meeting in December, we’ve submitted a proposal for a session to explore open climate modeling and software quality. If we get the go ahead for the session, we’ll be soliciting abstracts over the summer. I’m hoping we’ll get a lively session going with lots of different perspectives.

I especially want to cover the difficulties of openness as well as the benefits, because we often hear idealistic talk about how open science would make everything so much better. While I think we should always strive to be more open, it’s not a panacea. There’s evidence that open source software isn’t necessarily better quality, and of course there are plenty of people using lack of openness as a political weapon, without acknowledging just how many hard technical problems there are to solve along the way, not least because there’s a lack of consensus over the meaning of openness among its advocates.

Anyway, here’s our session proposal:

TITLE: Climate modeling in an open, transparent world

AUTHORS (FIRST NAME INITIAL LAST NAME): D. A. Randall1, S. M. Easterbrook4, V. Balaji2, M. Vertenstein3

INSTITUTIONS (ALL): 1. Atmospheric Science, Colorado State University, Fort Collins, CO, United States. 2. Geophysical Fluid Dynamics Laboratory, Princeton, NJ, United States. 3. National Center for Atmospheric Research, Boulder, CO, United States. 4. Computer Science, University of Toronto, Toronto, ON, Canada.

Description: This session deals with climate-model software quality and transparent publication of model descriptions, software, and results. The models are based on physical theories but implemented as software systems that must be kept bug-free, readable, and efficient as they evolve with climate science. How do open source and community-based development affect software quality? What are the roles of publication and peer review of the scientific and computational designs in journals or other curated online venues? Should codes and datasets be linked to journal articles? What changes in journal submission standards and infrastructure are needed to support this? We invite submissions including experience reports, case studies, and visions of the future.


  1. Excellent. Might be a reason to go to AGU.

  2. Steve, there are now a number of results (from us and others) on the effect of openness, viz. “many eyeballs”, on software quality. It’s not a straightforward effect: it can go both ways. In particular, I would be VERY cautious about reports from Coverity. They are a commercial outfit, after all, and do have a vested interest. Static analysis tools can have a lot of false positives, and even with true positives, the actual impact on field quality is far from clear. The last time I looked, Coverity in particular did not allow third-party evaluation of their tools.

  3. @Prem Devanbu : Excellent points, thanks! Oh and send papers!

  4. Maybe I will go to AGU again…

  5. Pingback: Another Week of GW News, April 22, 2012 [A Few Things Ill Considered]
