I spent some time this week explaining to my undergraduate class the ideas of thermal equilibrium (loosely speaking, the point at which the planet’s incoming solar radiation and outgoing blackbody radiation are in balance) and climate sensitivity (loosely speaking, how much warmer the earth will get per doubling of CO2, until it reaches a new equilibrium). I think some of my students might prefer me to skip the basic physics, and get on quicker to the tough questions of what solutions there are to climate change, whether geo-engineering will work, and the likely impacts around the world.
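As a rough back-of-envelope illustration of what "in balance" means here, this tiny calculation uses standard textbook values for the solar constant and planetary albedo (illustrative numbers only, nothing specific to this post):

```python
# Planetary energy balance: absorbed solar radiation = outgoing blackbody radiation.
# The numbers are standard textbook values, used purely for illustration.

SOLAR_CONSTANT = 1361.0  # W/m^2, sunlight arriving at the top of the atmosphere
ALBEDO = 0.3             # fraction of that sunlight reflected straight back to space
SIGMA = 5.67e-8          # Stefan-Boltzmann constant, W/(m^2 K^4)

absorbed = SOLAR_CONSTANT * (1 - ALBEDO) / 4   # averaged over the whole sphere
t_eff = (absorbed / SIGMA) ** 0.25             # temperature at which outgoing = absorbed

print(f"absorbed solar flux:   {absorbed:.0f} W/m^2")  # ~238 W/m^2
print(f"effective temperature: {t_eff:.0f} K")         # ~255 K; the actual surface is
                                                       # ~33 K warmer, thanks to the
                                                       # greenhouse effect
```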
So it’s nice to be reminded that a good grasp of the basic science is important. A study produced by the Argentinean group Federacion Ecologia Universal, and published on the American Association for the Advancement of Science website, looked at the likely impact of climate change on global food supplies by the year 2020, concluding that global food prices will rise by up to 20%, and that some countries, such as India, will see crop yields drop by as much as 30%. The study claims to have used IPCC temperature projections of up to a 2.4°C rise in global average temperatures by 2020 on a business-as-usual scenario.
The trouble is the IPCC doesn’t have temperature projections anywhere near this high for 2020. As Scott Mandia explains, it looks like the author of the report made a critical (but understandable) mistake, confusing the two ways of understanding ‘climate sensitivity’:
- Equilibrium Climate Sensitivity (ECS), which means the overall eventual global temperature rise that would result if we doubled the level of CO2 concentrations in the atmosphere.
- Transient Climate Response (TCR), which means the actual temperature rise the planet will have experienced at the time this doubling happens.
These are different quantities because of lags in the system. It takes many years (perhaps decades) for the earth to reach a new equilibrium whenever we increase the concentrations of greenhouse gases, because most of the extra energy is initially absorbed by the oceans, and it takes a long time for the oceans and atmosphere to settle into a new balance. By global temperature, scientists normally mean the average air temperature measured just above the surface (which is probably where temperature matters most to humans).
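To make the lag concrete, here is a minimal "one-box" energy balance sketch: the ocean’s heat capacity means the surface temperature approaches its new equilibrium only over decades. All the numbers (heat capacity, feedback parameter, forcing) are round illustrative assumptions, not output from any real model:

```python
# Minimal one-box energy balance model: C * dT/dt = F - lambda * T.
# All parameter values are round-number assumptions for illustration only.

C = 8.0e8      # J/(m^2 K): effective heat capacity of ~200 m of ocean
LAMBDA = 1.25  # W/(m^2 K): climate feedback parameter (~3 K per CO2 doubling)
F = 3.7        # W/m^2: forcing from a sudden doubling of CO2

dt = 86400.0 * 30  # one-month time step, in seconds
T = 0.0
temps = []
for month in range(100 * 12):        # run for 100 years
    T += dt * (F - LAMBDA * T) / C
    temps.append(T)

t_eq = F / LAMBDA                    # equilibrium warming, ~3 K
tau = C / LAMBDA / (86400.0 * 365)   # e-folding response time, ~20 years
print(f"equilibrium warming: {t_eq:.1f} K, e-folding time: {tau:.0f} years")
print(f"warming after 20 years:  {temps[20 * 12 - 1]:.1f} K")  # still well short of equilibrium
print(f"warming after 100 years: {temps[-1]:.1f} K")
```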
BTW, calculating the temperature rise “per doubling of CO2” makes sense because the greenhouse effect is roughly logarithmic – each doubling produces about the same temperature rise. For example, the pre-industrial concentration of CO2 was about 280ppm (parts per million), so a doubling would take us to 560ppm (we’re currently at 390ppm).
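To put rough numbers on that, the radiative forcing from CO2 is often approximated as ΔF ≈ 5.35 ln(C/C0) W/m² – a standard textbook approximation, not something from the report discussed above. A quick calculation with it suggests we’re already roughly halfway through the first doubling in forcing terms:

```python
import math

# Radiative forcing from CO2, using the common approximation
# delta_F ~= 5.35 * ln(C / C0) W/m^2 (an illustrative textbook formula).

C0 = 280.0          # ppm, pre-industrial CO2
C_NOW = 390.0       # ppm, roughly the current concentration mentioned above
C_DOUBLED = 2 * C0  # 560 ppm

f_doubling = 5.35 * math.log(C_DOUBLED / C0)  # ~3.7 W/m^2 per doubling
f_now = 5.35 * math.log(C_NOW / C0)           # forcing from CO2 so far

print(f"forcing per doubling: {f_doubling:.2f} W/m^2")
print(f"forcing at 390 ppm:   {f_now:.2f} W/m^2 "
      f"(about {f_now / f_doubling:.0%} of a doubling)")
```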
To estimate how quickly the earth will warm, and where the heat might go, we need good models of how the earth systems (ocean, atmosphere, ice sheets, land surfaces) move heat around. In earth system models, the two temperature responses are estimated from two different types of experiment:
- equilibrium climate sensitivity is calculated by letting CO2 concentrations rise steadily over a number of years, until they reach double the pre-industrial levels. They are then held steady after this point, and the run continues until the global temperature stops changing.
- transient climate response is calculated by increasing CO2 concentrations by 1% per year, until they reach double the pre-industrial levels, and taking the average temperature at that point.
Both experiments are somewhat unrealistic, and should be thought of more as thought experiments than as predictions. For example, in the equilibrium experiment, it’s unlikely that CO2 concentrations would stop rising and then remain constant from that point on. In the transient experiment, the annual rise of 1% is a little unrealistic – CO2 concentrations rose by less than 1% per year over the last decade. On the other hand, knowing the IPCC figures for equilibrium sensitivity tells you very little about the eventual temperature change if (when) we do reach 560ppm, because when we reach that level, it’s unlikely we’ll be able to prevent CO2 concentrations going even higher still.
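For a sense of the timescales, here is the compound-growth arithmetic behind the 1%-per-year experiment, compared with a rate closer to what we’ve seen recently (roughly 2 ppm per year on ~390 ppm, i.e. about 0.5% per year – an illustrative figure, not one quoted above):

```python
import math

# Years for CO2 to double under compound growth at a given annual rate.

def years_to_double(annual_rate):
    """Return the number of years needed to double at the given fractional growth rate."""
    return math.log(2) / math.log(1 + annual_rate)

print(f"1% per year:   {years_to_double(0.01):.0f} years to double")   # ~70 years
print(f"0.5% per year: {years_to_double(0.005):.0f} years to double")  # ~139 years
```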
Understanding all this matters for many reasons. If people confuse the two types of sensitivity, they’ll misunderstand what temperature changes are likely to happen when. More importantly, failure to understand these ideas means a failure to understand the lags in the system:
- there’s a lag of decades between increasing greenhouse gas concentrations and the eventual temperature response. In other words, we’re always owed more warming than we’ve had. Even if we stopped using fossil fuels immediately, temperatures would still rise for a while.
- there’s another lag, also decades long, between peak emissions and peak concentrations. If we get greenhouse gas emissions under control and then start to reduce them, atmospheric concentrations will continue to rise for as long as the emissions exceed the rate of natural removal of CO2 from the atmosphere.
- there’s another lag (and evidence shows it’s also decades long) between humans realising climate change is a serious problem, and any coordinated attempts to do something about it.
- and yet another lag (probably also decades long, hopefully shorter) between the time we implement any serious international climate policies and the point at which we reach peak emissions, because it will take a long time to re-engineer the world’s energy infrastructure to run on non-fossil fuel energy.
Add up these lags, and it becomes apparent that climate change is a problem that will stretch most people’s imaginations. We’re not used to having to plan decades ahead, and we’re not used to the idea that any solution will take decades before it starts to make a difference.
And of course, if people who lie about climate change for a living merely say “ha, ha, a scientist made a mistake so global warming must be a myth!” we’ll never get anywhere. Indeed, we may even have already caused the impacts on food supply described in the withdrawn report. It’s just that they’re likely to take until after 2020 to play out.
(I am a scientist who was familiar with this subject before the IPCC started.)
As I recollect, scientists do not usually call the transient climate response “transient climate sensitivity”, though they do call experiments with this kind of idealized forcing “sensitivity experiments”. A vague notion of “transient sensitivity” seems to exist, but it is not so well defined that a simple number can be attached to it.
On the other hand, before the IPCC, climate sensitivity did not always mean the response to CO2 concentration. I remember using the term to mean the equilibrium difference in global mean surface air temperature with respect to a fractional change in the “solar constant” (in a text published in Japanese in 1993), though I do not think that usage was universal then either. Recently we have seen some scientists saying that climate sensitivity to CO2 concentration should have similar values to climate sensitivity to solar forcing. In such cases, they probably consider energy flux density at the tropopause as the common scale of forcing.
By the way, “flux” is another term that requires explanation in communication across disciplinary boundaries. I intentionally wrote “energy flux density” in the previous comment, as I try to use the terminology common in general science. But in climate science it is usually called “energy flux”, and this disciplinary habit is explicitly adopted by the CF Convention. I agree that it is impractical to say “flux density” so many times.
“Thermal equilibrium” is also a problematic term!
The climate system is not in a thermodynamic equilibrium, but it is (approximately) in a non-equilibrium steady state, if we use the terminology of thermodynamics.
Manabe and Strickler (1964) used the term “thermal equilibrium” as a synonym of “radiative-convective equilibrium”, where energy fluxes are balanced. This is in the context of the 1st law of thermodynamics.
The most staunch denier of anthropogenic global warming in Japan, Atsushi Tsuchida, is a physicist. In the 1970s he established the concept that the climate system is in a non-equilibrium steady state, which I think is correct. Since the 2000s he has said that Manabe’s model cannot be true because the climate system is not in a thermodynamic equilibrium even in an approximate sense. I think this is just a misunderstanding of terminology. But it seems impossible to change his mind.
Are there any figures out there estimating the time lag between transient & equilibrium conditions? That is, how many years of warming are in the pipeline?
Masuda-san: Many thanks for the comments about terminology. I checked the IPCC reports again, and they are careful not to call the transient response ‘sensitivity’. On the other hand I see it used this way a lot in the more general literature (e.g. the wikipedia page for climate sensitivity uses the word ‘sensitivity’ for both). I’ve fixed my post anyway, to avoid causing any more confusion.
There’s a bigger story here about terminology, and how it causes misunderstandings even between closely related scientific fields (let alone when communicating with non-technical audiences). I feel another blog post coming on….
Kate: I looked again at chapter 10 of the IPCC AR4 WG1 report, and didn’t find anything specific about the length of the lag. So I’ll have to dig deeper (he says, hoping someone else will post an answer to save him the trouble of looking…)
According to Gavin Schmidt at RealClimate (http://www.realclimate.org/index.php/archives/2011/01/getting-things-right/ ), another misunderstanding of terminology (“CO2 equivalent”) was also involved in the over-estimation by F.U.E.
***
As for Kate’s question, I do not have a direct answer.
But I do teach the concept of transient climate response. I usually cite the following “classic” paper. When I learned about the study it was science in action. I feel like a very old person.
M.J. Spelman and S. Manabe, 1984: Influence of oceanic heat transport upon the sensitivity of a model climate, Journal of Geophysical Research, 89(C1):571-586.
The study was part of probably the first transient climate response experiment with a coupled ocean-atmosphere general circulation model. (There are other papers by Manabe et al. and Bryan et al.) They used an idealized land-sea configuration (to save computer resources and to get clearer theoretical insight). They first calculated steady states at the standard CO2 concentration and at a quadrupled CO2 concentration. Then they made a transient experiment where the CO2 concentration was instantaneously quadrupled from the standard value. The warming of the air temperature goes together with the temperature of the upper ocean (up to 500 m depth, if very crudely summarized), and they reached approximately 70% of the equilibrium response in about 30 years. The response of the deeper ocean is much slower, and it seemed to take one thousand years to warm up.
This experiment differs from reality in many aspects, of course. But I think its broad conclusion that “global warming is delayed by several decades due to the heat capacity of the upper ocean” is valid as the current understanding of the real world.
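For anyone who wants to play with this picture, here is a toy two-box (upper ocean / deep ocean) sketch of it. The parameters are round-number guesses chosen only to show the shape of the response – most of the warming within a few decades, then a slow deep-ocean tail over centuries – not the actual Spelman & Manabe model or its exact 70%-in-30-years figure:

```python
# Toy two-box model: a fast upper-ocean layer coupled to a slow deep ocean.
# All parameters are illustrative assumptions, not values from the paper.

F = 7.4          # W/m^2: forcing from an instantaneous CO2 quadrupling (two doublings)
LAMBDA = 1.25    # W/(m^2 K): climate feedback parameter
C_UPPER = 8.0e8  # J/(m^2 K): effective heat capacity of the upper ocean
C_DEEP = 1.2e10  # J/(m^2 K): heat capacity of the deep ocean
GAMMA = 0.7      # W/(m^2 K): heat exchange coefficient between the two boxes

dt = 86400.0 * 30          # one-month time step, in seconds
T_upper, T_deep = 0.0, 0.0
T_eq = F / LAMBDA          # eventual equilibrium warming of both boxes

for month in range(1, 1000 * 12 + 1):
    flux_down = GAMMA * (T_upper - T_deep)   # heat mixed into the deep ocean
    T_upper += dt * (F - LAMBDA * T_upper - flux_down) / C_UPPER
    T_deep += dt * flux_down / C_DEEP
    year = month // 12
    if month % 12 == 0 and year in (30, 100, 500, 1000):
        print(f"year {year:4d}: surface {T_upper:.2f} K "
              f"({T_upper / T_eq:.0%} of equilibrium), deep ocean {T_deep:.2f} K")
```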
As for description in a textbook, the transient climate response is the subject of Section 12.6 of the following book. (To understand it you probably need to read Chapter 9.)
Dennis L. Hartmann, 1994: Global Physical Climatology. Academic Press. (The author’s page is at http://www.atmos.washington.edu/~dennis/gpc.html ).
This is a textbook intended for upper-level undergraduate courses and basic graduate courses. So I am afraid Kate may not be ready to understand it, but I think it is something like a target she should eventually understand.
@Kate,
A few years to a few decades is one answer, but it’s subtle.
One subtlety is that both the response time of the climate and the response time of the carbon cycle are involved. The question is usually phrased, “If we cut emissions to zero, how long will we expect temperatures to rise?” (in terms of some e-folding time). But the answer depends on whether you think that means CO2 concentrations will immediately stabilize, or (more realistically) whether you think they will in fact decrease, due to natural carbon sinks.
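To illustrate the difference this makes, here is a crude one-box sketch comparing the two interpretations: forcing held constant after emissions stop, versus forcing that decays as sinks draw the excess CO2 down. The model structure and every number in it (forcing, feedback, heat capacity, a single 150-year decay time for excess CO2) are simplifying assumptions for illustration, not results from any of the papers cited here:

```python
import math

# One-box energy balance: C * dT/dt = F(t) - lambda * T, run under two
# assumptions about the forcing after emissions stop.
# All numbers are illustrative assumptions only.

LAMBDA = 1.25      # W/(m^2 K): climate feedback parameter
C = 8.0e8          # J/(m^2 K): effective ocean heat capacity
F0 = 1.8           # W/m^2: forcing at the moment emissions stop
T0 = 0.8           # K: warming already realised at that moment
CO2_DECAY = 150.0  # years: crude single e-folding time for the excess CO2

dt = 86400.0 * 30                # one-month step, in seconds
SECONDS_PER_YEAR = 86400.0 * 365

def run(decaying_forcing, years=200):
    """Integrate the one-box model; return (final warming, peak warming)."""
    T, peak = T0, T0
    for month in range(1, years * 12 + 1):
        t = month * dt / SECONDS_PER_YEAR
        F = F0 * math.exp(-t / CO2_DECAY) if decaying_forcing else F0
        T += dt * (F - LAMBDA * T) / C
        peak = max(peak, T)
    return T, peak

for label, decaying in (("concentrations frozen", False), ("concentrations decaying", True)):
    final, peak = run(decaying)
    print(f"{label:<24}: peak warming {peak:.2f} K, "
          f"warming after 200 years {final:.2f} K")
```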
A classic reference on climate response times is Hansen et al. (1985), and some related work in Wigley and Schlesinger (1985).
Skipping ahead a few decades, in the AR4 report projections, you can usually find a curve labeled something like “climate commitment”, which shows how fast, and how much, warming is expected if emissions were instantly eliminated. Solomon et al. (2009) and Solomon et al. (2010) discuss this in passing, on the way to discussing the longevity of the resulting commitment. But Matthews and Weaver (2010) discuss the different answers you get depending on whether you include an interactive carbon cycle (the subtlety mentioned above). CO2 isn’t the only consideration, either. The second Solomon paper discusses non-CO2 GHGs, and Armour and Roe (2011) discuss how the aerosol radiative budget would influence the transient temperature commitment curves in a zero-emissions scenario.
A second subtlety is that there isn’t exactly just one single response time, as discussed in, for example, Held et al. (2010) (though they consider an abrupt return to preindustrial forcing rather than an abrupt elimination of emissions, which does not instantly remove the forcing itself).
I’m certainly skipping over many important papers in between, but these are the ones I thought of off the top of my head.
Pingback: Climate sensitivity as an accordion | Serendipity