In honour of today’s announcement that Syukuro Manabe, Klaus Hasselmann and Giorgio Parisi have been awarded the Nobel prize in physics for their contributions to understanding and modeling complex systems, I’m posting here some extracts from my forthcoming book, “Computing the Climate”, describing Manabe’s early work on modeling the climate system. We’ll start the story with the breakthrough by Norman Phillips at Princeton University’s Institute for Advanced Study (IAS), which I wrote about in my last post.

The Birth of General Circulation Modeling

Phillips had built what we now acknowledge as the first general circulation model (GCM) in 1955. It was ridiculously simple, representing the earth as a cylinder rather than a globe, with the state of the atmosphere expressed using a single variable—air pressure—at two different heights, at each of 272 points around the planet (a grid of 16 x 17 points). Despite its simplicity, Phillips’ model did something remarkable. When started with a uniform atmosphere—the same values at every grid point—the model gradually developed its own stable jet stream, under the influence of the equations that describe the effect of heat from the sun and rotation of the earth. The model was hailed as a remarkable success, and inspired a generation of atmospheric scientists to develop their own general circulation models.

The idea of starting the model with the atmosphere at rest—and seeing what patterns emerge—is a key feature that makes this style of modelling radically different from how models are used in weather forecasting. Numerical weather forecasting had taken off rapidly, and by 1960, three countries—the United States, Sweden and Japan—had operational numerical weather forecasting services up and running. So there was plenty of expertise already in numerical methods and computational modelling among the meteorological community, especially in those three countries.

But whereas a weather model only simulates a few days, starting from data about current conditions, a general circulation model has to simulate long-term stable patterns, which means many of the simplifications to the equations of motion that worked in early weather forecasting models don’t work in GCMs. The weather models of the 1950s all ignored fast-moving waves that are irrelevant in short-term weather forecasts. But these simplifications made the models unstable over longer runs: the model atmosphere would steadily lose energy—and sometimes air and moisture too—so that realistic climatic patterns never emerged. The small group of scientists interested in general circulation modelling began to diverge from the larger numerical weather forecasting community, choosing to focus on versions of the equations and numerical algorithms with conservation of mass and energy built in, to give stable long-range simulations.
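The difference is easy to see in a toy example. The sketch below is my own illustration (not code from any historical model): it advects a tracer around a one-dimensional periodic domain with a spatially varying wind, using a simple upwind scheme written two ways. In “advective form”, nothing constrains the total amount of tracer, so it drifts over a long run; in “flux form”, whatever leaves one grid cell enters its neighbour, so the total is conserved to round-off error—exactly the property the climate modellers needed.

```python
import numpy as np

# Toy demonstration of conservation form (my illustration, not any
# historical model). Advect a tracer q around a 1-D periodic ring with a
# spatially varying wind u, comparing two upwind finite-difference forms.
nx, dx, dt, nsteps = 100, 1.0, 0.4, 5000       # CFL = u_max*dt/dx = 0.6 < 1
x = np.arange(nx)
u = 1.0 + 0.5 * np.sin(2 * np.pi * x / nx)     # wind, everywhere positive
q0 = np.exp(-0.5 * ((x - 50) / 5.0) ** 2)      # initial blob of tracer

def step_advective(q):
    """Advective form q_t = -u q_x. Fine over a few steps, but nothing
    forces the domain total of q to stay constant, so it drifts."""
    return q - u * dt / dx * (q - np.roll(q, 1))

def step_flux(q):
    """Flux form q_t = -(u q)_x. Each cell loses exactly what its
    neighbour gains, so the domain total is conserved to round-off."""
    f = u * q                         # flux through each cell's right face
    return q - dt / dx * (f - np.roll(f, 1))

qa, qf = q0.copy(), q0.copy()
for _ in range(nsteps):
    qa, qf = step_advective(qa), step_flux(qf)

print(f"initial total tracer:  {q0.sum():.6f}")
print(f"advective form total:  {qa.sum():.6f}")   # drifts away
print(f"flux form total:       {qf.sum():.6f}")   # matches the initial total
```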

In 1955, the US Weather Bureau established a General Circulation Research Laboratory, specifically to build on Phillips’ success. It was headed by Joseph Smagorinsky, one of the original members of the ENIAC weather modelling team. Originally located just outside Washington DC, the lab has undergone several name changes and relocations, and is now the Geophysical Fluid Dynamics Lab (GFDL), housed at Princeton University, where it remains a major climate modelling lab today.

In 1959, Smagorinsky recruited a young Japanese meteorologist, Syukuro Manabe, from Tokyo, and they began work on a primitive equation model. Like Phillips, they began with a model that represented only one hemisphere. Manabe concentrated on the mathematical structure of the models, while Smagorinsky hired a large team of programmers to develop the code. By 1963, they had developed a nine-layer atmosphere model that exchanged water—but not heat—between the atmosphere and surface. The planet’s surface, however, was flat and featureless—a continuous swamp from which water could evaporate, but which had no internal dynamics of its own. The model could simulate radiation passing through the atmosphere, interacting with water vapour, ozone and CO2. Like most of the early GCMs, this model captured realistic global patterns, but had many of the details wrong.

Meanwhile, at the University of California, Los Angeles (UCLA), Yale Mintz, the associate director of the Department of Meteorology, recruited another young Japanese meteorologist, Akio Arakawa, to help build a general circulation model of their own. From 1961, Mintz and Arakawa developed a series of models, with Mintz providing the theoretical direction, and Arakawa designing the model, with help from the department’s graduate students. By 1964, their model represented the entire globe with a two-layer atmosphere and realistic geography.

Computational limitations dominated the choices these two teams had to make. For example, the GFDL team modelled only the northern hemisphere, with a featureless surface, so that they could put more layers into the atmosphere, while the UCLA team chose the opposite route: an entire global model with a realistic layout of continents and oceans, but with only two layers of atmosphere.
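A rough cost model shows why the trade-off works this way (this is my back-of-envelope, not the teams’ own accounting): the work per simulated day grows with the number of grid cells divided by the time step, and the CFL stability condition ties the time step to the horizontal grid spacing:

```latex
\[
  \text{cost} \;\propto\; \frac{N_x \, N_y \, N_z}{\Delta t},
  \qquad
  \Delta t \;\lesssim\; \frac{\Delta x}{c_{\max}} \quad \text{(CFL condition)}.
\]
```

Doubling horizontal resolution therefore multiplies the cost by roughly 2 × 2 × 2 = 8, while adding vertical layers or doubling geographic coverage scales it only linearly—so a nine-layer hemisphere and a two-layer globe were computations of a comparable size.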

Early Warming Signals

Meanwhile, in the early 1950s, oceanographers at the Scripps Institution of Oceanography in California, under the leadership of their new director, Roger Revelle, were investigating the spread of radioactive fallout in the oceans from nuclear weapons testing. Their work was funded by the US military, who needed to know how quickly the oceans would absorb these contaminants, to assess the risks to human health. But Revelle had many other research interests. He had read about the idea that carbon dioxide from fossil fuels could warm the planet, and realized radiocarbon dating could be used to measure how quickly the ocean absorbs CO2. Revelle understood the importance of a community effort, so he persuaded a number of colleagues to do similar analyses, and in a coordinated set of three papers [Craig, 1957; Revelle & Suess, 1957; Arnold & Anderson, 1957], published in 1957, the group presented their results.

They all found a consistent pattern: the surface layer of the ocean continuously absorbs CO2 from the atmosphere, so on average, a molecule of CO2 stays in the atmosphere for only about 7 years before being dissolved into the ocean. But the surface waters also release CO2, especially when they warm up in the sun. So the atmosphere and surface waters exchange CO2 molecules continuously—any extra CO2 will end up shared between them.
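The seven-year figure follows from a simple reservoir calculation. With illustrative round numbers (mine, not values from the 1957 papers): the atmosphere holds roughly 600 GtC as CO2, and the surface ocean absorbs a gross flux of roughly 90 GtC per year, so the average molecule waits

```latex
\[
  \tau \;=\; \frac{M_{\mathrm{atm}}}{F_{\mathrm{atm}\rightarrow\mathrm{ocean}}}
      \;\approx\; \frac{600\ \mathrm{GtC}}{90\ \mathrm{GtC\,yr^{-1}}}
      \;\approx\; 7\ \mathrm{years}.
\]
```

Note that this is a residence time for individual molecules, set by the gross exchange; it says nothing about how long an excess of CO2 persists, which depends on the much slower net uptake—the subject of the group’s other finding.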

All three papers also confirmed that the surface waters don’t mix much with the deeper ocean. So it takes hundreds of years for any extra carbon to pass down into deeper waters. The implications were clear—the oceans weren’t absorbing CO2 anywhere near as fast as we were producing it.

These findings set alarm bells ringing amongst the geosciences community. If this was correct, the effects of climate change would be noticeable within a few decades. But without data, it would be hard to test the prediction. At Scripps, Revelle hired a young chemist, Charles David Keeling, to begin detailed measurements. In 1958, Keeling set up an observing station on Mauna Loa in Hawaii, and a second station in the Antarctic, both far enough from any major sources of emissions to give reliable baseline measurements of CO2 in the atmosphere. Funding for the Antarctic station was cut a few years later, but Keeling managed to keep the measurements going at Mauna Loa, where they are still collected regularly today. Within two years, Keeling had enough data to confirm Bert Bolin and Erik Eriksson’s analysis of how slowly the oceans take up CO2: levels in the atmosphere were rising sharply.

Keeling’s data helped to spread awareness of the issue rapidly among the ocean and atmospheric science research communities, even as scientists in other fields remained unaware of it. Alarm at the implications of the speed at which CO2 levels were rising led some scientists to alert the country’s political leaders. When President Lyndon Johnson commissioned a report on the state of the environment in 1964, the President’s Science Advisory Committee invited a small subcommittee—including Revelle, Keeling, and Smagorinsky—to write an appendix to the report, focusing on the threat of climate change. And so, on February 8th, 1965, President Johnson became the first major world leader to mention the threat of climate change, in a speech to Congress: “This generation has altered the composition of the atmosphere on a global scale through…a steady increase in carbon dioxide from the burning of fossil fuels.”

Climate Modeling Takes Off

So awareness of the CO2 problem was spreading rapidly through the scientific community just as the general circulation modelling community was getting established. However, it wasn’t clear that general circulation models would be suited to this task. Computational power was limited, and it wasn’t yet possible to run the models long enough to simulate the decades or centuries over which climate change would occur. Besides, the first generation of GCMs had so many simplifications that it seemed unlikely they could simulate the effects of increasing CO2—that wasn’t what they were designed for.

To do this properly, the models would need to include all the relevant energy exchanges between the surface, atmosphere and space. That would mean a model that accurately captured the vertical temperature profile of the atmosphere, along with the processes of radiation, convection, evaporation and precipitation, all of which move energy vertically. None of these processes are adequately captured in the primitive equations, so they would all need to be added as parameterization schemes in the models.

Smagorinsky and Manabe at GFDL were the only group anywhere near ready to try running CO2 experiments in their general circulation model. Their nine-layer model already captured some of the vertical structure of the atmosphere, and Suki Manabe had built in a detailed radiation code from the start, with the help of a visiting German meteorologist, Fritz Möller. By 1967, Manabe had a working model of the relevant heat exchanges through the full height of the atmosphere, and together with his colleague Richard Wetherald, he published what is now recognized as the first accurate computational experiment of climate change [Manabe and Wetherald, 1967].

Running the general circulation model for this experiment was still too computationally expensive, so they ignored all horizontal heat exchanges, and instead built a one-dimensional model of just a single column of atmosphere. The model could be run with 9 or 18 layers, and included the effects of upwards and downwards radiation through the column, exchanges of heat through convection, and the latent heat of evaporation and condensation of water. Manabe and Wetherald first tested the model with current atmospheric conditions, to check it could reproduce the correct vertical distribution of temperatures in the atmosphere, which it did very well. They then doubled the amount of carbon dioxide in the model and ran it again. They found temperatures rose throughout the lower atmosphere, with a rise of about 2°C at the surface, while the stratosphere showed a corresponding cooling. This pattern—warming in the lower atmosphere and cooling in the stratosphere—shows up in all the modern global climate models, but wasn’t confirmed by satellite readings until the 2000s.
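Something of the flavour of this model can be captured in a few dozen lines of code. The sketch below is my own toy illustration, with made-up round numbers—a grey-gas caricature, not Manabe and Wetherald’s band-by-band radiation scheme. It steps a single column to equilibrium under two processes: each layer absorbs and re-emits longwave radiation, and wherever the lapse rate becomes steeper than a critical value, convective adjustment mixes heat upward. Raising the layers’ longwave absorptivity stands in for adding CO2, and the surface warms; reproducing the accompanying stratospheric cooling, though, needs the full treatment of CO2 emission and ozone heating that the real model had.

```python
import numpy as np

# Toy grey-gas radiative-convective column (my sketch; all constants are
# illustrative round values, not the 1967 paper's).
SIGMA = 5.67e-8         # Stefan-Boltzmann constant, W m^-2 K^-4
SOLAR = 240.0           # absorbed sunlight, all deposited at the surface, W m^-2
N, DZ = 18, 1000.0      # number of layers; layer thickness, m
GAMMA = 6.5e-3          # critical lapse rate for convective adjustment, K m^-1
CP, DT = 4.0e6, 3600.0  # heat capacity per layer, J m^-2 K^-1; time step, s

def equilibrium(eps, steps=50000):
    """Run the column to equilibrium. eps is each layer's longwave
    absorptivity/emissivity; raising it is a crude stand-in for more CO2."""
    Ts, T = 288.0, np.full(N, 250.0)
    for _ in range(steps):
        # Longwave fluxes at the N+1 interfaces (0 = surface, N = top):
        # each slab absorbs a fraction eps of what hits it, and re-emits.
        up, down = np.zeros(N + 1), np.zeros(N + 1)
        up[0] = SIGMA * Ts ** 4
        for i in range(N):                    # march upward from the surface
            up[i + 1] = (1 - eps) * up[i] + eps * SIGMA * T[i] ** 4
        for i in reversed(range(N)):          # march downward from space
            down[i] = (1 - eps) * down[i + 1] + eps * SIGMA * T[i] ** 4
        # Radiative heating of each layer, and of the surface.
        heating = eps * (up[:N] + down[1:]) - 2 * eps * SIGMA * T ** 4
        T += heating * DT / CP
        Ts += (SOLAR + down[0] - SIGMA * Ts ** 4) * DT / CP
        # Convective adjustment: wherever the lapse rate exceeds GAMMA,
        # mix heat upward, conserving each pair's total energy.
        for _ in range(3):
            d = (Ts - T[0] - GAMMA * DZ) / 2
            if d > 0:
                Ts, T[0] = Ts - d, T[0] + d
            for i in range(N - 1):
                d = (T[i] - T[i + 1] - GAMMA * DZ) / 2
                if d > 0:
                    T[i], T[i + 1] = T[i] - d, T[i + 1] + d
    return Ts, T

Ts1, _ = equilibrium(eps=0.30)
Ts2, _ = equilibrium(eps=0.33)    # ~10% more longwave opacity per layer
print(f"equilibrium surface temperature:    {Ts1:.1f} K")
print(f"surface warming from extra opacity: {Ts2 - Ts1:.2f} K")
```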

By the mid-1970s, a broad community of scientists was replicating Manabe and Wetherald’s experiment in a variety of simplified models, although it would take nearly a decade before anyone could run the experiment in a full three-dimensional GCM. But the community was beginning to use the term climate modelling to describe their work—a term given much greater impetus when it was used as the title of a comprehensive survey of the field by two NCAR scientists, Stephen Schneider and Robert Dickinson, in 1974. Remarkably, their paper [Schneider and Dickinson, 1974] charts a massive growth of research, citing the work of over 150 authors who published work on climate modelling in the period from 1967 to 1975, after Manabe and Wetherald’s original experiment.

It took some time, however, to get the general circulation models to the point where they could also run a global climate change experiment. Perhaps unsurprisingly, Manabe and Wetherald were also the first to do this, in 1975. Their GCM produced a higher result for the doubled-CO2 experiment—an average surface warming of 3°C—which they attributed to the snow-albedo feedback, included in the GCM but not in their original single-column model. Their experiment [Manabe and Wetherald, 1975] also showed an important effect first noted by Arrhenius: a much greater warming at the poles than towards the equator, because polar temperatures are much more sensitive to changes in the rate at which heat escapes to space. And their model predicted another effect—global warming would speed up evaporation and precipitation, and hence produce more intense rainfall. This prediction has been borne out by the rapid uptick in extreme rainfall events in the 2010s.

In hindsight, Manabe’s simplified models produced remarkably accurate predictions of future climate change. Manabe used his early experiments to predict a temperature rise of about 0.8°C by the year 2000, assuming a 25% increase in CO2 over the course of the twentieth century. His assumption about the rate at which CO2 would increase was almost spot on, and so was his calculation of the resulting temperature rise. CO2 levels rose from about 300ppm in 1900 to 370ppm in 2000, a rise of 23%. The change in temperature over this period, calculated as the change in decadal means in the HadCRUT5 dataset, was 0.82°C [Hausfather et al., 2020].
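One way to see that these numbers hang together (my arithmetic, not a reconstruction of Manabe’s method): the warming effect of CO2 grows roughly logarithmically with its concentration, so a 25% rise amounts to about a third of a doubling, and scaling the models’ sensitivity of 2–3°C per doubling gives

```latex
\[
  \frac{\ln 1.25}{\ln 2} \approx 0.32,
  \qquad
  \Delta T \;\approx\; 0.32 \times (2\ \text{to}\ 3)\,^{\circ}\mathrm{C}
          \;\approx\; 0.6\ \text{to}\ 1.0\,^{\circ}\mathrm{C},
\]
```

comfortably bracketing both the 0.8°C prediction and the 0.82°C observed.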

References

Arnold, J. R., & Anderson, E. C. (1957). The Distribution of Carbon-14 in Nature. Tellus, 9(1), 28–32.

Craig, H. (1957). The Natural Distribution of Radiocarbon and the Exchange Time of Carbon Dioxide Between Atmosphere and Sea. Tellus, 9(1), 1–17.

Hausfather, Z., Drake, H. F., Abbott, T., & Schmidt, G. A. (2020). Evaluating the Performance of Past Climate Model Projections. Geophysical Research Letters, 47(1), e2019GL085378.

Manabe, S., & Wetherald, R. T. (1967). Thermal Equilibrium of the Atmosphere with a Given Distribution of Relative Humidity. Journal of the Atmospheric Sciences, 24(3), 241–259.

Manabe, S., & Wetherald, R. T. (1975). The Effects of Doubling the CO2 Concentration on the Climate of a General Circulation Model. Journal of the Atmospheric Sciences, 32(1), 3–15.

Revelle, R., & Suess, H. E. (1957). Carbon Dioxide Exchange Between Atmosphere and Ocean and the Question of an Increase of Atmospheric CO2 during the Past Decades. Tellus, 9(1), 18–27.

Schneider, S. H., & Dickinson, R. E. (1974). Climate modeling. Reviews of Geophysics, 12(3), 447–493.
