I love working in a University. Every day I encounter new ideas, and I get to chat to some of the smartest people on the planet. But I see signs, almost every day, that universities are poorly equipped to face the complex challenges of the 21st Century. Challenges like poverty, climate change, resource depletion, sustainable agriculture, and so on. The problem is that universities are organized into departments that correspond to disciplines like physics, computer science, sociology, geography, etc. Most of the strategic decision-making is made in these departments – which faculty to hire, which degree programs to offer, what research to focus on. But the grand challenges of the 21st Century are trans-disciplinary. To address them, we need people who can transcend their own disciplinary background; people who are not only comfortable working with a range of experts from many different fields, but who actively go out and seek such interactions. In marketing speak, we need T-shaped people:

Jelinski, who is vice chancellor for research and graduate studies at Louisiana State University, talked about a new “T-shaped” person with disciplinary depth, in biology for example, but with the ability, or arms, to reach out to other disciplines. “We need to encourage this new breed of scientist,” she said. [“Researchers Seek Basics Of Nano Scale,” The Scientist, August 21, 2000]

Universities don’t do this well because the entire reward structure for departments and professors is based purely on demonstrating disciplinary strength. Occasionally, we manage to establish inter-disciplinary centres and institutes, to act as places where faculty and students from different departments can come together and learn how to collaborate. A few of these prosper, but most of them get shut down fairly rapidly by the university. Here’s what happens. A new centre is set up with an initial research grant, perhaps for 3-5 years, which typically pays only for researchers’ salaries and equipment. The university agrees to provide space, administrative staff, and pay the utility bills for a limited time, because opening a new facility is good press, but then expects each centre to become “self-sufficient”. This is, of course, impossible, because no granting agency ever covers the full cost of running a research centre. The professors who want to make the centre succeed spend most of their time writing more grant proposals, most of which don’t get funded because competition for funding is tough. Nobody has much time to do the important inter-disciplinary work that the centre was established for. After five years, the university shuts it down because it didn’t become self-sufficient. A research centre at U of T that I’ve spent a lot of time at over the past few years is being shut down this month for these very reasons.

The same thing happens to inter-disciplinary graduate programs. While departments run graduate programs focusing on disciplinary strength, some enterprising professors do manage to set up “collaborative programs”, which students from a range of participating departments can sign up for. The collaborative programs are set up using seed money, some of which is donated by the participating departments, and some of which comes from the university teaching initiative funds, because they all agree the program is a good idea, and the students will benefit. However, after a few years, the seed money has been used up, and no unit within the university will kick in more, because the program is supposed to be “self-sufficient”. No such program can ever be self-sufficient, because the students who participate are accounted for in their home departments. The collaborative program doesn’t generate any extra revenue, and the departments view it as ‘stealing their students’. Without funding, the program shuts down. A collaborative graduate program at U of T that I serve on the advisory board for is ending this month for these very reasons.

Not only does the university structure tend to squeeze out anything that does not fit into a neat disciplinary silo, it also generates rules that actively prevent students from acquiring the skills needed to be “T-shaped”. For example, my own department has “breadth requirements” that graduate students have to meet when selecting a set of courses. “Breadth” here means breadth across the discipline. So students have to demonstrate they’ve taken courses that cover several different subfields of computer science, and several different research methodologies within the field. But this is the opposite of T-shaped! A T-shaped student has disciplinary *depth* and *inter-disciplinary* breadth. This would mean deep expertise in a particular subfield of computer science, and the ability to apply that expertise in many different contexts outside of computer science. Instead, we prevent students from getting the depth by forcing them to take more introductory courses within computer science, and we prevent them from getting inter-disciplinary breadth for the same reason.

Working within a university sometimes feels like the intellectual equivalent of being at a lavish buffet but prevented from ever leaving the pasta section.

A high school student in Ottawa, Jin, writes to ask me for help with a theme on the question of whether global warming is caused by human activities. Here’s my answer:

The simple answer is ‘yes’, global warming is caused by human activities. In fact we’ve known this for over 100 years. Scientists in the 19th Century realized that some gases in the atmosphere help to keep the planet warm by stopping the earth from losing heat to outer space, just like a blanket keeps you warm by trapping heat near your body. The most important of these gases is Carbon Dioxide (CO2). If there were no CO2 in the atmosphere, the entire earth would be a frozen ball of ice. Luckily, that CO2 keeps the planet at temperatures suitable for human life. But as we dig up coal and oil and natural gas, and burn them for energy, we increase the amount of CO2 in the atmosphere and hence we increase the temperature of the planet. Now, while scientists have known this since the 19th century, it’s only in the last 30 years that they have been able to calculate precisely how fast the earth will warm up, and which parts of the planet will be affected the most.

Here are three really good explanations, which might help you for your theme:

  1. NASA’s Climate Kids website:
    http://climatekids.nasa.gov/big-questions/
    It’s probably written for kids younger than you, but has really simple explanations, in case anything isn’t clear.
  2. Climate Change in a Nutshell – a set of short videos that I really like:
    http://www.planetnutshell.com/climate
  3. The IPCC’s frequently asked question list. The IPCC is the international panel on climate change, whose job is to summarize what scientists know, so that politicians can make good decisions. Their reports can be a bit technical, but have a lot more detail than most other material:
    http://www.ipcc.ch/publications_and_data/ar4/wg1/en/faqs.html

Also, you might find this interesting. It’s a list of successful predictions by climate scientists. One of the best ways we know that science is right about something is that we are able to use our theories to predict what will happen in the future. When those predictions turn out to be correct, it gives us a lot more confidence that the theories are right: http://www.easterbrook.ca/steve/?p=3031

By the way, if you use google to search for information about global warming or climate change, you’ll find lots of confusing information, and different opinions. You might wonder why that is, if scientists are so sure about the causes of climate change. There’s a simple reason. Climate change is a really big problem, one that’s very hard to deal with. Most of our energy supply comes from fossil fuels, in one way or another. To prevent dangerous levels of warming, we have to stop using them. How we do that is hard for many people to think about. We really don’t want to stop using them, because the cheap energy from fossil fuels powers our cars, heats our homes, gives us cheap flights, powers our factories, and so on.

For many people it’s easier to choose not to believe in global warming than it is to think about how we would give up fossil fuels. Unfortunately, our climate doesn’t care what we believe – it’s changing anyway, and the warming is accelerating. Luckily, humans are very intelligent, and good at inventing things. If we can understand the problem, then we should be able to solve it. But it will require people to think clearly about it, and not to fool themselves by wishing the problem away.

A few weeks back, Randall Munroe (of XKCD fame) attempted to explain the parts of a Saturn V rocket (“Up Goer Five”) using only the most common one thousand words of English. I like the idea, but found many of his phrasings awkward, and some were far harder to understand than if he’d used the usual word.

Now there’s a web-based editor that lets everyone try their hand at this, and a tumblr of scientists trying to explain their work this way. Some of them are brilliant, but many are almost unreadable. It turns out this is much harder than it looks.
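The core of such an editor is simple to sketch: split the text into words and flag anything that isn’t on the allowed list. Here’s a rough illustration (my own sketch, not the actual editor’s code, and with a tiny stand-in word list rather than the real thousand words):

```python
# Rough sketch of an "Up Goer Five" style checker: flag any word
# that isn't on an allowed list of common words.
# ALLOWED is a tiny stand-in here, not the real thousand-word list.
import re

ALLOWED = {"if", "the", "world", "gets", "hotter", "or", "colder",
           "we", "call", "that", "change", "i", "study", "how",
           "people", "use", "computers", "to", "understand", "such"}

def flag_disallowed(text, allowed=ALLOWED):
    """Return the words in `text` that are not on the allowed list."""
    words = re.findall(r"[a-z']+", text.lower())
    return [w for w in words if w not in allowed]

print(flag_disallowed("If the world gets hotter we call that climate change"))
# -> ['climate'] : the one word not on the (stand-in) list
```

The real editor presumably also handles inflections and proper nouns, but the word-by-word lookup is the essence of it.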

Here’s mine. I cheated once, by introducing one new word that’s not on the list, although it’s not really cheating because the whole point of science education is to equip people with the right words and concepts to talk about important stuff:

If the world gets hotter or colder, we call that ‘climate’ change. I study how people use computers to understand such change, and to help them decide what we should do about it. The computers they use are very big and fast, but they are hard to work with. My job is to help them check that the computers are working right, and that the answers they get from the computers make sense. I also study what other things people want to know about how the world will change as it gets hotter, and how we can make the answers to their questions easier to understand.

[Update] And here’s a few others that I think are brilliant:

Emily S. Cassidy, Environmental Scientist at University of Minnesota:

In 50 years the world will need to grow two times as much food as we grow today. Meeting these growing needs for food will be hard because we need to make sure meeting these needs doesn’t lead to cutting down more trees or hurting living things. In the past when we wanted more food we cut down a lot of trees, so we could use the land. So how are we going to grow more food without cutting down more trees? One answer to this problem is looking at how we use the food we grow today. People eat food, but food is also used to make animals and run cars. In fact, animals eat over one-third of the food we grow. In some places, animals eat over two-thirds of the food grown! If the world used all of the food we grow for people, instead of animals and cars, we could have 70% more food and that would be enough food for a lot of people!

Anthony Finkelstein, at University College London, explaining requirements analysis:

I am interested in computers and how we can get them to do what we want. Sometimes they do not do what we expect because we got something wrong. I would like to know this before we use the computer to do something important and before we spend too much time and money. Sometimes they do something wrong because we did not ask the people who will be using them what they wanted the computer to do. This is not as easy as it sounds! Often these people do not agree with each other and do not understand what it is possible for the computer to do. When we know what they want the computer to do we must write it down in a way that people building the computer can also understand it.

29. May 2012 · Categories: education

A few people today have pointed me at the new paper by Dan Kahan & colleagues (1), which explores competing explanations for why lots of people don’t regard climate change as a serious problem. I’ve blogged about Dan’s work before – their earlier studies are well designed, and address important questions. If you’re familiar with their work, the new study isn’t surprising. They find that people’s level of concern over climate change doesn’t correlate with their level of science literacy, but does correlate with their cultural philosophies. In this experiment, there was no difference in science literacy between people who are concerned about climate change and those who are not. They use this to build a conclusion that giving people more facts is not likely to change their minds:

A communication strategy that focuses only on transmission of sound scientific information, our results suggest, is unlikely to do that.

Which is reasonable advice, because science communication must address how different people see the world, and how people filter information based on their existing worldview:

Effective strategies include use of culturally diverse communicators, whose affinity with different communities enhances their credibility, and information-framing techniques that invest policy solutions with resonances congenial to diverse groups

Naturally, some disreputable news sites spun the basic finding as “Global warming skeptics as knowledgeable about science as climate change believers, study says”. Which is not what the study says at all, because it didn’t measure what people know about the science.

The problem is that there’s an unacknowledged construct validity problem in the study. At the beginning of the paper, the authors talk about “the science comprehension thesis (SCT)”:

As members of the public do not know what scientists know, or think the way scientists think, they predictably fail to take climate change as seriously as scientists believe they should.

…which they then claim their study disproves. But when you get into the actual study design, they quickly switch to talking about science literacy:

We measured respondents’ science literacy with National Science Foundation’s (NSF) Science and Engineering Indicators. Focused on physics and biology (for example, ‘Electrons are smaller than atoms [true/false]’; ‘Antibiotics kill viruses as well as bacteria [true/false]’), the NSF Indicators are widely used as an index of public comprehension of basic science.

But the problem is, science comprehension cannot be measured by asking a whole bunch of true/false questions about scientific “facts”! All that measures is the ability to do well in a pub trivia quiz.

Unfortunately, this mistake is widespread, and leads to an education strategy that fills students’ heads with a whole bunch of disconnected science trivia, and no appreciation for what science is really all about. When high school students learn chemistry, for example, they have to follow recipes from a textbook, and get the “right” results. If their results don’t match the textbooks, they get poor marks. When they’re given science tests (like the NSF one used in this study), they’re given the message that there’s a right and wrong answer to each question, and you just gotta know it. But that’s about as far from real science as you can get! It’s when the experiment gives surprising results that the real scientific process kicks in. Science isn’t about getting the textbook answer, it’s about asking interesting questions, and finding sound methods for answering them. The myths about science are ground into kids from an early age by people who teach science as a bunch of facts (3).

At the core of the problem is a failure to make the distinction that Maienschein points out between science literacy and scientific literacy (2). The NSF instrument measures the former. But science comprehension is about the latter – it…

…emphasizes scientific ways of knowing and the process of thinking critically and creatively about the natural world.

So, to Kahan’s interpretation of the results, I would add another hypothesis: we should actually start teaching people about what scientists do, how they work, and what it means to collect and analyze large bodies of evidence. How results (yes, even published ones) often turn out to be wrong, and what matters is the accumulation of evidence over time, rather than any individual result. After all, with Google, you can now quickly find a published result to support just about any crazy claim. We need to teach people why that’s not science.

Update (May 30, 2012): Several people suggested I should also point out that the science literacy test they used is for basic science questions across physics and biology; they did not attempt to test in any way people’s knowledge of climate science. So that seriously dents their conclusion: The study says nothing about whether giving people more facts about climate science is likely to make a difference.

Update2 (May 31, 2012): Some folks on Twitter argued with my statement “concern over climate change doesn’t correlate with … level of science literacy”. Apparently none of them have a clue how to interpret statistical analysis in the behavioural sciences. Here’s Dan Kahan on the topic. (h/t to Tamsin).

References:

(1) Kahan, D., Peters, E., Wittlin, M., Slovic, P., Ouellette, L., Braman, D., & Mandel, G. (2012). The polarizing impact of science literacy and numeracy on perceived climate change risks. Nature Climate Change. DOI: 10.1038/nclimate1547
(2) Maienschein, J. (1998). Scientific Literacy. Science, 281(5379), 917. DOI: 10.1126/science.281.5379.917
(3) McComas, W. F. (1998). The Principal Elements of the Nature of Science: Dispelling the Myths. In The Nature of Science in Science Education.

This week, I’m featuring some of the best blog posts written by the students on my first year undergraduate course, PMU199 Climate Change: Software, Science and Society. This post is by Harry, and it first appeared on the course blog on January 29.

Projections from global climate models indicate that continued 21st century increases in greenhouse gas emissions will raise the temperature of the globe by a few degrees. A global change of just a few degrees could have a huge impact on our planet. Just as a few degrees of global cooling could lead to another ice age, a few degrees of global warming will let the world witness more of one of nature’s most terrifying phenomena: lightning.

According to Anthony D. Del Genio, the surface of the earth heats up from sunlight and other thermal radiation, and the energy it accumulates must be offset to maintain a stable temperature. Our planet does this by evaporating water, which condenses and rises upwards with buoyant warm air, carrying excess heat from the surface to higher altitudes. In powerful updrafts, the evaporated water droplets rise easily and become supercooled, to temperatures between -10 and -40°C. Collisions between these supercooled droplets and soft ice crystals form a dense mixture of ice pellets called graupel. The densities of graupel and ice crystals, and the electrical charges their collisions induce, are two essential factors in producing what people see as lightning.

Differences in updrafts over ocean and land also affect lightning frequency. Over the course of the day, the oceans absorb heat but hardly warm up. Land surfaces, on the other hand, cannot store heat as effectively, so they warm significantly from the beginning of the day. Much of the air above land is therefore warmer and more buoyant than the air over the oceans, creating strong convective storms as the warm air rises. The powerful updrafts in these convective storms are more likely to generate lightning.

In experiments with the general circulation model of the Goddard Institute for Space Studies, a simulated global warming of 4.2°C increased global lightning activity by 30%, while a simulated global cooling of 5.9°C decreased global lightning frequency by 24%. Together, the experiments suggest a 5-6% change in global lightning frequency for every 1°C of global warming or cooling.
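As a back-of-envelope check (my own arithmetic, not taken from the paper), the per-degree sensitivities implied by the two experiments do bracket the quoted 5-6% figure:

```python
# Back-of-envelope check of the per-degree lightning sensitivity
# implied by the two GISS model experiments quoted above.
warming_run = 30.0 / 4.2   # % change in lightning per degree C (warming run)
cooling_run = 24.0 / 5.9   # % change in lightning per degree C (cooling run)

print(f"warming run: {warming_run:.1f}% per degree")   # ~7.1% per degree
print(f"cooling run: {cooling_run:.1f}% per degree")   # ~4.1% per degree
print(f"average:     {(warming_run + cooling_run) / 2:.1f}% per degree")  # ~5.6% per degree
```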

If 21st century projections of carbon dioxide and other greenhouse gas emissions hold true, the earth will continue to warm and the oceans will evaporate more water. Because the drier land surface cannot evaporate water to the same extent as the oceans, the land warms more. This should cause stronger convective storms and more frequent lightning.

Greater lightning frequency can contribute to a warmer earth. Lightning provides an abundant source of nitrogen oxides, which are precursors for ozone production in the troposphere. Ozone in the upper troposphere acts as a greenhouse gas, absorbing some of the infrared energy emitted by earth. Because tropospheric ozone traps some of this escaping heat, the earth warms further and lightning becomes even more frequent: lightning frequency thus creates a positive feedback in our climate system. On a per-molecule basis, the impact of ozone on the climate is much stronger than that of carbon dioxide, since ozone has a radiative forcing effect approximately 1,000 times as powerful. Luckily, tropospheric ozone is not nearly as prevalent as carbon dioxide on a global scale, and its atmospheric lifetime averages 22 days.

"Climate simulations, which were generated from four Global General Circulation Models (GCM), were used to project forest fire danger levels with relation to global warming."

Lightning may occur more frequently around the world, but its effects are felt at a very local scale, and it is this local effect that has the most impact on people. In a thunderstorm, an increase in lightning frequency places areas with a high concentration of trees at high risk of forest fire. In Canada, the west-central and north-western woodland areas are major targets for ignition by lightning; in fact, lightning accounted for 85% of the total area burned from 1959-1999. To preserve habitats for animals, and forests for their function as a carbon sink, governments must be pressed hard to minimize forest fires in these regions. With 21st century estimates of increased temperature, the area burned could increase dramatically, because as temperatures rise, surfaces dry out at the same time, producing more “fuel” for the fires.

Although lightning has negative effects on our climate system and on people, it also has positive effects on earth and on life. The ozone layer, located in the upper atmosphere, prevents ultraviolet light from reaching earth’s surface. Lightning also drives a natural process known as nitrogen fixation. This process plays a fundamental role for life, because fixed nitrogen is required to construct the basic building blocks of life (e.g. nucleotides for DNA and amino acids for proteins).

Lightning is an amazing and natural occurrence in our skies. Whether it’s a sight to behold or feared, we’ll see more of it as our earth becomes warmer.

This term, I’m running my first year seminar course, “Climate Change: Software, Science and Society” again. The outline has changed a little since last year, but the overall goals of the course are the same: to take a small, cross-disciplinary group of first year undergrads through some of the key ideas in climate modeling.

As last year, we’re running a course blog, and the first assignment is to write a blog post for it. Please feel free to comment on the students’ posts, but remember to keep your comments constructive!

A number of sessions at the AGU meeting this week discussed projects to improve climate literacy among different audiences:

  • The Climate Literacy and Energy Awareness Network (CLEANET) is developing concept maps for use in middle school and high school, along with a large set of pointers to educational resources on climate and energy for use in the classroom.
  • The Climate Literacy Zoo Education Network (CliZEN). Michael Mann (of hockey stick fame) talked about this project, which was a rather nice uplifting change from hearing about his experiences with political attacks on his work. This is a pilot effort, currently involving ten zoos, mainly in the north east US. So far, they have completed a visitor survey across a network of zoos, plus some aquaria, exploring the views of visitors on climate change, using the categories of the Six Americas report. The data they have collected show that zoo visitors tend to be more skewed towards the “alarmed” category compared to the general US population. [Incidentally, I’m impressed with their sample size: 3,558 responses. The original Six Americas study only had 981, and most surveys in my field have much smaller sample sizes]. The next steps in the project are to build on this audience analysis to put together targeted information and education material that links what we know about climate change with its impact on specific animals at the zoos (especially polar animals).
  • The Climate Interpreter Project. Bill Spitzer from the New England Aquarium talked about this project. Bill points out that aquaria (and museums, zoos, etc.) have an important role to play, because people come for the experience, which must be enjoyable, but they do expect to learn something, and they trust museums and zoos to provide them with accurate information. This project focusses on the role of interpreters and volunteers, who are important because they tend to be more passionate and more knowledgeable, and come into contact with many people. But many interpreters are not yet comfortable talking about issues around climate change. They need help, and training. Interpretation isn’t just transmission of information: it’s about translating science in a way that’s meaningful and resonates with an audience, and it requires a systems perspective. The strategy adopted by this project is to begin with audience research, to understand people’s interests and passions; connect this with the cognitive and social sciences on how people learn, and how they make sense of what they’re hearing; and finally to make use of strategic framing, which gets away from the ‘crisis’ frame that dominates most news reporting (on crime, disasters, fires), but which tends to leave people feeling overwhelmed, leading them to treat it as someone else’s problem. Thinking explicitly about framing allows you to connect information about climate change with people’s values, with what they’re passionate about, and even with their sense of self-identity. The website climateinterpreter.org describes what they’ve learnt so far.
    (As an aside, Bill points out that it can’t just be about training the interpreters – you need institutional support and leadership, if they are to focus on a controversial issue. Which got me thinking about why science museums tend to avoid talking much about climate change – it’s easy for the boards of directors to avoid the issue, because of worries about whether it’s politically sensitive, and hence might affect fundraising.)
  • The WorldViews Network. Rachel Connolly from Nova/WGBH presented this collaboration between museums, scientists and TV networks. Partners include planetariums and groups interested in data visualization, GIS data, and mapping, many from an astronomy background. Their 3-pronged approach, called TPACK, identifies three types of knowledge: technological, pedagogical, and content knowledge. The aim is to take people from seeing, to knowing, to doing. For example, they might start with a dome presentation, but bring into it live and interactive web resources, and then move on to community dialogues. Storylines use visualizations that move seamlessly across scales: cosmic, global, bioregional. They draw a lot on the ideas from Rockstrom’s planetary boundaries, of which they’re focussing on three: climate, biodiversity loss and ocean acidification. A recent example from Denver, in May, focussed on water. On the cosmic scale, they look at where water comes from as planets are formed. They eventually bring this down to the bioregional scale, looking at the rivershed for Denver, and the pressures on the Colorado River. Good visual design is a crucial part of the project. (Rachel showed a neat example of a visualization of the size of water on the planet, comparing all water with fresh water and frozen water. Another fascinating example was a satellite picture of the border of Egypt and Israel, where the different water management strategies produce a starkly visible difference either side of the border. She also recommended Sciencecafes.org and the Buckminster Fuller Challenge.)
  • ClimateCommunication.org. There was a lot of talk through the week about this project, led by Susan Hassol and Richard Somerville, especially their recent paper in Physics Today, which explores the use of jargon, and how it can mislead the general public. The paper went viral on the internet shortly after it was published, and they used an open google doc to collect many more examples. Scientists are often completely unaware that non-specialists attach different meanings to jargon terms, which can then become a barrier to communication. My favourite examples from Susan’s list are “aerosol”, which to the public means a spray can (leading to a quip by Glenn Beck, who had heard that aerosols cool the planet); ‘enhanced’, which the public understands as ‘made better’, so the ‘enhanced greenhouse effect’ sounds like a good thing; and ‘positive feedback’, which also sounds like a good thing, as it suggests a reward for doing something good.
  • Finally, slightly off topic, but I was amused by the Union of Concerned Scientists’ periodic table of political interferences in science.

During a break in the workshop last week, Cecilia Bitz and I managed to compare notes on our undergrad courses. We’ve both been exploring how to teach ideas about climate modeling to students who are not majoring in earth sciences. Cecilia’s course on Weather and Climate Prediction is a little more advanced than mine, as it had a few math and physics pre-requisites, while mine was open to any first year students. For example, Cecilia managed to get the students to run CAM, and experiment with altering the earth’s orbit. It’s an interesting exercise, as it should lead to plenty of insights into connections between different processes in the earth system. One of the challenges is that earth system models aren’t necessarily geared up for this kind of tinkering, so you need good expertise in the model being used to understand which kinds of experiments are likely to make sense. But even so, I’m keen to explore this further, as I think the ability to tinker with such models could be an important tool for promoting a better understanding of how the climate system works, even for younger kids.

Part of what’s missing is elegant user interfaces. EdGCM is better, but still very awkward to use. What I really want is something that’s as intuitive as Angry Birds. Okay, so I’m going to have to compromise somewhere – nonlinear dynamics are a bit more complicated than the flight trajectories of avian slingshot.
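To illustrate the kind of tinkering I have in mind, here’s a minimal zero-dimensional energy balance model (my own toy sketch with rough but plausible parameter values, not code from EdGCM or any real model). Hooking its parameters up to a couple of sliders would already make a playable toy:

```python
# A zero-dimensional energy balance model of the earth: absorbed
# sunlight in, infrared radiation out, stepped forward until the
# surface temperature settles. A toy sketch, not any real model's code.
S0 = 1361.0       # solar constant, W/m^2
ALBEDO = 0.3      # fraction of sunlight reflected back to space
EPSILON = 0.61    # effective emissivity (crude stand-in for the greenhouse effect)
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W/m^2/K^4
HEAT_CAP = 4.0e8  # effective heat capacity of the surface, J/m^2/K

def run(temp=273.0, years=200, albedo=ALBEDO, epsilon=EPSILON):
    """Step the model forward one year at a time; return final temperature (K)."""
    dt = 365.25 * 24 * 3600  # one year, in seconds
    for _ in range(years):
        absorbed = S0 * (1 - albedo) / 4.0     # sunlight, averaged over the sphere
        emitted = epsilon * SIGMA * temp ** 4  # outgoing infrared
        temp += (absorbed - emitted) * dt / HEAT_CAP
    return temp

print(f"{run() - 273.15:.1f} C")             # settles near ~15 C with these values
print(f"{run(albedo=0.32) - 273.15:.1f} C")  # a brighter planet ends up cooler
```

Even this caricature lets you ask the right kinds of questions: what happens if the planet gets brighter, or the greenhouse effect gets stronger, and how long does it take to settle?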

But that’s not all – someone needs to figure out what kinds of experiments students (and school kids) might want to perform, and provide the appropriate control widgets, so they don’t have to mess around with code and scripts. Rich Loft tells me there’s a project in the works to do something like this with CESM – I’m looking forward to that! In the meantime, Rich suggested two examples of simple simulations of dynamical systems that get closer to what I’m looking for:

  • The Shodor Disease model that lets you explore the dynamics of epidemics, with people in separate rooms, where you can adjust how much they can move between rooms, how the disease works, and whether immunization is available. Counter-intuitive lesson: crank up the mortality rate to 100% and (almost) everyone survives!
  • The Shodor Rabbits and Wolves simulation, which lets you explore population dynamics of predators and prey. Counter-intuitive lesson: double the lifespan of the rabbits and they all die out pretty quickly!
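Under the hood, simulations like these are just a few coupled equations stepped forward in time. Here’s a minimal predator-prey sketch in the same spirit as Rabbits and Wolves (my own toy Lotka-Volterra model with made-up parameter values, not Shodor’s actual implementation):

```python
# A minimal Lotka-Volterra predator-prey model, integrated with small
# Euler steps. A toy in the spirit of Shodor's Rabbits and Wolves,
# not their actual code; all parameter values are made up.
def simulate(rabbits=10.0, wolves=5.0, steps=20000, dt=0.001,
             birth=1.1, predation=0.4, wolf_gain=0.1, wolf_death=0.4):
    """Return the two population trajectories as lists."""
    r_hist, w_hist = [rabbits], [wolves]
    for _ in range(steps):
        dr = birth * rabbits - predation * rabbits * wolves   # rabbit growth minus predation
        dw = wolf_gain * rabbits * wolves - wolf_death * wolves  # wolf growth minus deaths
        rabbits += dr * dt
        wolves += dw * dt
        r_hist.append(rabbits)
        w_hist.append(wolves)
    return r_hist, w_hist

r, w = simulate()
print(f"rabbits ranged from {min(r):.2f} to {max(r):.2f}")  # both populations oscillate
```

Tinkering with the parameters is exactly the kind of experiment I have in mind: nudge the rabbit birth rate or the wolf death rate and watch the cycles change, and sometimes discover the counter-intuitive behaviour the Shodor lessons highlight.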