Climate models are vastly more sophisticated than they were 30 years ago, but they still can't say exactly how much the temperature will rise by 2100. Depending on the assumptions modelers make, the likely increase (assuming emissions of greenhouse gases keep rising) ranges from about 3°F to 8°F. But a study released Thursday in Science argues that the warming will probably wind up on the higher end of that range.
The reason, said lead author John Fasullo of the National Center for Atmospheric Research, "has mostly to do with clouds." Climate scientists have long known that, depending on how they respond to a warming world, clouds could amplify the heat-trapping effect of carbon dioxide a little or a lot: they can reflect sunlight back into space, trap extra heat, or both. Despite years of hard work by some of the smartest people around, however, nobody has figured out which way the balance will tip.
That's no surprise, according to Fasullo. "If you look out the window, it's clear that clouds are very complicated," he said. "They have nuanced shapes, they can vary in how broad and how thick they are, and how dark or bright they are in color. If you look at the droplets they're made of, the size can vary, and that changes the problem entirely."
Those complexities not only make clouds hard to observe in detail, even with satellites, but also make them hard to simulate in climate models.
So rather than focus on the clouds themselves, Fasullo and co-author Kevin Trenberth looked at the environment in which clouds form. In particular, they looked at relative humidity — the amount of water vapor a given patch of atmosphere is holding at a given moment, compared with the amount it could hold in theory. “It’s one of the foundations of clouds,” Fasullo said. “It determines whether a cloud exists at all.”
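The idea is simple enough to sketch in code. The following is a minimal illustration (not from the study itself) of relative humidity as the ratio of the water vapor air actually holds to the maximum it could hold at its temperature; the Magnus approximation for saturation vapor pressure and all function names here are illustrative choices, not anything from the paper.

```python
import math

def saturation_vapor_pressure(temp_c):
    """Approximate saturation vapor pressure (hPa) over water at a
    given temperature in Celsius, using the Magnus approximation."""
    return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

def relative_humidity(vapor_pressure_hpa, temp_c):
    """Relative humidity (%): the vapor pressure the air actually has,
    divided by the most it could hold at this temperature."""
    return 100.0 * vapor_pressure_hpa / saturation_vapor_pressure(temp_c)

# Air at 25 C holding 15 hPa of water vapor is well below saturation,
# so clouds are unlikely to form in it.
print(round(relative_humidity(15.0, 25.0), 1))
```

At 100 percent relative humidity the air can hold no more vapor, which is the condition that lets cloud droplets condense; that is why Fasullo calls it "one of the foundations of clouds."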
Unlike clouds themselves, he said, “we have very good observations of relative humidity from NASA satellites, so we can evaluate how good a job models do of simulating it.”
It turns out that, in general, the climate models that do the best job of reproducing real-world relative-humidity patterns also tend to project higher temperatures.
Fasullo said the reason is that "these models produce dry zones in the subtropics." That presumably leads to fewer clouds, allowing more sunlight in to warm the ocean and driving up global average temperatures.
“We care most about the tropics and subtropics because sunlight is most direct and strongest there,” he said, explaining that changes in cloud cover at higher latitudes don’t make as much of a difference to world temperatures.
Fasullo and Trenberth's end-run around the uncertainties in cloud behavior is undeniably creative, but that doesn't mean they've put the problem to bed.
“I’d be the first to say we haven’t solved this question [of uncertainties in future temperatures],” Fasullo said. “But it’s a festering problem that a lot of people have looked at. The general reception we’re getting from colleagues is that it’s a creative way of looking at it.”
The initial feedback bears him out. “It’s a very clever idea,” said Andrew Dessler, a climate scientist at Texas A&M University who studies clouds, among other things. “And it may well be right. They’re sensible people, and I have a lot of respect for them. But more work is needed to flesh out the details.”
In a commentary also appearing in Science, Karen Shell of Oregon State University wrote that Trenberth and Fasullo’s approach “is an encouraging step that links observations to climate sensitivity” — that is, to the amount of heating a doubling of CO2 will cause.
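To make "climate sensitivity" concrete: the extra energy trapped by added CO2 is commonly approximated as growing with the logarithm of its concentration, so each doubling adds roughly the same forcing. The snippet below is a back-of-the-envelope sketch using that standard simplified expression from the climate literature; the function name and the example concentrations (280 ppm preindustrial, 560 ppm doubled) are our illustrative choices.

```python
import math

def co2_forcing(c_ppm, c0_ppm):
    """Approximate radiative forcing (W/m^2) from changing CO2
    concentration from c0_ppm to c_ppm, using the widely used
    logarithmic approximation with a 5.35 W/m^2 coefficient."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# A doubling of CO2 (e.g., 280 ppm -> 560 ppm) yields about
# 3.7 W/m^2 of extra forcing. Climate sensitivity is the amount
# of eventual warming that forcing produces -- the quantity that
# cloud behavior makes so hard to pin down.
print(round(co2_forcing(560.0, 280.0), 2))
```

The forcing itself is relatively well constrained; what Shell's commentary and the new study are wrestling with is how much warming per unit of forcing the planet delivers once feedbacks like clouds kick in.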
But the case isn’t ironclad: relative humidity is clearly related to cloud formation, she noted, but there could be plenty of other factors that are nearly as important.
"In retrospect, this could turn out to have been a breakthrough," Dessler said. "But we won't know that for a while."