The Future of Climate Change: The current state of climate change science

Earlier this week, the public learned the details of the upcoming fifth assessment report (or “AR5”) of the United Nations’ Intergovernmental Panel on Climate Change, an international body whose mandate is not to do new science, but to assess the state of the existing science. IPCC assessments cover a lot of ground, from the basics of physical science to adaptation and mitigation. Among other findings, the AR5 is expected to report that climate change is almost certainly caused by human activities, that the sea level is expected to rise more than previously estimated, and that the best-case-scenario global temperature increase could be lower than previously estimated. We asked Auroop Ganguly, a civil and environmental engineering associate professor whose expertise lies in climate change and extreme weather, to discuss the report and what its conclusions mean for developing long-term solutions.

 

How do scientists determine whether temperature rise is a result of human activities, and why has their certainty increased over the last five years?

The scientific consensus typically relies on multiple lines of evidence. However, the most common approach that ultimately leads to assigning numbers is based on what is called “fingerprinting.” One way to think of fingerprinting is as a set of model-driven, physics-guided, and statistically based approaches that attempt to delineate just how much of the warming is a result of human-induced emissions versus how much may be attributed to natural variability. Climate models are run over historical time periods with and without anthropogenic greenhouse gas emissions, and then compared with each other and with observations. Statistical methods attribute temperature trends and deviations to natural variability, and then delineate the portion that can be explained only when human emissions are taken into account.
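To make the logic of fingerprinting a bit more tangible, here is a minimal, purely illustrative Python sketch: it builds synthetic “natural-only” and “all-forcings” ensembles and asks how often natural variability alone produces a warming trend as large as a synthetic “observed” one. All numbers, variable names, and the simple trend test are assumptions for illustration; actual detection-and-attribution studies use spatial fingerprint patterns from full climate-model ensembles and more sophisticated statistical regression.

```python
# Toy illustration of detection-and-attribution logic; synthetic numbers only.
import numpy as np

rng = np.random.default_rng(0)
n_runs, n_years = 200, 60

# Ensemble of model runs with natural forcings only: no long-term trend,
# just internal variability (standard deviation of 0.15 deg C, an assumption).
natural_only = rng.normal(0.0, 0.15, size=(n_runs, n_years))

# Ensemble with anthropogenic emissions included: same variability plus an
# assumed warming trend of 0.02 deg C per year.
trend = 0.02 * np.arange(n_years)
all_forcings = natural_only + trend

# Synthetic "observations": the forced trend plus one draw of variability.
observed = trend + rng.normal(0.0, 0.15, size=n_years)

def linear_trend(series):
    """Least-squares slope(s) in deg C per year."""
    years = np.arange(series.shape[-1])
    return np.polyfit(years, series.T, 1)[0]

obs_slope = linear_trend(observed)
natural_slopes = linear_trend(natural_only)
forced_slopes = linear_trend(all_forcings)

# How often does natural variability alone produce a trend at least this large?
print(f"observed trend: {obs_slope:.3f} deg C per year")
print(f"natural-only runs with a trend that large: {np.mean(natural_slopes >= obs_slope):.2%}")
print(f"all-forcings runs with a trend that large: {np.mean(forced_slopes >= obs_slope):.2%}")
```

In this toy setup, trends as steep as the observed one are rare in the natural-only ensemble but common in the all-forcings ensemble, which is the intuition behind attributing warming to human emissions.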

The underlying observations and models, as well as the statistical methods and our understanding of the physics, have been steadily improving. Hence, we see the increase in certainty over the years. However, large uncertainties remain. One persistent issue has been the impact of clouds. Just last year, two top climate scientists published papers with differing results for climate sensitivity, that is, how much additional warming to expect with a doubling of atmospheric carbon dioxide. Scientists and science communicators are faced with the daunting task of conveying the basic messages and their relevance clearly and convincingly, without either de-emphasizing or over-emphasizing the uncertainties. The IPCC is tasked with a difficult job.
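As a rough aid for interpreting “climate sensitivity,” the short sketch below applies the commonly used assumption that equilibrium warming scales with the logarithm of the CO2 concentration ratio. The sensitivity values are illustrative of the widely cited range and are not taken from the two papers mentioned above.

```python
# Illustrative only: what different climate-sensitivity values imply.
# Assumes warming grows logarithmically with the CO2 ratio; values are
# representative of the commonly cited range, not any specific study.
import math

def warming(co2_ratio, sensitivity_per_doubling):
    """Equilibrium warming (deg C) if each doubling of CO2 adds
    `sensitivity_per_doubling` deg C."""
    return sensitivity_per_doubling * math.log2(co2_ratio)

for s in (1.5, 3.0, 4.5):  # low / central / high sensitivity, deg C per doubling
    print(f"sensitivity {s} deg C per doubling: "
          f"doubling CO2 -> {warming(2.0, s):.1f} deg C, "
          f"+50% CO2 -> {warming(1.5, s):.1f} deg C")
```

The point of the spread is simply that the same emissions pathway implies quite different amounts of warming depending on which sensitivity estimate turns out to be right.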

The report notes that a key challenge for climate scientists is making reliable predictions about changes at the local scale. Why is this more difficult than making global predictions?

Certainly it is easy for one’s eyes to glaze over at a few degrees of warming. Here in Boston, we can see larger temperature fluctuations in a single day, and sometimes in a few hours, than the increase in global average temperature over the last decade or even century. Once we realize that even a few degrees of increase in global average temperature may lead to hotter heat waves, may not offer much respite from the more intense cold snaps, and may intensify heavy precipitation across multiple regions of the world, we begin to comprehend the nature of the problem. This is why some would prefer the phrase “global weirding” to global warming. When we start to think of what these changes could mean in terms of catastrophic consequences, particularly in urban areas, we can understand the implications. Regional and even local projections are needed to translate climate change knowledge into actionable information. This includes things such as how to design buildings, protect transportation networks and infrastructure, enhance dams and reservoirs, safeguard nuclear power plants from floods, develop sensor-based early warning systems, and make coastal cities more resilient to hurricanes and storm surge.

At the risk of over-simplification, the lack of predictability arises from three primary factors. First, our physical understanding of the fine-scale processes of interest, and our ability to encode them within computer models, are limited. These include processes such as convection, which may cause heavy precipitation and consequent flash floods, as well as the formation and evolution of tropical cyclones or hurricanes. Second, averages are often easier to predict statistically, just as one coin toss is a game of luck while 1,000 tosses are likely to yield about 500 heads. Advances in physics and computer models need to go hand in hand with more sophisticated statistical analysis and data-driven methods, and both require more and higher-quality data. Third, planning horizons for adaptation decisions are typically not much beyond a couple of decades, which complicates matters further. At these time scales, the intrinsic variability of the nonlinear climate system may be hard to separate from a change signal. This intrinsic variability has many manifestations and often relates to extreme sensitivity to initial conditions. Under such circumstances, the uncertainties need to be characterized in a comprehensive manner before the climate projections can be used for planning or adaptation purposes.
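The coin-toss analogy can be checked directly with a short simulation; the experiment counts and toss counts below are arbitrary choices meant only to show how the spread of the average shrinks as the number of tosses grows.

```python
# Why averages are easier to predict than individual outcomes: the spread of
# the fraction of heads narrows as the number of tosses per experiment grows.
import numpy as np

rng = np.random.default_rng(42)
n_experiments = 10_000  # repetitions of each coin-tossing experiment

for n_tosses in (1, 10, 1_000):
    # Each row is one experiment of n_tosses fair coin flips (0 = tails, 1 = heads).
    heads_fraction = rng.integers(0, 2, size=(n_experiments, n_tosses)).mean(axis=1)
    print(f"{n_tosses:>5} tosses: mean fraction of heads = {heads_fraction.mean():.3f}, "
          f"spread (std) = {heads_fraction.std():.3f}")
```

A single toss is all-or-nothing, while 1,000 tosses cluster tightly around 50 percent heads, which is the same reason long-term averages are more predictable than the individual extreme events that matter locally.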

What implications does this challenge have on developing long-term solutions?

The implications are serious. The effects of Superstorm Sandy in the New York/New Jersey area are a case in point. However, given the uncertainties at local to regional scales, and the fact that adaptation measures need to be taken urgently, there is a danger of making decisions that are sub-optimal. This is especially true when resources are limited, since resilience measures for, say, high winds, large precipitation rates, or major storm surges may need to differ. As another example, consider vulnerable regions of the world where precipitation patterns directly relate to food and water security as well as flood hazards, leading in turn to severe loss of life and property. A prediction of consistent and continuous intensification of rainfall extremes may lead to more spending on uniform flood-hazard preparedness. However, a prediction of increasing variability in these extremes may necessitate a better understanding of adaptive management, not only for flood-hazard preparedness but also for water harvesting and distribution policy and infrastructure. These are situations where understanding the uncertainties remains critical; in fact, quick-and-dirty solutions, however tempting or well-meaning, can have drastic negative consequences rather than saving lives or reducing damage to property.

Thus, long-term development needs to occur in areas such as comprehensive uncertainty characterization; in bringing to bear the collective power of models, data, physics, and statistics to reduce uncertainty where possible; and in striking a balance between developing urgent, proactive solutions and exercising caution so that the proposed remedies do not end up hurting rather than helping. Extracting insights despite the uncertainties, and managing in an adaptive fashion that balances the various constraints, need to be urgent national and societal priorities going forward.

 
