How can scientists better explain climate risks?

"Preparedness can save an awful lot of lives and an awful lot of money, so it's the mindset we all need to develop when it comes to climate change," says Baylor Fox-Kemper. (Credit: Getty Images)

Climate scientists have long struggled to find the best ways to present crucial facts about future sea level rise to policymakers, stakeholders, and the general public, according to a new study.

On a positive note, however, that ability has started to improve in recent years.

The researchers analyzed decades of language and graphics used in the United Nations’ Intergovernmental Panel on Climate Change (IPCC) climate assessment reports, highlighting areas of success and identifying areas where language can be improved.

This includes the language used to communicate the uncertainty surrounding future sea level projections, which the analysis found has often been oversimplified or confusing in past reports and could lead policymakers to underestimate possible outcomes and weaken plans meant to counter some of the worst effects of rising waters.

Baylor Fox-Kemper, a professor of earth, environmental, and planetary sciences at Brown University, is a coauthor of the study published in Nature Climate Change and also the lead author of the oceans, ice, and sea-level rise chapter in the IPCC’s Sixth Assessment Physical Science Basis Report.

Here, he shares details about key findings from the latest study and why it’s so difficult to prompt urgent action when communicating about climate change:

Q

For this study, the research team set out to review language and graphics about sea level rise used in climate reports between 1990 and 2021. What were you looking to determine?

A

Scientists who work on these kinds of reports are trying to communicate as clearly as possible the latest science so that decision makers can make policies. Scientists are not policymakers or politicians or philosophers, and keep in mind the IPCC doesn’t make policy recommendations. In fact, typically scientists are kind of bad at recommending policy—but what we’re very good at is presenting information that’s useful for making policy.

What happens is you then have this communication challenge: sometimes policymakers want to know things that we didn’t tell them about, or they might misinterpret an unclear figure. When that happens, the whole process falls apart. It means action isn’t taken, or not enough action is taken, and sometimes money and resources are simply wasted. There is a continual refinement of communication tools and of the best methods we have for improving the science-to-policy pipeline. That’s what this paper is really about. We say here’s something we didn’t do right, and here’s where we think we could have done better.

Q

Based on the findings, what has worked well over the last three decades?

A

In the first IPCC reports, the questions they were asking were: “Is climate change a real thing? Has it been detected? How big is it going to get in the future?” We’ve gotten quite good at answering some of those questions. It is a thing that can be detected—in a lot of different ways, in fact. Those communication points are getting very clear.

What’s to come in the future, on the other hand, relies on models and other tools. There’s been a lot of evolution in how well we understand and present models. A big innovation has been using an ensemble of models instead of a single model. An ensemble takes different (or similar) starting points and factors in different conditions across its members, and we use the resulting spread to quantify our ignorance and give an impression of possible outcomes.
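To make the ensemble idea concrete, here is a minimal sketch of how a spread of model projections can be summarized; the numbers are invented for illustration and are not from the study or any IPCC report.

```python
# A minimal sketch of summarizing an ensemble spread; the projection numbers
# below are made up for illustration and are not from the study or the IPCC.
import numpy as np

# Hypothetical end-of-century sea level rise projections (meters) from
# several models run under the same emissions scenario.
ensemble = np.array([0.45, 0.48, 0.55, 0.58, 0.62, 0.66, 0.70, 0.74])

median = np.median(ensemble)
# The 17th-83rd percentile span is one common convention for a "likely"
# (roughly two-in-three chance) range in IPCC-style reporting.
low, high = np.percentile(ensemble, [17, 83])

print(f"median projection: {median:.2f} m")
print(f"likely range: {low:.2f} m to {high:.2f} m")
```

The key point is that the reported range comes from the disagreement among models, which is exactly why it cannot capture factors the models leave out.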

Q

The paper details the need to improve language and graphics communicating uncertainty. What is uncertainty when it comes to climate projections, and why is it so important in communicating climate science?

A

In previous reports, the way sea level rise was presented didn’t capture what’s called deep uncertainty. This is the type of uncertainty whose likelihood you can’t quantify. For example, in an ensemble, one model gives you the high end of the projections, another is a bit higher or lower, and others range in between. But that spread still doesn’t account for other factors that you know, or are pretty sure, exist but can’t quantify. That’s why we call it deep uncertainty: it’s something we think is physically plausible, but we don’t know if it’s included in the right way in the models.

The problem then becomes how to communicate deep uncertainty in a way that’s useful. There are different kinds of uncertainty, too. Some years are El Niño years—that changes things. We don’t know what humans are going to do, so we make different scenarios of what humans might choose and put those together in a model. Deep uncertainty lies in the “known unknowns”—we know they’re important, but it’s hard to quantify their impact by the standard methods. Another category that always intrigues me is the “unknown knowns”—in other words, the assumptions we’re making that we don’t even know we’re making.

This all matters when it comes to the actions or policies that follow from our reports and their projections. For example, say we project 1 meter of sea level rise, or possibly as much as a meter more than that, and a seawall gets built to 1.25 meters (or whatever measurement we give). Everyone will be very upset when the rise is actually 2 meters, because we couldn’t effectively factor in and communicate the deep uncertainties that could lead to more sea level rise than we were able to quantify by the standard methods.
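As a toy illustration of that point (the numbers here are mine, not the report’s), a short sketch of how a defense designed only to the quantified range fares if deep uncertainty pushes the real outcome higher:

```python
# A toy illustration with invented numbers (mine, not the report's) of how a
# defense designed only to the quantified likely range fares if deep
# uncertainty pushes the real outcome higher.
central_projection_m = 1.0      # central estimate of rise that gets reported
likely_high_m = 1.25            # top of the quantified likely range
deep_uncertainty_high_m = 2.0   # plausible high-end outcome we can't quantify

seawall_height_m = 1.25         # seawall built to the quantified range only

for outcome_m in (central_projection_m, likely_high_m, deep_uncertainty_high_m):
    status = "still protects" if seawall_height_m >= outcome_m else "is overtopped"
    print(f"with {outcome_m:.2f} m of rise, the {seawall_height_m:.2f} m seawall {status}")
```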

Q

Can you expand on the consequences of this challenge?

A

Coming from a scientific perspective, we have a danger of false positives and a danger of false negatives. We might convey something as being a danger that really isn’t, and people might design policy around it, spending a lot of money, exerting a lot of resources, or making people move, and then it turns out not to happen. That would obviously be inefficient, and we would lose public trust.

Then we have the false negative, where there are unforeseen dangers that maybe we could have gotten ready for, or could have reduced our emissions to avoid, but we didn’t, and then we get hit. Both of those false outcomes happen. But when scientists are on target, we communicate accurately. For example, the increased likelihood of wildfires by now has been predicted since the beginning of the IPCC reports. Sea level rise has been similar.

Q

You’ve worked on major climate reports and have seen many others that have projected catastrophic outcomes. What makes it so hard for people to truly come together and react to these types of warnings?

A

In some sense, it’s not in your face. It’s not like a meteor that’s going to hit Earth, as in the movie Don’t Look Up. People ignored it in the movie, sure, but there’s an endpoint when it arrives: you can see it above you, and then it happens all at once. Climate isn’t like that. The risks are shifting. Things that weren’t previously very likely become likely. And responses take a long time and require concentrated efforts to adapt or mitigate.

For instance, switching our energy system off of fossil fuels will take decades—we’ve known that all along. So, it’s a different kind of problem. There’s a bigger consequence and it’s slower to evolve. Humans and human society and human decision making didn’t evolve to worry about things on those timescales. It’s a blind spot for all of our structures in society. Our science, however, has gotten good enough to project ahead and potentially start to think like that. We can see things coming before they get here in full force and maybe act to prevent them. That’s great, but humans have a hard time imagining the scope of things that are slow and complicated.

Q

How can that change?

A

Speaking for myself, when you see big natural disasters where there are catastrophic risks or catastrophic failures to respond, one way to see that is that nature is punishing us. But a different way to see it is that we should have been ready for it.

That’s what I think makes a good response—the level to which you get ready for something. Preparedness can save an awful lot of lives and an awful lot of money, so it’s the mindset we all need to develop when it comes to climate change. When I think of what climate science is built to do, it’s meant to make us better at that optimization problem. We’re really trying to get ahead of the problems and develop appropriate responses. In a lot of ways, it comes back to communicating uncertainty. It’s why we spend a huge amount of time trying to get precise about what we don’t know.