An interested
reader sent me a thoughtful email in response to last
week’s post on global warming. The reader asked a number of questions and
expressed some concerns. I welcome feedback of this sort because it helps me
see how my thoughts and attempted explanations are received, and gives me the
opportunity to learn from others. In this week’s post I will attempt to clarify
and elaborate on some important points by answering questions paraphrased from
that correspondence.
Question 1:
Even if only a small percentage of models now being used predict very serious
consequences of global warming for public health, or drought, or forest fires,
or other types of ecosystem damage, or extinctions, etc., doesn't that imply we
would be wise to take action on these models just in case they might be right?
Response 1:
This question gets at a key attribute of how people respond to uncertainty:
what to do about high impact but low probability events? As I have stated
before, we are wired to have one of two responses to uncertainty: to ignore it
or to overreact. But what would a thoughtful, rational response be? It would
depend on the degree of impact and probability of occurrence, but also on the
cost of actions to reduce the risk.
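To make that trade-off concrete, here is a minimal sketch (in Python, with entirely hypothetical numbers of my own invention) of the expected-cost comparison I have in mind:

```python
# Minimal sketch with entirely hypothetical numbers: weighing a rare,
# high-impact risk against the cost of reducing it.

def expected_loss(probability, impact):
    """Expected loss = chance of the bad outcome times its cost if it occurs."""
    return probability * impact

# Hypothetical: a 1-in-1,000 chance of a $10 million loss.
do_nothing = expected_loss(0.001, 10_000_000)          # $10,000 expected loss

# Hypothetical mitigation: halves the probability, costs $3,000 up front.
mitigate = expected_loss(0.0005, 10_000_000) + 3_000   # $8,000 total

print(f"Do nothing: expected cost ${do_nothing:,.0f}")
print(f"Mitigate:   expected cost ${mitigate:,.0f}")
# With these numbers the mitigation is worth it; raise its price tag past
# $5,000 and it no longer is. The answer lives in all three inputs.
```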
As an example, consider airplane design and maintenance. Airplane
crashes are rare. Per mile, airline passengers are 50 times less likely to be
killed than people traveling by car. When traveling by plane, people tend either to ignore the risk of a crash or to be overly troubled by it, e.g., showing far more anxiety than they would if traveling by car and facing equivalent or greater dangers (since a plane covers roughly ten times as many miles per hour as a car, the chances of dying at any given moment in the two travel modes are much closer than the per-mile comparison suggests).
Nevertheless, there is a risk of crashing, and airlines, airplane
manufacturers, and government agencies have the opportunity to make rational
decisions about how to handle that risk. Why not make an indestructible plane,
or as comedian Steven Wright put it, “Why don't they make the
whole plane out of that black box stuff?”
Engineers
will give you a straight answer to this question, emphasizing impracticality
because of the weight of “black box stuff.” I, however, am skeptical that the
weight itself is an insurmountable obstacle. After all, a 747 is designed to
take off at a maximum weight of nearly 1 million pounds, and an Airbus A380 at
nearly 1.3 million pounds. It may be impossible to design the entire plane
using black box technology, but I imagine that the interior of a 747 or A380
could be gutted and refitted with a small black box-like cabin suitable for
just a few passengers who would be protected from nearly every form of harm.
The reason we dismiss this possibility is the cost. Operating costs for such a
large plane are enormous and usually shared by the hundreds of paying
passengers onboard. Not many people would pay 100 times the going price of
airfare to reduce already low chances of dying in a plane crash to near zero. Airplane maintenance schedules raise the same issues. More frequent and more extensive inspections would improve safety but would drive up costs. In the end, airplane maintenance schedules represent a rational balance between the two.
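To see why, here is a back-of-the-envelope check with assumed, round numbers (none of these figures come from the post itself):

```python
# Back-of-the-envelope check with assumed, round numbers.
fatal_risk_per_flight = 1 / 10_000_000   # rough order of magnitude for modern commercial aviation
value_of_a_life = 10_000_000             # a commonly used, and debatable, planning figure in dollars

# What eliminating the crash risk on a single flight is worth in expectation:
risk_worth = fatal_risk_per_flight * value_of_a_life
print(f"Eliminating the risk is worth roughly ${risk_worth:.2f} per flight")

# The extra cost of a hypothetical 100x fare on a $400 ticket:
premium = 100 * 400 - 400
print(f"The armored-cabin premium would be about ${premium:,}")
# The premium exceeds the benefit by a factor of tens of thousands,
# which is why the black-box cabin stays a thought experiment.
```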
This
sort of analysis doesn’t encourage the elimination of every risk. Instead it
encourages balance. This balance is what we should be striving for
with respect to global warming. The real question about global warming is
whether the benefits of reduced risks outweigh the costs of taking action today.
In order to answer that question, we must highlight scientific uncertainty and
examine it carefully with public costs and benefits in mind, which is the
approach I advocated last week. I personally believe that we should be enacting
policies to address the potential damage from global warming. If we perform and
emphasize analyses of climate change like the airplane design and maintenance examples described
above, we will disempower unwavering global warming skeptics and foster the
development of rational policies.
Question 2:
To the scientists I know and admire, uncertainty is a challenge, a focus, and
something to highlight. Isn't this what scientists are especially proud to do?
What are you implying: that scientists are uncharacteristically uncomfortable with uncertainty, and downplay it, when it comes to global warming; that
science is easily undermined by self-interest and discomfort with uncertainty
in general; or that there is something unusual about climate change that makes
it more difficult to study in objective scientific ways?
Response 2:
Last week, I focused on an explicit strategy chosen by climate scientists to
downplay disagreements among themselves when addressing the public about global
warming. In no way does this strategy apply to debates among scientists in
scientific arenas. They chose this strategy with the good intentions of
breaking out of the dueling expert media formula described last week, and of conveying
their genuine concerns to the public. This strategy had a cost, though: it framed the public debate around the certainty of global warming’s existence, when I believe a debate focused on the range of possible climate change outcomes and the potential costs of avoiding the more extreme possibilities would have been more productive.
However, climate
scientists are not alone in their struggle with how to portray uncertainty to
non-scientists. In fisheries, scientists are often asked to recommend fishing
quotas for the next year. This is a daunting task because our current
understanding of the status of the fish population is always uncertain, our
estimates of how many new recruits will be added to the population are even more uncertain and, in many cases, we are asked to give this advice without
a clear idea of how the following year’s quota might be adjusted in response to
new information (e.g., not at all versus the
rocket science approach). Furthermore, various sectors of society, whether
they be different fishing fleets or non-fishing interests, will have different
opinions as to what quota, or quota system, will be best.
[Figure: an example of error bars, shown in red]
Typically, fishery scientists will address uncertainties in two ways.
First, they will recommend a quota but will bound it with error bars, a
graphical technique that shows a range of values likely to contain the correct
answer. Error bars are an honest attempt to convey uncertainty. It is my
experience, though, that managers often view them skeptically as both an
admission that fishery scientists don’t know the right answer, and as latitude
to choose any quota value within the range of the bars.
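For readers who have not worked with error bars, here is a minimal sketch (in Python, with invented survey numbers and a placeholder quota rule, not how any real stock assessment works) of how a recommendation and its uncertainty range might be produced:

```python
# Minimal sketch: turning noisy survey estimates of fish abundance into a
# recommended quota with an uncertainty range (error bars). Numbers are invented.
import math
import statistics

# Hypothetical abundance estimates (thousands of tonnes) from repeated surveys
surveys = [142.0, 131.0, 158.0, 149.0, 137.0]

mean = statistics.mean(surveys)
sem = statistics.stdev(surveys) / math.sqrt(len(surveys))  # standard error of the mean

# Placeholder quota rule: harvest 20% of estimated abundance.
harvest_rate = 0.20
quota = harvest_rate * mean
low, high = harvest_rate * (mean - 2 * sem), harvest_rate * (mean + 2 * sem)

print(f"Recommended quota: {quota:.1f} (range {low:.1f} to {high:.1f}) thousand tonnes")
# The range is the honest part; the trouble described above is that managers
# may read it as permission to pick any number inside it.
```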
Second, fishery
scientists, along with climate scientists and every other scientist I’ve ever met, will talk at length about
the uncertainties in their field, but from a scientific perspective focused on the
frontiers of discovery. This framing of uncertainty does not translate directly
into currencies of relevance to interest groups and policy makers. Whereas
scientists often express the need to dumb down the science for policy, in
reality it must be translated from the nuanced and complex scientific world to
the equally but differently nuanced and complex policy world.
In sum, it’s not
that scientists are self-interested or uncomfortable with uncertainty. It’s
that they do not have the expertise or do not make the effort to express
scientific uncertainty in useful ways for policy makers. In order to craft smarter
policies, we need more emphasis in the policy process on bridging the gaps
among scientific disciplines and especially the gap between the scientific and
policy worlds. Addressing uncertainty more explicitly is a key step in doing
so.
With many thanks for these good
questions,
Josh