
WPS6193

Policy Research Working Paper 6193



Investment Decision Making Under Deep Uncertainty
Application to Climate Change

Stéphane Hallegatte
Ankur Shah
Robert Lempert
Casey Brown
Stuart Gill

The World Bank


Sustainable Development Network
Office of the Chief Economist
September 2012

Abstract
While agreeing on the choice of an optimal investment decision is already difficult for any diverse group of actors, priorities, and world views, the presence of deep uncertainties further challenges the decision-making framework by questioning the robustness of all purportedly optimal solutions. This paper summarizes the additional uncertainty that is created by climate change, and reviews the tools that are available to project climate change (including downscaling techniques) and to assess and quantify the corresponding uncertainty. Assuming that climate change and other deep uncertainties cannot be eliminated over the short term (and probably even over the longer term), it then summarizes existing decision-making methodologies that are able to deal with climate-related uncertainty, namely cost-benefit analysis under uncertainty, cost-benefit analysis with real options, robust decision making, and climate informed decision analysis. It also provides examples of applications of these methodologies, highlighting their pros and cons and their domain of applicability. The paper concludes that it is impossible to define the “best” solution or to prescribe any particular methodology in general. Instead, a menu of methodologies is required, together with some indications on which strategies are most appropriate in which contexts.

This analysis is based on a set of interviews with decision-makers, in particular World Bank project leaders, and on a literature review on decision-making under uncertainty. It aims at helping decision-makers identify which method is more appropriate in a given context, as a function of the project’s lifetime, cost, and vulnerability.

This paper is a product of the Office of the Chief Economist, Sustainable Development Network, and part of the Green
Growth Knowledge Platform’s (GGKP’s) affiliated program on Data and Decision-Making Tools for Green Growth. The
GGKP (www.greengrowthknowledge.org) is a joint initiative of the Global Green Growth Institute, Organisation for
Economic Co-operation and Development, United Nations Environment Programme, and the World Bank. Publication
of this paper is part of a larger effort by the World Bank to provide open access to its research and make a contribution to
development policy discussions around the world. Policy Research Working Papers are also posted on the Web at http://
econ.worldbank.org. The author may be contacted at shallegatte@worldbank.org.

The Policy Research Working Paper Series disseminates the findings of work in progress to encourage the exchange of ideas about development
issues. An objective of the series is to get the findings out quickly, even if the presentations are less than fully polished. The papers carry the
names of the authors and should be cited accordingly. The findings, interpretations, and conclusions expressed in this paper are entirely those
of the authors. They do not necessarily represent the views of the International Bank for Reconstruction and Development/World Bank and
its affiliated organizations, or those of the Executive Directors of the World Bank or the governments they represent.

Produced by the Research Support Team


Investment Decision Making Under Deep Uncertainty – Application to
Climate Change
Stéphane Hallegatte, Ankur Shah, Robert Lempert, Casey Brown, Stuart Gill

Keywords: decision-making under uncertainty, investment, climate change, adaptation

JEL: D81, H54, O22, O18, Q54

Stéphane Hallegatte¹, Ankur Shah², Robert Lempert³, Casey Brown⁴, Stuart Gill²

¹ The World Bank, Sustainable Development Network, Office of the Chief Economist
² The Global Facility for Disaster Risk Reduction, The World Bank
³ RAND Corporation, Santa Monica Office
⁴ International Research Institute for Climate and Society, University of Massachusetts

The authors would like to thank Roberto Aiello, Arturo Ardila, Ken Chomitz, Patrice Dumas, Marianne
Fay, Erick Fernandez, Nagaraja Harshadeep, Abas Jha, Todd Johnson, Marcelino Madrigal, Shomik
Mehndiratta, John Nash, Niels Holm-Nielsen, Apurva Sanghi, Pascale Scandizzo, Tomas Serebrinsky,
and Xiaoping Wang for their time, insights and contributions.
1. INTRODUCTION

Deep uncertainty (n.): a situation in which analysts do not know or cannot agree on (1) the models that relate key forces that shape the future, (2) the probability distributions of key variables and parameters in these models, and/or (3) the value of alternative outcomes.

Many investment decisions have long-term consequences. Infrastructure in particular can shape development for decades or centuries, often well beyond its own lifetime, because the economic system reorganizes itself around it.

Making decisions on infrastructure therefore requires anticipating the long-term environment, needs, and constraints under which it will function. This need for anticipation brings large uncertainty into the decision-making process, for instance from demographic or economic projections. Past evidence indeed suggests that our ability to predict the future is rather limited and that these limitations need to be taken into account in the way decisions are made. An illustration of these limits is provided in Figure 1, which shows scenarios from the 1970s for US primary energy use. These scenarios were designed on the basis of past relationships that turned out to be mostly wrong: the 1973 oil shock triggered innovation and behavioral and policy changes that led to large increases in energy efficiency.

Figure 1: Scenarios created in the 1970s for primary energy use in the US, and actual use in 2000 (Source: Craig et al., 2002).

Decision-makers have always had to manage such uncertainty, and they do so by using various decision-making methodologies, from simple heuristics (e.g., adding safety margins to all design characteristics to cope with larger-than-expected extreme events) to more sophisticated methods (e.g., based on subjective probabilities and cost-benefit analysis). But today, climate change is adding another layer of deep uncertainty that makes decisions even more difficult. The possibility of rather radical changes in the environmental conditions under which infrastructure performs cannot be ruled out, and infrastructure design needs to take this possibility into account.

Facing large differences between climate models (sometimes even on the sign of precipitation changes), introducing climate change projections into project design has proved challenging, and few examples in which global change has been successfully taken into account can be identified today.

Assuming that climate change and other deep uncertainties cannot be eliminated over the short term
(and probably even over the longer term), this paper aims at summarizing existing decision-making
methodologies that are able to deal with uncertainty, and especially with climate uncertainty. It also
provides examples of application of these methodologies, highlighting their pros and cons. It aims at
helping decision-makers identify which method is more appropriate in a given context, as a function of
the project lifetime, cost, and vulnerability. This analysis is based on a set of interviews with decision-
makers, in particular World Bank project leaders, and on a literature review on decision-making under
uncertainty.

We first present the principles of decision-making under uncertainty, with some of the possible
consequences for investments, some sources of uncertainty, and how approaches to quantify deep
uncertainty have fared. We then examine trade-offs between “robustness”-focused and “optimality”-focused approaches, present an illustrative example of a robust decision process, and document some co-benefits of robustness.

Second, we present the benefits and constraints of four available methodologies. Because the
methodologies investigated – CBA under uncertainty, CBA with a Real Options approach, Robust
Decision Making, and Climate Informed Decision Analysis – all have different strengths and
applicabilities, the development of a decision tool to choose amongst them is not a simple matter.

Finally, we propose case studies. One is a case in which the underestimation of uncertainty led to design issues in a large-scale project, requiring costly retrofit investments. The other three are applications of the methodologies proposed in the paper.

2. THE PROBLEM
In his seminal 1921 work, Knight made a functional distinction between two levels of ignorance about
our uncertain future – that which can be reliably quantified (Knightian risk) and that which cannot
(Knightian uncertainty). For example, the number of car accidents each year in France, easily
calculable from ample historical data, is an example of Knightian risk, while an answer to many of the
fundamental questions regarding climate change would be neither reliable nor verifiable for many
years.

From a causal perspective, Knightian uncertainties may be either aleatory or epistemic in nature. Aleatory uncertainties are thought to be irreducible due to the nature of complex systems: they represent fundamentally complex or arbitrary behavior. Epistemic uncertainties, on the other hand, are due to the insufficiency of our models, ignorance in our choice of parameters, inaccuracy of our measurements, or other remediable shortcomings.

In this work, we refer to “deep” uncertainty as the presence of one or more of the following three
elements: (1) Knightian uncertainty: multiple possible future worlds without known relative
probabilities; (2) Multiple divergent but equally-valid world-views, including values used to define
criteria of success; and (3) Decisions which adapt over time and cannot be considered independently.

The larger a role any of these factors play in the decision(s) in question, the “deeper” the uncertainty
may be considered. Conversely, the smaller a role these factors play, the “shallower” the uncertainty.
Clearly, climate change is a fantastic example of “very deep” uncertainty – with plenty of competing
viewpoints and values, no clear probabilities within any of them, and highly interrelated decision
series over time.

2.1 Effects of climate uncertainty on investments

Increasing uncertainty.
When designing climate-sensitive investments, we are accustomed to using historical weather and climate data. Engineers use it in the design of infrastructure and buildings, the insurance industry uses it to calculate premiums and capital needs, and farmers depend on it to choose crops and plan their schedules. Even national governments base their assessments of energy security requirements on such data.

With the projected changes in climate, however, historical data is no longer as useful a guide for planning.

Ideally, we would have well-behaved climate models that allow us to produce climate statistics for the
future. Unfortunately, two problems make it impossible to provide the equivalent of historical climate
data for future climates:

- First, there is a scale misfit between what can be provided by climate models (resolution of
~50 km for physical downscaling and ~ 10 km for statistical downscaling) and what is needed
by decision-makers.

- Second, and most importantly, climate change uncertainty is significant, due to both the
inherent uncertainty of the earth’s climate system and the limitations of our understanding of
that system as represented in climate model projections.

An Example: Climate Models in West Africa


Given the likelihood of climate change in West Africa, a Ghanaian hydraulic engineer would be wise to ask climate modelers to predict precipitation rates for the next 100 years, instead of relying on historical data. But using a climate model might be dangerously misleading: projections of future precipitation changes in the region are very uncertain.

Figure 2: Change in annual rainfall in 2080-2100 (with respect to the 1980-2000 period) in Africa
according to two climate models (IPCC, 2007).

Figure 2 shows the change in annual rainfall in 2080-2100 (with respect to the 1980-2000 period) in Africa according to two climate models (IPCC, 2007). For Ghana, CCSM3 predicts a 20% increase in precipitation, while GFDL predicts a 30% decrease. Rather than building a water infrastructure project specifically for one scenario, the engineer would be well served to engage in a process that determines a diversity of scenarios over which the project should perform in a satisfactory manner.

In addition to climate-level uncertainties, most sectors also have to deal with the further difficulty of
predicting the responses of ecological systems to climatic changes. The responses of these systems are
often site-specific and non-linear, and we have very little in the way of models integrating climate and
ecological systems.

Who is affected?
Many decisions come with a long term commitment and can be very climate sensitive. Examples of
such decisions include urbanization plans, risk management strategies, infrastructure development for
water management or transportation, and building design and norms. These decisions have
consequences over periods of 50 to 200 years. Urbanization plans influence city structures over even
longer timescales. And infrastructure and urban plans influence the spatial distribution of activities
even beyond their lifetime (Gusdorf et al., 2008; Bleakley and Lin, 2010). These kinds of decisions and
investments are also vulnerable to changes in climate conditions and sea level rise.

For example, many buildings are expected to last up to one hundred years and will have to cope in 2100 with climate conditions that, according to most climate models, will be radically different from current ones. So, when designing a building, architects and engineers have to be aware of and account for the future changes that can be expected.

Table 1 proposes a list of sectors in which decisions should already take into account climate change,
because they involve long-term planning, long-lived investments and some irreversibility in choices,
and are exposed to changes in climate conditions.

Sector Time scale Exposure
Water infrastructures (e.g., dams, reservoirs) 30–200 yr +++
Land-use planning (e.g., in flood plain or coastal areas) >100 yr +++
Coastline and flood defences (e.g., dikes, sea walls) >50 yr +++
Building and housing (e.g., insulation, windows) 30–150 yr ++
Transportation infrastructure (e.g., port, bridges) 30–200 yr +
Urbanism (e.g., urban density, parks) >100yr +
Energy production (e.g., nuclear plant cooling system) 20–70 yr +
Table 1: Illustrative list of sectors with high inertia and high exposure to climate conditions (from
Hallegatte, 2009).

How are we affected?

When faced with such large degrees of uncertainty that our normal prediction-based decision-making
tools become inoperative or irrelevant, we may either seek to reduce the uncertainty, or develop
“coping strategies” to make decisions in these situations. This project is about the second option,
namely the development of different decision-making strategies.

2.2 Sources of uncertainty and downscaling

Climate change uncertainties arise from three major sources:

 Future emissions of greenhouse gases, which are linked to demographic and socio-economic
evolutions, to available technologies, to values and preferences (e.g., development models),
and to policies. This uncertainty is linked to scientific uncertainty (what futures are possible?),
but also to a policy uncertainty, which is a positive uncertainty that represents our ability to
choose our future.

 Scientific uncertainty, which is created by our imperfect knowledge of the functioning of the
climate system and of affected systems. It is for instance the uncertainty on the response of
the global mean temperature to a given quantity of greenhouse gases (GHG) (including
“Climate sensitivity”, i.e. the increase in global mean temperature for a doubling of the CO2
concentration in the atmosphere), but also the uncertainty in the regional effects of global
warming, and the uncertainty on the reaction of affected systems, such as lakes, glaciers and
ecosystems.

 Natural variability, i.e. the fact that global climate variables have their own dynamics, linked to the chaotic behavior of the climate system. Climate models provide information of a statistical nature (averages, variance, likelihood of exceeding thresholds, etc.), but they do not provide forecasts, i.e. deterministic predictions of the future. In other terms, they can estimate the average number of rainy days in the summers of the 2060s, but they do not say anything about any given day or even any specific summer.

These three uncertainties are sometimes referred to as policy, epistemic, and aleatory uncertainty, respectively. Their respective shares depend on the timescale and the spatial scale. At the global scale (Fig. 3, left panel) and over the short term, natural variability and the model response play the largest roles, and emissions a very small one; over the long term, emissions dominate the other sources of uncertainty.

Figure 3: Sources of uncertainty in projected temperature change: green is emission uncertainty, orange is natural variability, and blue is (climate) model uncertainty. Source: Hawkins and Sutton, BAMS, 2009.

It is thus critical not to over-interpret the difference between two climate scenarios run with different
emissions or different models. The difference might be caused by aleatory uncertainty, with no
significance. To interpret the difference between two scenarios in a rigorous manner, it is necessary to
use ensembles, i.e. a sufficiently large set of simulations run with the same model and the same
emission scenario. The spread of these simulations will represent the effect of natural variability as
simulated by the model, and only differences that are robust to this effect can be interpreted as the
effect of different emissions or of using different models.
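As a concrete illustration of this test, the sketch below (in Python, with entirely synthetic numbers standing in for ensemble output) checks whether the difference between two scenarios’ regional means clearly exceeds the spread generated by natural variability; the two-standard-deviation threshold is an illustrative choice, not a prescription from this paper.

    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic stand-ins for a regional-mean rainfall change (%) from two
    # 20-member ensembles run with the same model but different emission scenarios.
    ensemble_a = rng.normal(loc=5.0, scale=4.0, size=20)
    ensemble_b = rng.normal(loc=7.0, scale=4.0, size=20)

    # The ensemble spread approximates the natural variability simulated by the model.
    noise = np.sqrt(0.5 * (ensemble_a.std(ddof=1) ** 2 + ensemble_b.std(ddof=1) ** 2))
    signal = ensemble_b.mean() - ensemble_a.mean()

    # Treat the difference as an emission-driven signal only if it clearly exceeds the noise.
    print(f"signal = {signal:.1f}%, noise = {noise:.1f}%, robust: {abs(signal) > 2 * noise}")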

Also, it is critical to account for the fact that the spread across models does not represent the full uncertainty. All climate models use the same knowledge base and are based on the same basic methodologies (e.g., the discretization of ocean and atmosphere dynamics on a grid, the parametrization of many physical mechanisms such as convection). So it is very likely that all models share common biases, making the epistemic uncertainty larger than the differences across models. One example is the treatment of sea level rise uncertainty in the IPCC 2007 report: since all analyses misrepresented the role of ice dynamics in sea level rise, all estimates of sea level rise reported in the IPCC 2007 shared the same problem, and their range could not be considered a fair representation of the real uncertainty on future sea level. The same problem may exist in other domains, for instance in how climate models represent convection and monsoons in India or West Africa, making the model projections in these regions particularly uncertain. Even when all models agree (e.g., on a decrease in rainfall in the Mediterranean basin), there is still an uncertainty that needs to be considered.

At a regional (or continental) scale, the factors’ respective shares are different: natural variability is much more important at the regional than at the global scale, and climate model uncertainty is still large (see Fig. 3, right panel); emission uncertainty plays a more moderate role. This shows that when looking at one country or one region, it is much more difficult to predict future climates, regardless of future progress in our understanding of climate change: natural variability means that the climate change signal is more difficult to extract (and – as already mentioned – forecasts of future climate remain out of reach).

Also, there is a large uncertainty arising from differences between global climate models. The IPCC provides results from 19 global climate models. Even though the models agree on the very big picture (more warming at high latitudes than at low latitudes; more precipitation at high latitudes; less precipitation around the tropics; more precipitation around the equator), the differences can be huge in some regions (e.g., half of the models predict an increase in precipitation over India and half predict the opposite; as a consequence, the “average model” predicts no change, showing the risk of relying on an average model).

When looking at the local scale, we usually do not use global climate models directly. Instead, we use “downscaling techniques”, which can rely on statistical tools or on regional climate models (RCMs). Statistical methods use statistical relationships, calibrated on historical data, to relate large-scale drivers – which climate models can reproduce – to local phenomena – which climate models cannot reproduce (Elsner and Jagger, 2006; Mestre and Hallegatte, 2008; Nuissier et al., 2011). Even though our knowledge of the laws of physics helps select potential predictors, this method is not directly based on physical laws. Such statistical methods are computationally efficient and often have a good ability to reproduce the current climate. Statistical models, however, have two main drawbacks: first, they need long series of reliable data; second, even with a sufficiently large data record, it is difficult to know whether a statistical relationship will remain valid in a future climate.
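As a stylized illustration of the statistical approach, the sketch below (Python, with synthetic data) calibrates a simple linear relationship between a large-scale predictor and a local variable on a "historical" record and then applies it to a projected value of the predictor; the linear form, the predictor choice, and all numbers are assumptions made for the example, not results from the paper.

    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic "historical" record: a large-scale predictor that a GCM can
    # reproduce (e.g., a seasonal circulation index) and local station rainfall.
    predictor_hist = rng.normal(0.0, 1.0, size=50)
    rainfall_hist = 800.0 + 60.0 * predictor_hist + rng.normal(0.0, 25.0, size=50)

    # Calibrate the statistical relationship on the historical period.
    slope, intercept = np.polyfit(predictor_hist, rainfall_hist, deg=1)

    # Apply it to a GCM-projected value of the large-scale predictor.
    predictor_future = 1.5  # assumed projected anomaly
    print(f"downscaled rainfall estimate: {slope * predictor_future + intercept:.0f} mm")

    # Caveat, as noted above: nothing guarantees that this calibrated relationship
    # remains valid in a substantially different future climate.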

To avoid the problem of the validity of historical relationships, one may use physical models, which are based on physical laws. Physical models are of particular interest when investigating extremes and changes in variability. Of course, physical models often require calibration and bias correction, so that the distinction between physical models and statistical models is sometimes fuzzy. Examples are Regional Climate Models (RCMs); see Knutson et al. (2010). RCMs are demanding to develop and run and typically reach resolutions of about 25 km, but they are less data-dependent. Confidence in RCMs is generally higher because they do not have to assume that the relationship between large-scale and small-scale climate variables remains constant in a future – possibly very different – climate, an assumption that is questionable. It is thus considered that statistical analyses are more reliable over the short to medium term, while RCMs are necessary for large warming over the long term. Nonetheless, over the long term RCMs remain driven by the input from GCMs, and so they do not resolve the uncertainty – related, for example, to climate variability – that is produced by the GCMs.

In almost all cases, the use of these downscaling techniques improves the ability to reproduce the
current climate, but it does not mean that it improves the ability to project future changes.

In some cases, however, downscaling makes it possible to represent additional mechanisms and is likely to improve the ability to represent both the current and the future climate (for instance, downscaling techniques are required to look at precipitation in coastal or mountainous areas, or at small-scale phenomena like heavy precipitation). But this is only true when the regional climate models are able to demonstrate the ability to reproduce the historical climate feature of interest without elaborate bias correction.

But in all cases, downscaling cannot reduce the uncertainty when global climate models disagree (as in West Africa and the monsoon): it will refine a precipitation-increasing scenario and a precipitation-decreasing scenario, but the difference between the two will not be affected.

When downscaling is considered useful, we should rely on existing data and results whenever possible. Many research-oriented downscaling projects exist in the scientific community, and it is often not necessary to fund downscaling for planning or project purposes. If the question is how extreme precipitation will change, for instance, a climatologist will often be able to provide a review of existing knowledge in a few days, where an expensive downscaling exercise would take months. It is thus more useful to build capacity in client countries (e.g., through the funding of research or training programs like the AMMA project1) than to finance downscaling exercises carried out by developed-country consultants. Also, the multiplication of “operational” downscaling exercises consumes resources that could be used to improve our understanding of the climate system and research on downscaling methodologies. Decision-makers in need of downscaled information should build on existing projects instead of creating new ones.

Capacity building in this field would be more efficient through the creation of local expertise centers, financed as research and development institutions.

All this uncertainty does not mean that climate projections are useless. In many cases, climate model information provides insights into what changes can be expected. In Europe, most climate models project an increase in summer temperature variability, so we need to look at heat wave vulnerability; in North Africa, all models find a decrease in precipitation, so we need to look at water availability and irrigation; in India, many (but not all) models find a large increase in heavy monsoon precipitation, so we need to look at floods; and so on. These model results help prepare for future climates, even though they do not provide the information we would like to have in an ideal world.

One important recommendation, however, is to always complement model results with expert knowledge.

When testing project robustness, it is also advisable to look outside the range of model results: since all models are based on the same knowledge, they may share the same error, and the future may lie outside the range projected by the models.

1 AMMA is the African Monsoon Multidisciplinary Analysis, a research project financed by many developed countries (EU, France, US, Japan, etc.) that includes research on climate science, agronomy, and social sciences in West Africa. Interestingly, it helped finance the development of research groups in Africa, the creation of new university programs in African universities, and regular summer schools. Beyond its direct results, this project thus has a significant capacity-building legacy. See Redelsperger 2006 or www.amma-international.org.

2.3 Attempts to quantify uncertainty

Though uncertainty is often defined as risk that cannot be quantified (Knight 1921), most techniques
to handle uncertainty involve attempts to quantify it in one way or another. Deep uncertainties
challenge these methods.

Error Bars
It would be extremely useful to provide “error bars” on climate projections. Though some innovative methods have been proposed (e.g., Allen and Stainforth, 2002; Forest et al., 2002; Tebaldi et al., 2004; and others), there is no consensus on their value for operational decision-making.

Moreover, these “error bars” should be produced not only from scientific information – they also depend on political choices and subjective judgments. For instance, one might be more pessimistic when considering a flood barrier for a 10-million-inhabitant city than when considering the location of a train line, because the consequences of failure are much larger in the former case. It is thus impossible to quantify climate uncertainty independently of the decision to be made, and of subjective judgments that only decision-makers can make (IPCC, 2012).

Assigning Probabilities to Models


Some have proposed to use the various models to assess the uncertainty: if 50% of the models predict
an increase in hurricane intensity then we could attribute a likelihood of 50% to such an increase. This
method of deriving probabilities is simplistic. There is no reason why existing models should represent
the real uncertainty: they may all have the same flaw, since they are all based on the same incomplete
and imperfect knowledge.

Also, the ability of a model to reproduce past and present climates is an important criterion to assess its value. But the fact that one model is better than another at reproducing the present climate does not mean that it is automatically better at projecting future climate change. There is no consensus on how to rank models according to their “quality.”

Calculating probabilities of future climate changes based on projections from climate models remains
an active research topic in the science community because of their perceived utility for decision-
making. However, due to the inability to assess the models’ skill in correctly predicting those
probabilities, the resulting probabilities are best viewed as subjective or expert judgment. In fact, it
may be entirely appropriate to use expert judgment as the basis for subjective probabilities that enter
into decision analysis. In many cases this inevitably occurs in regard to variables that are not climate
related (e.g., projected population growth, income levels, discount rates).

Long-term forecasting
It would be extremely useful to be able to forecast climate change over the next two decades in order
to implement adaptation plans. However, climate change is relatively limited at these time scales, and
climate variability is such that the climate change signal is not dominant (see Figure 3). Consequently,
climate models, which only reproduce natural variability at the statistical level, are incapable of
predicting changes in the near future. It is therefore essential not to over-interpret the results of
these models over the short-term, and not to use their output as forecasts, without taking into
account natural variability.

The inability of models to predict climate changes in the next two decades could change if work on ten-year forecasts – a focus of research today – progresses. This would nevertheless require considerable strides in numerical modeling and better knowledge of the ocean conditions that determine climate change on these time scales. Improved knowledge requires more developed measurement networks in the world's oceans.

2.4 Robustness and optimality

Traditional decision-making processes seek optimality.


Traditional decision-making processes work through the prediction of a future state, and the design of
plans or projects for the conditions of that state. This approach produces optimal results for the
intended future, but its application may be increasingly limited as we are faced with larger
uncertainties.

Predict → Act

When addressing quantifiable uncertainty – i.e. what is commonly referred to as “risk” – this method can be extended to consider multiple states, each characterized by a probability of occurrence. These probabilities are sometimes determined by a frequency-based method (how often did the event E occur in the past?), or by a belief-based analysis such as Bayesian analysis (what are the odds of the event E? How much do I trust my model?).

But as uncertainty gets more profound, we become less able to characterize its distribution (i.e. the
probability of occurrence), and lose corresponding confidence in our predictions. In such a situation,
the optimal solution may be designed for a world whose existence is uncertain, and may perform
poorly for other plausible, yet unanalyzed, worlds. The method is particularly dangerous because of
the well-identified tendency toward over-confidence in our ability to predict the future (Slovic et al.,
1981).

Accepting uncertainty mandates a focus on robustness.


A robust decision process implies the selection of a project or plan which meets its intended goals –
e.g., increase access to safe water, reduce floods, upgrade slums, or many others – across a variety of
plausible futures. As such, we first look at the vulnerabilities of a plan (or set of possible plans) to a
field of possible variables. We then identify a set of plausible futures, incorporating sets of the
variables examined, and evaluate the performance of each plan under each future. Finally, we can
identify which plans are robust to the futures deemed likely or otherwise important to consider.

As robust processes imply an acceptance of uncertainty, they also demand a process of dialogue to
determine which project vulnerabilities to consider, which performance metrics suggest success,
acceptable levels of risk, and which possible scenarios to evaluate. The stakeholder process is an
opportunity to further fortify the project against uncertainty, as a variety of viewpoints and concerns
can simultaneously be addressed in distinct scenarios. Incorporation of multiple scenarios builds
consensus on the outputs (the project) despite differing inputs (world-views, priorities, and desires).

Robust decision making processes – in line with the long tradition of adaptive management theory – are iterative and adaptive by nature: one round of analysis may suggest analysis of alternative project directions, or combinations of available policies. Application to climate change adaptation has already been proposed; see for instance Ranger et al. (2010) and a review in O’Brien et al. (2012). Since robust
processes map possible investment strategies to the climate that best favors them, integrating new
information may allow decision makers to switch from one strategy to another. Thus, action and
learning are carried out in parallel, and inform each other. In other terms, “waiting for more
information” is never an option since information has to be created with experimentation, monitoring,
and analysis; if information is not sufficient to make the investment decision, then an “information
creation” plan is required.

Learn → Act → Revise → Learn (iterative cycle)

One critique of robust approaches is their apparent “pessimism” and potential sensitivity to worst-case scenarios. Unfortunately, rather than being an artifact of the methodology, this is the reality of climate change adaptation (IPCC 2012). Given deep uncertainty about future climate change and
relevant socio-economic trends, the choice among strategies may be unavoidably sensitive to poorly
understood worst-cases. Robust processes deal with this challenge through stakeholder participation
and exchange with experts. A negotiated, participatory process helps produce a chosen strategy,
which may or may not be expected to perform well in all the worst-case scenarios identified by the
analysis. However, a successful process for identifying robust strategies will help participants to
consider a wide range of options, and to understand the range of future conditions over which their
chosen strategy should be expected to perform well, as well as the residual risks that they have
chosen to accept.

We see both robust and optimal techniques as necessary elements in a decision-making process
involving deep uncertainties. While analyses focused on optimality are vulnerable to overconfidence
bias, they are in general simpler to conduct and may provide useful starting points for robust analyses. In contrast, robust approaches will often require more analysis, but can lead to better
understanding of risks and to strategies that manage a broader range of risks. In some cases, optimal
techniques may be sufficient, while in other cases robust approaches will offer significant value-
added. While expanding the range of risks considered may in some situations seem detrimental to
successful resolution of a decision process, purposefully ignoring important uncertainties may at least
as often prove detrimental to successful outcomes. Managed risk-taking is an essential part of
development, and inseparable from innovation.

2.5 An example of optimality and robustness analysis

Imagine a heavily forested catchment area where the government aims at regulating downstream
floods and providing irrigation for local farmers. Standard analysis based on historical rainfall and
runoff might predict a certain set of hydrological and meteorological conditions, scenario “A”, for the
next twenty years. Different scenarios are possible, of course, but their likelihood is either a) ignored,
or b) assigned ad-hoc probabilities for the sake of analysis.

Land management investment opportunities include building one or more dams, setting up canals and
pricing schemes for irrigation, and a forest management plan to control erosion and runoff from
timber harvesting.

A group of experts comes up with plans X, Y, and Z, reflecting different combinations and parameters
of the above options:

X One medium-size dam, some canals and contracts for irrigation water, with no forest
management component.
Y Two smaller dams, some canals and contracts for irrigation water, and a small forest
management component.
Z One small dam, large scale community-driven earthworks and irrigation ponds and a
large forest management component.
Table 2: Three options to regulate floods and provide irrigation for local farmers.

Taking climate uncertainties into account as well as possible shifts in timber management of private
landowners in the catchment area, responding to uncertain market conditions and government timber
policies, we may end up looking at the following scenarios, for the sake of illustration:

A Climate and land management along predictable lines from the past
B Heavier rainfall and increased demand for timber
C Lower rainfall and no change in timber demand
D Lower rainfall and increased demand for afforestation due to REDD effects
Table 3: Four scenarios for future climate and timber demand.

Analyzing the vulnerability of each of our plans to the different plausible futures gives us the following
matrix:

Scenario (see Table 3):    A        B        C        D
Plan X                    ++++     --       ++       +
Plan Y                    +++      +        +++      ++
Plan Z                    ++       ++       ++       +++

Table 4: Assessment of the performance of the three options (Table 2) in the four scenarios (Table 3).

One first approach to decision-making would be to invest in research and investigation to determine which one of the four possible futures is the most likely, and then to select the option that performs best in that future. Most practitioners report that this is the demand they get from decision-makers, who want to know the best prediction for the future in order to select the best option for it. And if uncertainty were limited – i.e. if our knowledge base made it possible to produce forecasts of the future – this approach would be appropriate. One could for instance conclude that scenario A is the most likely, and that option X is thus the most appropriate. This approach amounts to disregarding all scenarios except the most likely one, and to using a predict-then-act approach.

When uncertainty is larger, this approach does not work, because it is impossible to determine which scenario is the most likely, or because several scenarios are equally plausible. In such a situation, one option is to attribute probabilities to the different scenarios, and to use a cost-benefit analysis under uncertainty to determine the “best” strategy. This method accepts the existence of uncertainty, and deals with it using probabilities, which can be determined through various techniques.

A robust approach, on the other hand, might first prioritize a vulnerability analysis of project options,
as each plan is vulnerable to changes in climate, as well as to shifting demand and practices in the
agriculture and forestry sectors. In such an approach, we do not start by investigating the scenarios’ plausibility or probability, but by investigating the vulnerability of various strategies in different
scenarios.

Indeed, plan X could lead to dangerously negative outcomes in the excess-rain scenario (with heavy erosion, siltation, and flooding) and would suffer from excess capacity in scenarios C and D. The analysis thus identifies heavier rainfall as the primary vulnerability. Those concerned about worst-case plausible scenarios will eliminate X as an option, and decide between Y and Z, for example on the basis of relative estimated probabilities of the scenarios and further expert input. Additionally, more plans could be constructed from the original policy elements upon further sensitivity analysis and cost-benefit breakdown of each element.

In addition, the analysis shows the potential for integration with the types of flexible strategy that are the focus of real options techniques – it is easier to build a smaller dam first, with the capability to expand it later, than to build a larger dam of potentially needless size. Other, more flexible elements (such as the forest management policy) can be adapted to each “optimal” case depending on new information on climate and demand, once the initial robustness matrix has been generated.
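To make this reasoning concrete, the sketch below (in Python) translates the qualitative scores of Table 4 into illustrative numbers – the mapping from plus and minus symbols to integers is our own assumption – and computes two common robustness criteria: worst-case performance (maximin) and minimax regret.

    # Illustrative numeric translation of Table 4 (++++ = 4, -- = -2, etc.).
    scores = {
        "X": {"A": 4, "B": -2, "C": 2, "D": 1},
        "Y": {"A": 3, "B": 1, "C": 3, "D": 2},
        "Z": {"A": 2, "B": 2, "C": 2, "D": 3},
    }
    scenarios = ["A", "B", "C", "D"]

    # Maximin: prefer the plan whose worst-case score is highest.
    worst_case = {plan: min(s.values()) for plan, s in scores.items()}
    maximin_plan = max(worst_case, key=worst_case.get)

    # Minimax regret: regret = best achievable score in a scenario minus the plan's score.
    best_per_scenario = {c: max(scores[p][c] for p in scores) for c in scenarios}
    max_regret = {plan: max(best_per_scenario[c] - s[c] for c in scenarios)
                  for plan, s in scores.items()}
    minimax_regret_plan = min(max_regret, key=max_regret.get)

    print(worst_case, maximin_plan)          # X has the worst worst-case (scenario B)
    print(max_regret, minimax_regret_plan)   # X also has the largest maximum regret

Under these assumed scores, plan X is eliminated by both criteria because of its poor performance in the heavier-rainfall scenario, while Y and Z remain in play, which is consistent with the qualitative discussion above.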

2.6 Co-benefits of robustness

While shifting to more robust decision processes is primarily a technique to make better decisions under deep uncertainty, by identifying project options' sensitivities to uncertainties, it also provides a series of co-benefits: advantages over a traditional decision-making process focused on optimality. These co-benefits may help offset the added time, cost, and learning curve of the techniques involved.

a. Avoid excessive discounting


Many climate change adaptation projects, or projects that otherwise attempt to avoid large but unlikely events and costs over the longer term, will be selected against by the traditional discounting methods employed in a CBA. While alternative discounting theories exist and are argued for2, they have not gained wide acceptance in World Bank policy3. Robustness frameworks provide an opportunity to avoid this negative effect of discounting by selecting for high-return projects within a subset of project options that are demonstrably robust over the long term. As such, we can account for unquantifiable risk not captured in the discounting process.
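To illustrate the scale of this effect with a simple, assumed example, consider a benefit of $1 billion accruing 100 years from now; at the two discount rates used in the New Orleans example below, its present value is

\[
PV = \frac{\$1\ \text{billion}}{(1+\delta)^{100}} \approx
\begin{cases}
\$52\ \text{million}, & \delta = 3\% \\
\$1.2\ \text{million}, & \delta = 7\%.
\end{cases}
\]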

b. Reduce vulnerability to over-confidence and surprises


Robust analyses typically utilize specialized local knowledge through stakeholder involvement to build
a cloud of possible vulnerabilities much broader than project planners could conceive alone. These
vulnerabilities are then examined to help confront our project with challenges experts may not have
discovered. The explicit integration of low-probability events thus allows for more comprehensive
vulnerability analysis and stronger projects in the end.

c. More complex definition of success & multiple worldviews


Because optimality approaches focus on optimizing for one variable (net present value or NPV), they
can miss the valuation of secondary project effects. For example, a public transport system aimed at
reducing commute time or greenhouse gas emissions may also support larger urban planning goals, an
unmeasured secondary benefit. In the same example, by providing investment access to different areas of the city, the project prevents resource and growth “lock-in”, a facet of a larger strategic goal.

2 For those interested in pursuing the subject further, the authors recommend Ramsey 1928, Harvey 1994, Arrow 1996, Portney and Weyant 1999, Weitzman 2001, Gollier 2002, OXERA 2002, U.K. Treasury 2003, Heal 2005, and Stern 2006.
3 World Bank Operational Policy 10.04, for example, mandates calculating the “discounted expected present value of its benefits, net of costs” (WB 1994). A recent, and thorough, review of World Bank cost-benefit analyses made no mention of alternative discounting calculations (WB 2010). In practice, the discounting scheme gives very low weight to long-term impacts, practically disregarding any climate change impact.

Rather than disregarding what they cannot predict or measure, robust approaches evaluate a variety
of project options, across different uncertainties, incorporating multiple axes of success.

Robust approaches can also incorporate multiple views of the future, allowing parties with different world-views and priorities to engage in the same process and making it possible to agree on a “best” solution – something that is impossible with an NPV criterion when there are conflicting ideas of success.

2.7. Robust approaches


One can identify options and measures that are most adapted to the current situation of deep
uncertainty. Some of them are listed in the following section, with a few illustrations. This list does
not pretend to be exhaustive, but suggests ideas for more robust strategies.

No-regret strategies
“No-regret” measures constitute a first category of strategies that are able to cope with climate
uncertainty. These strategies yield benefits even if forecasts turn out to be wrong. For example, controlling
leakages in water pipes is almost always considered a very good investment from a cost-benefit
analysis point-of-view, regardless of how climate changes. Land-use policies that aim at limiting
urbanization and development in certain flood-prone areas (e.g., coastal zones in Louisiana or Florida)
would reduce disaster losses in the present climate, and climate change may only make them more
desirable. Also, in many locations, especially coastal cities, building sea walls would be economically
justified by storm surge risks with the current sea level, and sea level rise would only make these walls
more socially beneficial. The identification of sub-optimalities in the current situation may help
identify adaptation options that are beneficial over the short term (and easier to implement from a
political point of view) and efficient to reduce long-term climate vulnerability.

Reversible and flexible strategies


Second, it is wise to favor strategies that are reversible and flexible over irreversible choices. The aim
is to keep as low as possible the cost of being wrong about future climate change. Among these
examples, one can mention insurance and early warning systems that can be adjusted every year in
response to the arrival of new information on risks. Another example is restrictive urban planning.
When deciding whether to allow the urbanization of an area potentially at risk of flooding if climate
change increases river runoff, the decision-maker must be aware of the fact that one answer is
reversible while the other is not. Refusing to urbanize, indeed, has a well known short-term cost, but
if new information shows in the future that the area is safe, urbanization can be allowed virtually
overnight. This option, therefore, is highly reversible, even though it is not costless since it may
prevent profitable investments from being realized. Allowing urbanization now, on the other hand,
yields short-term benefits, but if the area is found dangerous in the future, the choice will be between
retreat and protection, both of which may be difficult and expensive. Of course, it does not mean that
urbanization should always be rejected. It only means that, in the decision-making process, the value
of the reversibility of a strategy, often referred to as the “option value” (see below), should be taken
into account.

Safety-margin strategies
Third, there are “safety margin” strategies that reduce vulnerability at negative, null, or negligible
cost. There are already practical applications today. For instance, to calibrate drainage infrastructure,
water managers in Copenhagen now use runoff figures that are 70 percent larger than their current
level. Some of this increase is meant to deal with population growth and the rest is to cope with
climate change, which may lead to an increase in heavy precipitation over Denmark. This 70 percent
increase has not been precisely calibrated, because such a calibration is made impossible by climate
change uncertainty. But this increase is thought to be large enough to cope with almost any possible
climate change during this century, considering the information provided by all climate models. This
move is justified by the fact that, in the design phase, it is inexpensive to implement a drainage
system able to cope with increased precipitation. On the other hand, modifying the system after it has
been built is difficult and expensive. It is wise, therefore, to be over-pessimistic in the design phase.

The existence of cheap safety margins is especially important for adaptation measures that are not
reversible or flexible. The options that are irreversible (e.g., retreat from coastal areas) and in which
no cheap safety margins are available are particularly inadequate in the current context. The options
that are irreversible but in which safety margins can be introduced (e.g., coastal defenses or
improvement of urban water-management infrastructures) can be implemented, but only with careful consideration of future climate change scenarios.

Strategies that reduce decision-making time horizons


The uncertainty regarding future climate conditions increases rapidly with time. Reducing the lifetime
of investments, therefore, is an option to reduce uncertainty and corresponding costs. This strategy
has already been implemented in the forestry sector by choosing species that have a shorter rotation
time. Since species choice cannot be made reversible and no safety margins are available in this
sector, this option is interesting in spite of its cost. In other sectors, it is also often possible to avoid
long-term commitment and choose shorter-lived decisions. For example, if houses will be built in an
area that may become at risk of flooding if precipitation increases, it may be rational to build cheaper
houses with a shorter lifetime instead of high-quality houses meant to last one hundred years.

3. METHODOLOGIES
This section reviews some of the methodologies that have been proposed to cope with deep uncertainty in investment decisions. These include cost-benefit analysis, real options analysis, climate informed decision analysis, and robust decision making. While we necessarily present our list as four distinct methodologies, in practice they can overlap. The latter two methods are both “context-first” (Ranger et al. 2010) robust decision approaches that differ mostly in the particular analytic tools they employ and in their relative emphasis on climate versus the combination of climate and socio-economic uncertainties. The first two have routinely been incorporated within robust decision analyses as a means of valuing alternative strategies. The difference is that, in their pure form, cost-benefit and real options analyses yield a single value measure for each strategy. In robust analyses, the cost-benefit and real option value of a strategy may vary over a wide range of future conditions.

3.1 CBA and CBA under uncertainty (e.g., Arrow et al., 1996)

The cost-benefit analysis approach (private as well as public) involves the following steps: (i) identify
competing projects; (ii) identify sources of uncertainty and future possible states of the world; (iii)
evaluate the costs and benefits for each project; (iv) calculate the present value of costs and benefits;
(v) calculate the net present value of different competing projects; and (vi) evaluate the robustness of
the result.

For instance, assessing a protection of the city of New Orleans against Category 5 hurricanes can be done by comparing the cost of building and maintaining such a protection (C) with the benefits (B), which can be estimated as a discounted sum of the avoided losses over the lifetime of the protection (from Hallegatte 2006):

B = \sum_{n=0}^{T} p_n \left( \frac{1}{1+\delta} \right)^n d_n

where p_n is the annual probability of a flood caused by a Category 5 hurricane in year n, d_n the corresponding avoided losses, \delta the discount rate, and T the lifetime of the protection.

A first CBA can be done using “first-guess” parameters:


 A construction cost of $20 billion;
 an annual probability that a Category 5 hurricane hits New Orleans of about p = 1/500;
 discount rates of 3 and 7 percent (following US regulations);
 an estimate of the direct cost of the New Orleans flooding at around $20 billion;
 a cost of human losses of $5 billion.

In that case, the expected present benefit of a Category 5 flood protection system in New Orleans can be calculated at $1.3 billion with a 7 percent discount rate and $6 billion with a 3 percent discount rate. Since both figures are well below the $20 billion construction cost, this rough estimate clearly rules out an upgrade of the protection system to make it able to cope with Category 5 storms.
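A minimal sketch of this first-guess calculation is given below (Python). The 100-year horizon and the 3 percent annual growth of avoided losses are assumptions added here to roughly reproduce the orders of magnitude quoted above; the exact figures in the text rest on assumptions from Hallegatte (2006) that are not restated in this paper.

    def expected_benefit(p, d0, delta, g, T):
        """Discounted expected avoided losses, summed over the protection lifetime."""
        return sum(p * d0 * (1 + g) ** n / (1 + delta) ** n for n in range(T + 1))

    p = 1 / 500          # annual probability of a Category 5 flood
    d0 = 20.0 + 5.0      # avoided losses: direct costs plus human losses, in $ billion
    g, T = 0.03, 100     # assumed growth of avoided losses and protection lifetime

    print(expected_benefit(p, d0, delta=0.07, g=g, T=T))  # roughly $1.3 billion
    print(expected_benefit(p, d0, delta=0.03, g=g, T=T))  # roughly $5 billion (text: $6 billion)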

Let us now look at the uncertainty on each of these parameters:


 Choice of the discount rate. As illustrated by our comparison of 3 and 7 percent
discount rates, the influence of this political choice is large. Since this choice is very
controversial and depends on ethical judgment on which it is difficult to reach
consensus, the assessment of the system benefit will remain very uncertain.
 Probability of occurrence. If we assume that climate change and subsidence may
multiply by 5 the probability of the floods currently caused by Category 5 hurricanes
over the 21st century (this is within the bounds of current estimates), then expected
benefits from protection against Category 5 hurricanes would rise from $1.3 to $2.4
billion or from $6 to $23 billion, depending on the discount rate (7 percent and
3 percent, respectively).
 The flood costs. Flood costs evaluated by insurance companies are poor proxies of
welfare costs, especially concerning large-scale events. A conservative estimate of the
actual overall cost of the New Orleans floods is at least two times the insurers’
approximation based on direct losses only; that is, $60 billion. Using the new values of
event probability and potential damages, the expected benefit of an upgraded
protection system would be $4.8 billion with a 7 percent discount rate and $46 billion
with a 3 percent discount rate.

 Countervailing risks and side effects. The implementation of a large-scale protection system can attract more people to at-risk locations, and increase vulnerability in case of defense failure (Hallegatte 2011). But it can also attract more activities, improve infrastructure, create jobs and income, and thus improve welfare more than an analysis of direct costs alone would suggest. These effects are very difficult to estimate and can easily double (or halve) expected benefits, making them range between $0.6 billion and $92 billion.
 Risk aversion: A society that would use the previous method to assess a protection
system is called “risk-neutral.” A risk-neutral agent is indifferent to risk; i.e., it does not
see any difference between losing $1 with certainty and having a 10 percent chance of
losing $10, because the expected loss is the same in both cases. Including an aversion
to risk increases the benefit from protection.
 Heterogeneity of damages: Benefits are also larger if one takes into account the fact
that only a fraction of the population is affected by floods (and some of the losses
cannot be shared through solidarity mechanisms) and that the affected population is
often already poorer than average (see also Harberger, 1974; Harberger, 1986). Indeed,
it is not equivalent for a group of 10 people either to lose $1 each, or to know that one
of them will lose $10. Taking this into account – by using the sum of individual utility
functions instead of a single social utility function – can increase benefits by 50% (and
much more if basic needs are included in the analysis), making them reach more than
$140 billion.

Results from the CBA thus appear extremely dependent on parameters on which there is no scientific agreement (e.g., the impact of climate change on hurricanes) or no consensus (e.g., the discount rate).

Cost-Benefit Analysis under uncertainty


In such a situation of uncertainty, it is sometimes possible to attribute “subjective probabilities”, i.e. beliefs about the likelihood of different possible “states of the world”, and to evaluate the expected benefits as the probability-weighted average of the benefits in the different possible states of the world.

If we assume that there is a P = 1/3 probability that the Category 5 probability stays constant, and a (1-P) = 2/3 probability that it increases to 1-in-100 years by 2100, then the expected benefit can be written:

B = P \sum_{n=0}^{T} p_n \left( \frac{1}{1+\delta} \right)^n d_0 (1+g)^n + (1-P) \sum_{n=0}^{T} p'_n \left( \frac{1}{1+\delta} \right)^n d_0 (1+g)^n

where p_n stays constant (a “state of the world” in which climate change has no influence on
hurricanes) and p'_n increases up to 1-out-of-100-years (a “state of the world” in which climate change
increases the likelihood of Category 5 hurricane landfall). This method is widely used in situations of
quantifiable uncertainty. The problem is that – for climate change – we do not have a strong
methodology to assess these subjective probabilities. They cannot be fully based on the past, because
climate change is a new process for which we have no equivalent in the past. Models share common
flaws and their dispersion cannot be used to assess the real uncertainty.
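The probability-weighted formula above can be illustrated with a short numerical sketch. All inputs below (the initial landfall probability, avoided damage, growth rate, and discount rate) are hypothetical placeholders rather than the values of the New Orleans analysis; only the 1-out-of-100-years end point follows the example in the text.

```python
# Numerical sketch of the expected benefit B above. All values are hypothetical
# placeholders; only the "rises to 1-out-of-100-years" end point follows the text.

T = 100                 # planning horizon (years)
d0 = 20e9               # avoided damage from a Category 5 landfall today ($, hypothetical)
g = 0.02                # growth rate of exposed assets (hypothetical)
delta = 0.04            # discount rate (hypothetical)
P = 1.0 / 3.0           # subjective probability that the landfall probability stays constant

p_constant = [1.0 / 500.0 for _ in range(T + 1)]             # state of the world 1 (hypothetical level)
p_increasing = [1.0 / 500.0 + (1.0 / 100.0 - 1.0 / 500.0) * n / T
                for n in range(T + 1)]                        # state of the world 2

def discounted_benefit(p_series):
    return sum(p_series[n] * d0 * (1 + g) ** n / (1 + delta) ** n
               for n in range(T + 1))

B = P * discounted_benefit(p_constant) + (1 - P) * discounted_benefit(p_increasing)
print(f"Expected benefit: ${B / 1e9:.1f} billion")
```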

Also, this method does not help when there is disagreement on ethical judgments and world views: in
the absence of a consensus on the objective (and on an indicator measuring success), CBA is extremely
difficult to use.

Suggestions

These results suggest that a CBA is useful but should encompass the whole set of possible
assumptions to check its robustness. In situations with limited uncertainty, the CBA (and the robustness
analysis) can be helpful to identify the best investment opportunities. The situation is different in the
presence of large uncertainties. The example of New Orleans and hurricane protection shows that CBA
can reach very different results for reasonable parameter values.

Here, benefits from protection range from $0.6 billion to $140 billion depending on scientific
assumptions (e.g., the impact of climate change on hurricanes) and on ethical judgments and world views
(e.g., the aversion to risk and inequality). This situation is common, and the CBA can rarely be used to
make a decision in an objective way.

Moreover, when using probability distribution functions for different outcomes, the CBA is extremely
sensitive to the tails of these distributions. As suggested by Weitzman (2009), for instance, assuming
that climate change damages have a heavy tail implies that all GHG emissions should be stopped
immediately.

CBA, however, can be extremely useful to collect information (and stakeholder opinions) on the
consequences of a project, and to help organize the debate, by linking the different opinions of
various groups on what should be done to their different opinions about the parameters of the analysis
(e.g., the discount rate, or the amount of avoidable losses).

In a situation of deep uncertainty, CBA should therefore be understood as a complement and a tool to
open consultations and discussions, not as a replacement for them.

3.2 Methodology: Real options4


In a context of increasing knowledge – and thus decreasing uncertainty – the decision on an
investment project is not between “investing” and “not investing”, but between “investing now” and
“investing later with more information.” To help make this type of decision, some have proposed to
mobilize the “real option” approach, which was initially developed for financial markets (Arrow and
Fisher 1974, Henry 1974, Ha-Duong 1998, Pindyck 2002, Gollier and Treich 2003, Dotsis et al. 2005).

A real option (RO) is the right, but not the obligation, to undertake a project of uncertain future
benefits at a known cost. Applied to decision-making, RO analysis values the options created and destroyed by
a project, alongside its expected net present value.

4
This section partly relies on inputs from Pascale Scuandizzo.
The analysis itself does not differ from a classical cost-benefit analysis, except that the NPV includes
an additional consideration, namely the options created and destroyed by the project. The project's
Extended NPV is calculated as follows:

ENPV = Expected Net Present Value + (Value of Options Created – Value of Options Destroyed)

Thus an investment which has positive net benefits (exploiting existing capabilities) but fails to create
new options may be less desirable than an investment with fewer direct benefits but which results in
increased options (the ability to explore different opportunities). In other terms, there is a value in
implementing a project that does not provide any benefit per se, but makes it possible to implement
another project at a later point in time.
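A toy comparison may help fix ideas. The figures are purely illustrative, and the option values are assumed to be given; in practice they would come from the two-period analysis described below.

```python
# Toy comparison (hypothetical figures): the extended NPV can reverse the ranking
# of two investments once option values are counted.

def extended_npv(npv, options_created=0.0, options_destroyed=0.0):
    return npv + options_created - options_destroyed

enpv_exploit = extended_npv(npv=10.0)                      # high direct benefits, no new options
enpv_explore = extended_npv(npv=4.0, options_created=8.0)  # lower direct benefits, new options

print(enpv_explore > enpv_exploit)  # True: the options created reverse the ranking
```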

A common example is for a government to buy a plot of land to prevent its development by private actors,
not to use it immediately but to preserve the possibility of using it at a later stage (which would have been
made impossible in case of private development). Buying the plot has a cost and creates no direct
benefits. Its assessment may thus appear negative. But doing so makes it possible for the
government to use the plot at a later stage (i.e., it creates the option of using it later), which is not the
case if the plot is not bought.

Another useful example concerns climate change mitigation (Ha-Duong, 1998). The project of
investing in R&D on renewable energy may have a negative NPV if considered in isolation from other
policies. But doing so may create new technologies that will be absolutely necessary in 20 years, when
very ambitious climate policies are implemented. So an investment in renewable energy R&D is
desirable only because it will create the option of rapid decarbonization in the future, not because of
its own return.

In practice, this approach leads to two foci: “timeliness” and “flexibility”.

The “timeliness” focus, though typically seen as concerning the ability to postpone all or part of an
irreversible investment decision until more information is available, can also apply to not postponing
immediate action when, for example, severe pollution presents a threat of irreversible species
extinction.

The “flexibility” focus evaluates all types of options that the project can create as a consequence of its
design, such as the options to exit, to suspend, to contract or to expand activity. In the case of climate
change, the approach may be particularly useful to evaluate adaptation options, such as the options
to cope, to rebound, to abandon, to retreat and to flexibly adjust along one or several dimensions.

Method Details

The implementation of the real option method is similar to a CBA, but it requires considering decision-
making over at least two periods. The analyzed project (labeled project A, e.g., R&D on renewable
energy) can be implemented during the first period. Then, a set of projects (B, C, D, E) can be
implemented during the second period, and their costs and benefits depend on the first-period
decision, resulting in a decision tree.

[Decision tree] From the initial situation, project A is either implemented in the first period (with NPV1A) or not implemented.
 If A is implemented, the second-period options are: implement project B (NPV1A,2B), implement project C (NPV1A,2C), or implement no project.
 If A is not implemented, the second-period options are: implement project D (NPV10,2D), implement project E (NPV10,2E), or implement no project.

The optimal choice made during the second period is determined by the choice made in the first
period5:
 Assuming that NPV1A,2B > NPV1A,2C > 0, then implementing project A during the first period
will lead to the implementation of project B during the second period.
 Assuming that NPV10,2D > NPV10,2E > 0, then not implementing project A during the first
period will lead to the implementation of project D during the second period.

The difference (NPV1A,2B - NPV10,2D) is the real option value created by project A in the first period:
the implementation of project A allows for the implementation of project B and its benefits. The total
value of project A is thus: NPV1A + (NPV1A,2B - NPV10,2D).

A special case arises when project A is identical to project B.

5
Note that the NPVs of projects implemented in the second period are discounted to the present, i.e. to the first period.
[Decision tree] From the initial situation, either project A is implemented in the first period (with NPV1A), in which case no project is implemented in the second period, or project A is not implemented in the first period, in which case the second-period options are: implement project A (NPV10,2A) or implement no project.

In that case, the value of project A implemented during the second period (NPV10,2A) can be larger
than the value of the same project during the first period (NPV1A), because the uncertainty has been
reduced and the project design can be improved. In that case, the value of implementing project A during the first
period (rather than waiting) is: NPV1A – NPV10,2A. If NPV1A < NPV10,2A, then it is better to wait until period 2 to implement
project A.
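The two-period logic above can be sketched as follows. The NPVs are hypothetical and assumed to be already discounted to the first period (footnote 5); the sketch simply reproduces the decision-tree arithmetic, not any particular application.

```python
# Sketch of the two-period valuation. All NPVs are hypothetical and already
# discounted to the first period, as in footnote 5.

npv_1A = -2.0   # direct NPV of implementing project A in period 1 (negative on its own)

# Second-period NPVs, conditional on the first-period choice.
npv_if_A_done = {"B": 12.0, "C": 7.0, "none": 0.0}      # A implemented in period 1
npv_if_A_not_done = {"D": 9.0, "E": 5.0, "none": 0.0}   # A not implemented in period 1

# Optimal second-period choice in each branch of the decision tree.
best_with_A = max(npv_if_A_done.values())
best_without_A = max(npv_if_A_not_done.values())

option_value = best_with_A - best_without_A   # real option value created by A
total_value_A = npv_1A + option_value

print(f"Option value created by A: {option_value}")
print(f"Total value of A: {total_value_A}")   # A can be worth doing even though npv_1A < 0
```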

In practice, the method can be made more sophisticated (and much more complex) by considering
continuous time instead of a series of discrete periods.

Applicability
The real options approach can be applied to improve the accuracy of economic evaluation and add a
measure of robustness within an optimality-seeking framework in which:
 Uncertainty is more “dynamic” than “deep”: Our knowledge improves over time, i.e. decision-
makers have confidence that some of the uncertainty will be resolved by the passage of time
(e.g., thanks to better scientific knowledge on climate change, or to the observation of socio-
economic trends).
 The project involves significant irreversible investments or creates/destroys significant
capabilities that matter for future decision-making.

Benefits
 Attractive analytically because it can be readily incorporated into a social cost-benefit
framework.
 Allows for explicit valuation of created and destroyed capabilities (expressed as options) in
general investments, often not accounted for in standard CBA.

Constraints
 Benefits of increased information and a higher ENPV after waiting assume that some uncertainty
will be resolved with time.
 Complexity is much larger, because multiple sets of decisions need to be included in the
analysis, sometimes leading to problems that are difficult or impossible to solve.

3.3 Methodology: Climate informed decision analysis

Climate Informed Decision Analysis (CIDA; also known as “decision scaling”) is a method of
incorporating climate change information into a decision-making process, by first identifying which
sets of climate changes would affect the project and then determining the likelihood of those sets. As
such, it connects “bottom-up” vulnerability analysis with “top-down” climate model information, and
retains the strengths of both approaches. As a process that accepts deep
uncertainties, CIDA does not attempt to reduce uncertainties or make predictions, but rather
determines which decision options are robust to a variety of plausible futures.

Method Details
Climate Informed Decision Analysis (CIDA) has three major phases:
1. Determination of stakeholders' concerns, mapping of these concerns to observable indicators, and
assignment of tolerances to groups of indicators.

2. Determine the relationship of climate changes to the indicators, i.e. the quantified climate sensitivity of
each plan. Climate sensitivity is determined by assessing each plan against a wide range of possible
climate changes. The climate conditions that are problematic for each plan are identified, as
well as opportunities associated with future climates. A decision map (contingency matrix) is
produced, identifying each decision's performance under different climate possibilities, as well
as the best decision for a given future climate (similarly to our Table 4). A map of which
decision options are optimal under which groups of climate conditions can be constructed.

3. Using GCMs (and possibly downscaling), stochastic modeling, or expert judgment, determine the
plausibility (subjectively derived probability) of the relevant groups of climate conditions identified
in (2). The plausibilities are seen as the best possible use of the uncertain climate projections.
The decision is then based on applying the decision-to-climate performance mapping to the relative
climate plausibilities.
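A minimal sketch of phases 2 and 3 might look like the following. The plans, the performance function, the climate grid, and the plausibility weights are all hypothetical; in a real application the performance function would come from a project model and the plausibilities from GCM ensembles, stochastic modeling, or expert judgment.

```python
# Sketch of a CIDA-style decision map: evaluate each plan over a grid of climate
# changes, then weight performance by scenario plausibilities. Plans, performance
# function, and plausibilities are hypothetical.

import itertools

temperature_changes = [0.0, 1.0, 2.0, 3.0]       # degrees C
precipitation_changes = [-0.2, -0.1, 0.0, 0.1]   # fractional change

plans = ["baseline", "upgraded"]

def performance(plan, d_temp, d_precip):
    """Hypothetical performance metric of a plan under a given climate change."""
    sensitivity = {"baseline": 1.0, "upgraded": 0.4}[plan]
    return 100.0 - sensitivity * (20.0 * d_temp - 150.0 * d_precip)

# Phase 2: decision map (contingency matrix) - best plan for each climate condition.
decision_map = {}
for d_temp, d_precip in itertools.product(temperature_changes, precipitation_changes):
    decision_map[(d_temp, d_precip)] = max(
        plans, key=lambda p: performance(p, d_temp, d_precip))

# Phase 3: plausibility weights for each climate condition (uniform here for brevity;
# in practice derived from GCM ensembles, stochastic modeling, or expert judgment).
plausibility = {cell: 1.0 / len(decision_map) for cell in decision_map}

expected_score = {p: sum(plausibility[c] * performance(p, *c) for c in decision_map)
                  for p in plans}
print(expected_score)
```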

Applicability
The primary applicability is for decisions regarding long-term investments which may have climate
vulnerabilities. While standard decision-analysis requires well-characterized uncertainties, CIDA was
developed to handle poorly-characterized climate change uncertainties and to make the best use of
available climate information. It can be used as a framework for climate risk analysis of a planned
project, or to help decide among multiple project options.

Benefits
 Determines the utility of downscaling a GCM for the decision in question, potentially avoiding
an unnecessary investment, and ensures climate information produced is directly relevant to
the decision process.
 Identifies climate vulnerabilities of a given project without relying on uncertain GCM
projections.
 Distinct vulnerability analysis and climate projections allow for easily updated analysis when
new (and presumably better) GCM projections become available.
 Applies the GCM information late in the process, reducing the impacts of GCM uncertainties
on the decision as a whole.
 Allows alternative visions of the future (e.g., probability based on GCM projections vs
continued historical climate vs uniform probability to all possible futures) to be incorporated
into the decision.
 Outputs a clear mapping of decision options to climate futures, facilitating an adaptive
management process to react to observed changes in climate indicators.
 Explicitly addresses the limits of our ability to anticipate the future for any project.

Constraints
 Our ignorance regarding the plausibility of different climate scenarios forces CIDA to rely on
subjective judgment to determine which scenarios to take seriously.
 Quality of the initial stakeholder process determines the relevance and efficacy of the entire
decision process.
 Requires quantitative modeling of the project of interest and its response to climate change.

Resources and contact


Analysis can take a couple of months if all modeling tools are available (at a cost below $100k), and up
to 1-2 years if everything has to be created (at a cost of up to $200k). Contact: Casey Brown, University
of Massachusetts, Amherst, cbrown@ecs.umass.edu.

Example: Climate Risk Assessment of Niger Basin Investment Program, 2010, AFTWR
(http://www.worldresourcesreport.org/files/wrr/papers/wrr_brown_uncertainty.pdf)

Phase 1
The decision-scaling process for the Niger Basin investment program began with the elicitation of the
priority concerns and key decision thresholds of the stakeholder countries through a workshop
conducted with the Niger Basin Authority in Ouagadougou, Burkina Faso in May 2010. Small group
discussions were convened to attempt to define thresholds of acceptable versus unacceptable
performance in terms of the identified project objectives. It was ultimately agreed by the workshop
participants that decreases in average performance of less than 20% from the baseline conditions were
considered an acceptable level of risk, while decreases of 20% or greater were defined as
unacceptable.

Phase 2
The next step was to model the response of the basin investment plan performance to changes in
climate conditions, to find which conditions led to unacceptable performance metrics. The existing
water resources systems model of the Niger Basin Authority revealed a nearly linear relationship
between the values of the performance metrics and changes in streamflow, in addition to
demonstrating that basin-wide averages of precipitation and temperature provided very good
approximations of annual streamflow throughout the basin. Climate sensitivity was quantified,
relating the performance metrics to changes in basin-wide temperature and precipitation.

Phase 3
In the final step of the analysis, a multi-model, multi-run ensemble of climate projections with 38
members was used to assess the expectation of future climate conditions. Because the climate
response function was defined in terms of the basin-wide precipitation and temperature changes, and
the Niger Basin is even larger than the units of the GCMs, the GCM projections were actually scaled up
through spatial averaging to produce estimates of changes over the entire basin. This improves the
credibility of the projections and allows significantly easier processing of the projections. The values of
precipitation and temperature from each GCM were input to each performance metric’s climate
response function, and the results organized into categories according to risk.
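A minimal sketch of this last step is given below, assuming a linear climate response function of the kind estimated in Phase 2: the coefficients and the small stand-in ensemble are hypothetical, and only the 20% acceptability threshold echoes Phase 1.

```python
# Sketch of the final step: pass each GCM's basin-averaged changes through a
# (nearly linear) climate response function and sort the results by risk.
# Coefficients and ensemble values are hypothetical; the 20% threshold echoes Phase 1.

def performance_change(d_precip_pct, d_temp_c):
    """Hypothetical linear climate response function (percent change from baseline)."""
    return 1.5 * d_precip_pct - 4.0 * d_temp_c

# Basin-averaged (precipitation change in %, temperature change in C) per GCM;
# a small stand-in for the 38-member ensemble used in the study.
gcm_projections = [(-10, 2.5), (-5, 2.0), (0, 1.5), (5, 2.0), (10, 1.0)]

categories = {"acceptable": [], "unacceptable": []}
for d_p, d_t in gcm_projections:
    change = performance_change(d_p, d_t)
    key = "acceptable" if change > -20.0 else "unacceptable"
    categories[key].append((d_p, d_t, round(change, 1)))

share_at_risk = len(categories["unacceptable"]) / len(gcm_projections)
print(categories)
print(f"Share of projections implying unacceptable performance: {share_at_risk:.0%}")
```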

Findings
The results of this process made clear that, based on GCM projections for West Africa, the planned
investments faced relatively small risk from climate change.

3.4 Methodology: Robust decision making

Robust Decision Making (RDM) provides a decision framework developed specifically for decisions
with long-term consequences and deep uncertainty (Schwartz 1996, Lempert and Schlesinger 2000,
Lempert et al. 2006, Groves and Lempert 2007, Lempert and Collins 2007, Hallegatte 2009). An RDM
analysis begins with an existing or proposed project plan and exhaustively explores its vulnerabilities
and sensitivities, through stakeholder involvement and analytical methods. It then uses this
information to identify potential vulnerability-reducing modifications to the plan. The proposed
modifications are then presented to decision-makers for evaluation and possible adoption.

RDM is designed to support stakeholder dialogues that help define project objectives, including the
range of uncertainties considered and the scenarios that best describe any vulnerabilities. As a
“context-first” process that inverts the traditional ordering of an analytic decision-making process,
RDM only considers probability distributions in the final steps of the analysis, thus facilitating the use
of imprecise or missing probabilistic information, as well as facilitating engagement among
stakeholders who may hold differing expectations about the future.

Method Details
Robust Decision Making (RDM) has four major phases:
1. Stakeholder process to determine a) Candidate strategy, or different “policy levers” available,
b) Performance metrics used for evaluation, and c) Range of uncertainties to consider.
2. Configure models and create large database of simulation runs, examining candidate strategy
or set of policy levers over wide range of alternative future states.
3. Algorithmically determine clusters of scenarios where candidate strategy demonstrated
vulnerability.
4. Identify options for reducing vulnerabilities, and associated tradeoffs.

Key to the RDM analysis is its iterative nature: lessons which emerge from step 4 are sent back to
step 1 for another round of analysis, until vulnerabilities are below acceptable levels.
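A stripped-down sketch of steps 2 and 3 is given below. The simulation model, the sampling ranges, and the failure threshold are hypothetical, and the "scenario discovery" step is reduced to a simple threshold-and-describe rule rather than the cluster-finding algorithms used in full RDM analyses.

```python
# Stripped-down sketch of RDM steps 2-3: run a candidate strategy over many sampled
# futures, then describe the cluster of futures in which it fails. The model,
# sampling ranges, and failure threshold are hypothetical.

import random

random.seed(0)

def shortage_cost(strategy_capacity, demand_growth, runoff_change):
    """Hypothetical simulation: costs rise when demand outpaces supply."""
    supply = strategy_capacity * (1.0 + runoff_change)
    demand = 100.0 * (1.0 + demand_growth) ** 25
    return max(0.0, demand - supply) * 2.0

# Step 2: large database of runs over a wide range of alternative futures.
futures = [{"demand_growth": random.uniform(0.0, 0.03),
            "runoff_change": random.uniform(-0.3, 0.1)}
           for _ in range(1000)]
runs = [(f, shortage_cost(140.0, **f)) for f in futures]

# Step 3: "scenario discovery" in its simplest form -- describe the region of
# futures where the candidate strategy performs unacceptably.
failures = [f for f, cost in runs if cost > 50.0]
if failures:
    print(f"{len(failures)} of {len(runs)} futures are vulnerable")
    print("demand growth at least", round(min(f["demand_growth"] for f in failures), 3))
    print("runoff change at most", round(max(f["runoff_change"] for f in failures), 3))
```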

Applicability
RDM often proves most valuable in situations with:
1. Multiple, deep uncertainties (e.g. climate, economic, and technological) that may make it
difficult to apply traditional probabilistic analyses
2. Stakeholders who hold a variety of world-views, priorities, and definitions of success
3. A rich portfolio of decision options that make it possible to identify project plans robust over
many different futures
4. Long-term commitments that make it difficult to reverse near-term choices.

RDM normally involves a detailed quantitative analysis, but the underlying framework can inform
more “heuristic” – less resource-intensive – evaluations. RDM is especially useful when project plans
can be framed as a series of decisions over time, to take advantage of learning explicitly and adjust to
new information as it comes along.

Benefits
 Full vulnerability analysis of proposed projects
 Transparent, reproducible, and exhaustive scenario discovery reduces over-confidence bias
 Stakeholder process to define measures of success and potential futures builds consensus on
project action even under diverse assumptions and priorities
 Adaptive decision process explicitly addresses the limits of our ability to anticipate the future
for any project
 Project alternatives and plans evolve from existing project options.

Constraints
 Time and cost intensive
 Quality of the stakeholder process influences the relevance and efficacy of analysis, especially
regarding the range of policies available, uncertainties considered, and choice of worst-case
scenario
 Requires extensive quantitative modeling of project area.

Resources and contact


The cost of a full RDM analysis largely depends on whether or not a suitable simulation model is
available at the start of a project. The time and cost of RDM analyses have ranged from a few months to
a year and from on the order of $100k (where a simulation model already exists) to $500k (including
the development of the model). Contact: Robert Lempert, RAND, lempert@rand.com.

4. CASE STUDIES
4.1 Ho Chi Minh City and flood risks

Recognition of Risk
Ho Chi Minh City (HCMC) ranks fourth globally among coastal cities most vulnerable to climate
change. HCMC already experiences extensive routine flooding; in the coming decades, increased
precipitation and rising sea levels could permanently inundate a large portion of the city, place the
poor at particular risk, and threaten new economic development in low-lying areas.

In response to these challenges, HCMC has over the last fifteen years developed plans for, and started
implementing, numerous infrastructure projects to mitigate the flood risk. The multi-billion dollar
investment plans in sewage and drainage infrastructure included, over the years:

 6000 km of canals and pipes covering 650 km2 in the city, to upgrade discharge capacity of the
storm sewer system and to address land up-filling.

 Roughly 172 km of dikes and river barriers, mainly for tidal control.

 A tide control plan that uses at least 12 gates and 170 km of dikes to create a polder system.

These plans were based on the best predictions of future climate and development available to
planners at the time.

Challenges of Prediction
Recent analysis suggests, however, that climate change and urbanization will be larger than anticipated,
and that some variables are already beyond the maximum values considered in the design phase. These
surprises require significant revisions to the plans. At-risk infrastructure includes:

 The canals and pipes built principally to upgrade discharge capacity of the storm sewer system
may not be able to handle increased flows.

 Increases in precipitation and tide levels observed over the last decade already exceed those
projected and may over-top dikes and barriers.

 Future saline intrusion and rainfall intensity may be more severe than anticipated, potentially
rendering the poldering plans obsolete even before they have been approved.

Since the plan was created, the city has also experienced unprojected urbanization in low-density
areas, perhaps due to the illusion of safety associated with the presence of flood prevention
infrastructure. The HCMC Steering Committee for Flood Control (SCFC) is concerned that the insufficiency
of the planned infrastructure may worsen flooding in some areas of HCMC. In this case, the
intervention's legacy will have been an increase in vulnerability.

Robustness
Today, the SCFC is preparing an Integrated Flood Management Strategy to synchronize the existing
master plans for storm sewer system, flood control system, and urban development. Aware of the
consequences of underestimating uncertainty, they have chosen a robust approach to address the
following:
 Their prior approaches to planning consistently under- or mis-estimated uncertainties
 Suggested plans proved brittle to broken assumptions, leading to costly realignment
 Difficulties in reaching consensus among diverse actors and agendas

Through an integrated, robust approach, the SCFC is accepting the role of persistent, deep
uncertainties as a new component in its planning process.
4.2. From CBA to regret minimization: The case of dam dimensioning
Nassopoulos et al. (2012) apply the idea of robustness to dam dimensioning in the water
management sector, a sector that is particularly sensitive to climate conditions. Since investments like
dams are made for very long periods, they require future changes to be taken into account. With
climate change, hydro-climatic parameters will be modified, affecting runoff, soil moisture, and
groundwater levels. Because water resources will be altered quantitatively and qualitatively, and
water consumption affected, the design of hydraulic infrastructure will have to be revised.

They use a cost-benefit analysis to determine the optimal size of a water reservoir in the Pyli basin, a
Mediterranean mountainous catchment in northern Greece, which is part of an important water
development project for the Acheloos River. The optimal size depends on construction costs, on the
economic value of water, on the discount rate, and on climate conditions. In this region, climate
change will affect runoffs, but this effect is uncertain: some models project a decrease in runoff by up
to 21%, while others project stable runoffs. This uncertainty complicates the design of a water
reservoir.

The water demand satisfied by the reservoir yields an economic benefit, set equal to the discounted
value of the water supplied:

B(K) = \sum_{y} \frac{D(K,y)\, p_w(y)}{(1 + r + n g)^{y}}

where r is the pure time preference, g is the growth rate of the economy, n is the income elasticity,
D(K,y) is the water demand that can be satisfied each year y by a reservoir of size K, and p_w(y) is the
unit water price. The unit water value is assumed to be independent of the demand level and to
grow at the same rate as the economy.

The optimal dam dimension is determined by maximizing the net present value of the water
system: NPV = max_K [B(K) − C(K)], where C(K) is the cost of the dam as a function of its size. Figure 4
shows how the NPV varies with the dam volume, for different values of the pure time preference and
for different climate models.

Figure 4: Net present value as a function of reservoir volume, for three climate models and the no-climate-change
case (NOCC), and for three rates of pure time preference. The models (CNRMCM3, NCARPCM1
and CSIROMK35) exhibit different changes in variability and mean. The purpose is not to show the
precise NPV of each model but to illustrate, besides the usual pure time preference effect (lower NPV
and optimal volume for higher pure time preference), the reduction of the NPV difference between models
under climate change. Indeed, the NPV range across models is much larger for a 0% pure time preference.

When there is no dam, the net present value is B(0). The dam is worth building if the net present
value obtained with the dam is higher than the net present value without the dam, i.e. if:

max_K [B(K) − C(K)] − B(0) > 0.

The net present value NPV is the value of the full water system, including the value of water, and not
only the value of the man-made reservoir.
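The optimization just described can be sketched as follows. The demand, cost, and price functions and all parameter values are hypothetical placeholders, not those of Nassopoulos et al.; the discount rate combines the pure time preference and growth terms defined above (r + ng).

```python
# Sketch of the dam-sizing optimization: discount each year's water value at the
# Ramsey rate r + n*g and pick the volume K that maximizes B(K) - C(K).
# The demand, cost, and price functions and all parameters are hypothetical.

r, n_elasticity, g = 0.01, 1.0, 0.02   # pure time preference, income elasticity, growth
rho = r + n_elasticity * g             # discount rate
horizon = 80                           # years
p_w0 = 0.5                             # unit water value today, growing at rate g

def demand_satisfied(K, year):
    """Hypothetical water demand met each year by a reservoir of size K."""
    return min(K, 50.0 + 0.5 * year)

def benefit(K):
    return sum(demand_satisfied(K, y) * p_w0 * (1 + g) ** y / (1 + rho) ** y
               for y in range(horizon))

def cost(K):
    """Hypothetical construction cost as a function of reservoir size."""
    return 8.0 * K ** 0.9

candidate_volumes = range(10, 121, 5)
optimal_K = max(candidate_volumes, key=lambda K: benefit(K) - cost(K))
print(optimal_K, round(benefit(optimal_K) - cost(optimal_K), 1))
```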

Applied to 19 climate models from the IPCC (2007), Table 5 shows how using different climate models
in such a cost-benefit analysis leads to very different choices in terms of optimal dimensioning,
highlighting the need to use multiple models to avoid potential maladaptation.

Change in optimal reservoir volume (%)

Climate model        0% pure time preference    6% pure time preference
BCCRBCM20                   -12                        -7
CCCMACGCM31                 -15                        -6
CNRMCM3                     -23                        -10
CSIROMK30                   -16                        -8
CSIROMK35                   -12                        -7
GFDLCM20                    -14                        -4
GFDLCM21                    -21                        -8
GISSMODELER                 -21                        -8
INGVECHAM4                  -22                        -9
INMCM30                      -4                        -3
IPSLCM4                     -20                        -8
MIROC32MEDRES                -6                        -3
MIUBECHOG                   -18                        -9
MPIECHAM5                   -22                        -8
MRICGCM232A                  -7                        -4
NCARCCSM30                  -12                        -5
NCARPCM1                      1                         1
UKMOHADCM3                  -10                        -4
UKMOHADGEM1                   0                         0

Table 5: Percent change in optimal volume storage relative to a case with no climate change (historic baseline),
for a 10 km valley length, and two rates of pure time preference.

For some models, the change in optimal volume reaches 23% with a 0% time preference and 10%
with a 6% time preference, showing that climate change will strongly influence the optimal design of
water infrastructure. There is therefore a potential for sunk costs, in case a large reservoir is
constructed while actual climate change finally calls for a smaller reservoir. Correspondingly, there is a
potential for regret if a small reservoir is constructed in spite of a potential for satisfying a larger water
demand.

The paper then calculates the “regret” (or error cost) if the reservoir is designed using one of the
models while another one turns out to be correct (assuming that one model is correct, which need not
be true). The paper proposes to look for robustness by designing the dam with the volume
for which the potential for regret is the smallest (a minimax approach). Two models share the smallest
maximum error cost, namely GFDLCM20 and CSIROMK35 (with a maximal error cost of 0.4%).
With this volume, and assuming that the 19 models cover the full
uncertainty, the potential for regret is thus extremely small and the design is robust.
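The minimax-regret step can be sketched as follows, with a hypothetical per-model NPV function and a three-model ensemble standing in for the 19 GCMs.

```python
# Sketch of the regret-minimization step: for each candidate volume, compute the
# worst-case regret across climate models and keep the volume with the smallest
# maximum regret. The per-model NPV function and the model set are hypothetical.

candidate_volumes = [60, 70, 80, 90, 100]
model_optima = {"wet": 95, "median": 80, "dry": 65}   # hypothetical optimal volume per model

def npv(volume, optimal_volume):
    """Hypothetical NPV: highest when the built volume matches the model's optimum."""
    return 1000.0 - 2.0 * (volume - optimal_volume) ** 2

def max_regret(volume):
    regrets = []
    for model, optimum in model_optima.items():
        best_possible = npv(optimum, optimum)   # NPV had we designed for this model
        regrets.append(best_possible - npv(volume, optimum))
    return max(regrets)

robust_volume = min(candidate_volumes, key=max_regret)
print(robust_volume, round(max_regret(robust_volume), 1))
```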

Looking at the NPV of the resulting water supply, the analysis shows that climate change can lead to a
reduction of up to 30%, even assuming optimal adaptation. Such a decrease would have large
consequences for downstream sectors (such as agriculture, energy production, industry, etc.). This
large possible reduction shows that even when a robust decision is made – i.e., a decision that has no
large potential for regret – the considered system (here the local economy that depends on water
supply) may not be robust, i.e. the system remains vulnerable to some of the scenarios.

Applied to this specific case, this finding shows that the analysis should not be restricted to a small system (the
dam), but should be extended to a broader system (the dam and the water users), to maximize the
robustness of the economy and the impact on welfare.

4.3. Climate Informed Decision Analysis applied to the management of the Great Lakes Basin

One example of successful incorporation of deep uncertainty into a decision-making process is the
application of CIDA to management of the Great Lakes Basin in the United States. In 2007, the
International Joint Commission (IJC) established an independent study board composed of United
States (U.S.) and Canadian members to review the operation of structures controlling Lake Superior
outflows and to evaluate improvements to the operating rules and criteria governing the system. The
study is known as the International Upper Great Lakes Study (IUGLS). As a result of the considerable
uncertainty associated with future climate and lake levels, as well as other sources of uncertainty such
as ecosystem responses and the state of the navigation industry, a process of selecting the optimal
plan based on a most probable future scenario was rejected in favor of a robust decision making
process.

Underlying the process is the premise that we are limited in our ability to anticipate the future and
therefore any recommended plan must perform well over a very broad range of possible futures.

The analysis of the Great Lakes Basin plan comprised three phases:
1. Identification of vulnerabilities by stakeholders and definition of acceptable and unacceptable
lake levels for each impact area
2. Quantifying climate sensitivity for each proposed plan
3. Estimations of different climate plausibilities by a group of experts based on a variety of
climate modeling approaches, including GCM studies, RCM studies and paleoclimate studies.

Identification of Vulnerabilities
In order to prioritize concerns for the regulation of Lake Superior, stakeholder experts were convened
from state, provincial, federal, and local government agencies, as well as special interest groups
(boating, hydroelectricity, navigation), and environmental groups (e.g. The Nature Conservancy).
Termed technical working groups, the experts were divided among the following impact areas:
ecosystems, hydro-power, commercial shipping, municipal and industrial water and wastewater
systems, coastal systems, and recreational boating and tourism.

The experts were tasked with identifying the vulnerabilities of the system to climate changes and
other changing conditions. A primary challenge was the quantification of vulnerabilities in
commensurate units. To address this issue, the stakeholder groups were asked to define vulnerabilities
in terms of lake levels, including the duration of the event. They first defined three levels of
vulnerability, or “coping zones”:
A: (acceptable)
B: (significant negative impacts, but survivable)
C: (intolerable without policy changes).

The stakeholder groups then defined what combination of lake level and duration led to the kind of
impacts consistent with the coping zone descriptions. The definition of coping zones allowed the
evaluation of regulation plan performance to be conducted in terms that are comparable across
impact sector and defined by the stakeholders. It is a product of the shared vision planning process.

It was also agreed that the cost of the lake level moving outside the bounds anticipated by a management plan was
potentially much higher than the cost of the lake level staying within those bounds.

Quantification of Climate Sensitivity


For the Great Lakes application, the quantification of climate sensitivity was performed to identify
climate conditions that cause failures in plan performance. The assessment was based on the creation
of a climate response function which estimated the consequences (lake levels and associated
performance metrics) of a given decision (regulation plan) for a given set of mean climate conditions
(denominated in Net Basin Supply (NBS) – the sum of inflow and outflow of the lakes). The function
thus related climate effects to the performance metrics influencing the decision.

Using a stochastic time series of NBS to represent changes in mean climate conditions, those climate
conditions that presented risks to each regulation plan were identified. Note that although climate
model projections have not been used in the analysis to this point, considerable information regarding
climate impacts may be revealed by the vulnerability of proposed plans to potential climate futures.
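The following sketch illustrates this logic: a hypothetical response function maps stochastic Net Basin Supply series to lake levels, which are then checked against coping-zone thresholds. The response function, NBS statistics, and zone definitions are illustrative, not those of the IUGLS study.

```python
# Illustrative sketch: map stochastic Net Basin Supply (NBS) series to lake levels
# through a hypothetical response function for a regulation plan, then check the
# levels against coping-zone thresholds. All numbers are illustrative.

import random

random.seed(1)

def lake_level(plan_release, nbs):
    """Hypothetical climate response function: level rises with supply."""
    return 183.0 + 0.002 * (nbs - plan_release)

def coping_zone(levels, low=182.9, high=183.9, duration=3):
    """Zone C if levels stay outside [low, high] for `duration` straight years,
    zone B if they leave the band at all, zone A otherwise."""
    out_of_band = [lvl < low or lvl > high for lvl in levels]
    run = best_run = 0
    for flag in out_of_band:
        run = run + 1 if flag else 0
        best_run = max(best_run, run)
    if best_run >= duration:
        return "C"
    return "B" if any(out_of_band) else "A"

# Stochastic NBS series for progressively drier mean climate conditions.
for mean_shift in (0, -200, -400):
    nbs_series = [random.gauss(1500 + mean_shift, 150) for _ in range(30)]
    levels = [lake_level(plan_release=1400, nbs=n) for n in nbs_series]
    print(mean_shift, coping_zone(levels))
```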

Climate Plausibilities
Once the climate states that cause risks for a regulation plan are identified, the plausibility (relative
probability) of those conditions is estimated through tailored climate information. Given the
uncertainty associated with the probability estimates even after maximizing credibility, the term
‘‘plausibility’’ is used in place of probability.

The decision maker, in this case the Study Board, was presented with plausibility estimates of climate
states associated with each regulation plan, and the sources of climate information (e.g., climate
projections, stochastic analysis, paleo-climate data, expert opinion) that assigned probability to that
state. The plausibility estimates may be adjusted based on different comfort levels of the Board
members – because of the lack of confidence in available information, and the sensitivity to the worst-
case scenario, the choice of plausible scenarios must be a participatory, negotiated process.

Dynamics and Adaptation


Two other features of the analysis are particularly relevant in this context: dynamics and adaptation.

The concept of a dynamic regulation plan involves the selection of a portfolio of plans, each optimized
for a certain range of conditions. The appropriate plan will be implemented based on the prevailing
climate conditions, according to a set of identified triggers. The ability to identify these trigger
points, and base decisions upon them, is a key feature of robust analysis.

Though dynamic regulation provides robustness to a broad range of anticipated climates, it is well
known that there are other uncertainties, including faulty assumptions and unforeseen surprises,
which threaten the success of the regulation plan. For this reason, the process recommends
incorporation of an adaptive management process into the regulation plan. The process consists of
long-term monitoring of regulation plan performance and mechanisms for implementing changes
when needed.

Conclusions
The climate risk of each regulation plan, according to a variety of sources of climate information, was
presented to the Study Board. The Study Board prioritized the plausible risks estimated using the
stochastic analysis over other climate information, such as climate projections based on General
Circulation Models, due to the uncertainty associated with the GCMs. The recommendation for a final
regulation plan focused on two plans that performed best over a very wide range of future climates.
The Study Board also recommended that the regulation plan be coupled with an adaptive
management program to address future uncertainty. The Board expressed satisfaction in the use of
climate information to make this decision.

For more information, see Brown et al. (2011).

4.4 RDM for Water Planners in Southern California (for more information, see Feifel, 2010)
Traditionally, water planners have used historical stream-flow data and weather patterns to infer
seasonal water forecasts. However, as climate change is expected to change weather patterns, air
temperature, and precipitation patterns in an as yet relatively unpredictable fashion at the local scale,
water managers are seeking methods to incorporate the impacts of climate change into their current
planning processes.
In 2006, the RAND Corporation worked with the Inland Empire Utility Agency (IEUA) to test its Robust
Decision Making framework. In 2005, IEUA released its Regional Urban Water Management Plan
(UWMP), in response to an anticipated population increase of 800,000 to 1.2 million people by 2030.
The document outlines a plan to meet future water demands by improving water-use efficiency and
developing local resources.
The RDM analysis took the UWMP as its initial strategy, used climate information from the National
Center for Atmospheric Research, and employed a planning system from the Stockholm Environment
Institute to assess how different policy levers would perform under a variety of possible futures.
The first run of RDM evaluated the proposed management plan under four climate scenarios. Findings
generally indicated that if the impacts of climate change are minimal, the UWMP will meet its supply
goals for 2030. However, if climate change causes significant warming and drying trends, the UWMP
could perform poorly and miss many of its goals, causing economic losses.
Further runs of the model, using over 200 scenarios and eight additional management strategies,
were then performed. Under this analysis, cost was 20% greater than expected in 120 of the 200
scenarios. They found that the UWMP was particularly vulnerable to future conditions that were drier
with reduced access to imported water and when natural percolation of the ground water basin
decreased. Novel strategies included increasing water-use efficiency, recycling storm-water for
groundwater replenishment, and developing the region's water recycling program. In all cases,
augmenting the UWMP with additional management strategies led to lower costs and reduced
vulnerability.
It was concluded that local solutions should not be overlooked when developing solutions to the
impacts of climate change. Local policies and management opportunities may be more cost effective,
reliable, and feasible when compared to other options.
Under the RDM analysis, the best management plan was adaptive and included near-term
implementation of more water use efficiency techniques. When water managers were presented with
these results, surveys indicated an increase in their confidence that they could adequately plan for the
effects of climate change despite the uncertainty in forecasts.

5. CONCLUDING REMARKS
An investment decision is already difficult for any diverse group of actors, with different priorities and
world views. But the presence of deep uncertainties linked to climate change further challenges the
decision-making framework by questioning the robustness of all purportedly optimal solutions.

Our traditional decision making strategies and criteria were developed to address shallow
uncertainties, in which we could trust a person or team to determine a reliable description of the
future, come up with relative probabilities for various states (i.e., the calculation of risk), and, generally,
look at each investment decision independently. In situations characterized by shallow uncertainties,
it is indeed convenient to predict a future situation to use as a basis for designing an investment
project, optimizing the project for that situation.

Under situations characterized by deep uncertainty, however, we no longer have the luxury of
imagining a world to build an optimal project around. Instead, decision makers must engage in a
process to determine the worlds they wish to imagine, the investment options available, and the fitness
criteria to be used. Only then may they measure the robustness of the proposed options across
futures and world-views.

In this paper, we attempt to build on contemporary research in the field, to frame a discussion about
robustness. The paper reaches several conclusions.

First, climate change introduces deep uncertainty in investment decisions, through several channels
including (i) the future emissions of GHG; (ii) the response of the climate system to these emissions;
(iii) the local changes due to global climate change; (iv) other systems’ response to climate change
(e.g., ecosystems or coastlines).

Second, the resolution of climate models is not appropriate for most weather-sensitive investment
decisions, and “downscaling” is often carried out, to project climate variables at appropriate spatial
and temporal scales. But when downscaling is considered useful, it is preferable to rely on existing
data and results, instead of running project-specific downscaling exercises. Relying on climatologist
expert knowledge is often cheaper, easier, quicker, and as useful as sophisticated downscaling
techniques. Also, the comparison of two scenarios – run with different emissions or different models – needs
to account for the role of natural variability: only the comparison of sets of scenarios can establish that
two models or two emission scenarios lead to outcomes that are significantly different. Finally,
downscaling does not reduce the uncertainty on future climate change at local scale. Where global
climate models disagree – e.g., on the sign of rainfall changes in West Africa – downscaled projections
will disagree. The deep uncertainty caused by climate change is thus unavoidable.

Third, expert-based analytic methods such as cost-benefit analysis are necessary, but they cannot
replace discussions among stakeholders. They should therefore be understood as a complement and a
tool to open consultations and discussions, not as a replacement for them. In many instances, analytic
methods can be used as a tool to organize and drive consultations and debates, more than as a tool to
make the decision in an objective manner.

Fourth, many methodologies to make decisions under deep uncertainty have been proposed, from
cost-benefit analysis under uncertainty to real options and robust decision-making. It seems impossible
to define the “best” solution or to prescribe any particular methodology in general. In many real
applications, a mix of these methodologies has been employed. Indeed, part of our vulnerability to
deep uncertainties stems from our reliance on, and belief in, the validity of widely applicable
prescriptive solutions. To the contrary, we find it extremely unlikely that any single methodology
would be appropriate across the board. Instead, a menu of methodologies is required, together with
some indications on which strategies are most appropriate in which contexts, in line with Ranger et
al. (2010). Providing such a menu is a long-term objective of our work.
In such a context, we see both robust and optimal techniques as necessary elements in decision-
making. While analyses focused on optimality are vulnerable to overconfidence bias (especially
overconfidence in the probability of extreme events), robust approaches ask participants in a decision
to consider a wide range of consequences and make choices about how robust they wish to be against
them (Hallegatte 2011).

Using the methodologies discussed in this paper will require a better understanding of decision-makers
themselves. In-depth interviews with World Bank team leaders on the demand for tools to address
deep uncertainty led to the following key findings:

 By the economic analysis stage, project traction has already been established, and major
(potentially robustness-enhancing) options have already been eliminated. It is thus necessary
to start thinking in terms of resilience and robustness from the initial stages of project design.

 Though there is currently low demand for deep-uncertainty analysis, it will increase as climate-
related projects face increasing scrutiny and competition for funding in coming years; attention
to climate risks and the uncertainty linked to them will thus grow over
time.

 As stated in the introduction, decision-makers are used to managing uncertainties, and they
often handle them through intuitive heuristics developed from extensive experience. For
instance, uncertainties in extreme flood levels are usually managed through the introduction
of safety margins in infrastructure design. But climate change brings new uncertainties that are still
unfamiliar to decision-makers. Until they gain such experience with deep climate
uncertainties, real or simulated, their decision-making will suffer. We hope this paper may
serve to expose them to the nuances of deep uncertainties and their management, and help
them make better decisions.

Future research will be focused on developing heuristics to guide decision-makers on which
methodology is most appropriate for a given issue, and on the “seamless” integration of these
methodologies into the decision-making process. Indeed, the decision-making methodology is supposed
to support – not to replace – the decision-making process, and changes in methodologies may have to
interact with changes in process.

REFERENCES
Arrow, K., and Fisher, A., 1974. Environmental preservation, uncertainty, and irreversibility. The Quarterly
Journal of Economics 88, 312-319
Arrow, K.J., M.L. Cropper, G.C. Eads, R.W. Hahn, L.B. Lave, R.G. Noll, P.R. Portney, M. Russel, R.L. Schmalensee,
V.K. Smith, R.N. Stavins, 1996. Benefit-Cost Analysis in Environmental, Health, and Safety Regulation,
American Enterprise Institute Books and Monographs.
Bleakley, H., Lin, J., 2010. Portage: path dependence and increasing returns in US history. NBER Working Paper
No. 16314.
Brown, C., 2010. Decision-scaling for Robust Planning and Policy under Climate Uncertainty. World Resources
Report, Washington DC. Available online at http://www.worldresourcesreport.org.
Brown, Casey, William Werick, Wendy Leger, and David Fay, 2011. A Decision-Analytic Approach to Managing
Climate Risks: Application to the Upper Great Lakes. Journal of the American Water Resources
Association 47(3):524-534.
Craig, P.P., Gadgil, A., Koomey, J.G., 2002. What Can History Teach Us? A Retrospective Examination of Long-
Term Energy Forecasts for the United States. Annual Review of Energy and the Environment 27, 83–118.
Dotsis, G., V. Makropoulou, et al., 2005. The Timing of Environmental Policies in the Presence of Extreme
Events. Real Options: Theory Meets Practice; 9th Annual International Conference, Paris, France
Elsner, J.B. and T.H. Jagger, 2006: Prediction models for annual U.S. hurricane counts. J. Climate, 19, 2935–2952.
Feifel, K., 2010. Using Robust Decisionmaking as a Tool for Water Resources Planning in Southern California
[Case study on a project of the RAND Corporation]. Product of EcoAdapt's State of Adaptation Program.
Retrieved from CAKE: http://www.cakex.org/case-studies/1029 (Last updated April 2010).
Gollier, C. & Treich, N., 2003. Decision-making under scientific uncertainty: the economics of the precautionary
principle. Journal of Risk and Uncertainty 27, 77-103
Gollier, C., 2002. Time Horizon and the Discount Rate. Journal of Public Economics 85, 463–473.
Groves, D.G., Lempert, R.J., 2007. A new analytic method for finding policy-relevant scenarios. Global
Environmental Change 17, 73–85.
Gusdorf, F., S. Hallegatte, A. Lahellec, 2008, Time and space matter: how urban transitions create inequality,
Global Environment Change 18(4), 708-719, doi:10.1016/j.gloenvcha.2008.06.005
Ha-Duong, M., 1998. Quasi-option value and climate policy choices. Energy Economics 20, 599–620.
Hallegatte, S. 2006, A Cost-Benefit Analysis of the New Orleans Flood Protection System. Regulatory Analysis
06-02. AEI-Brookings Joint Center, Mar 2006.
Hallegatte, S., 2009: Strategies to adapt to an uncertain climate change, Global Environmental Change 19, 240-
247
Hallegatte, S., 2011. How economic growth and rational decisions can make disaster losses grow faster than
wealth, Policy Research Working Paper 5617, The World Bank
Harberger, A.C., 1978: On the use of distributional weights in social cost–benefit analysis. Journal of Political
Economy, 86(2).
Harberger, A.C., 1984: Basic needs versus distributional weights in social cost-benefit analysis, Economic
Development and Cultural Change, 32(3):455-74.
Harvey, C., 1994. The reasonableness of non-constant discounting. Journal of Public Economics, 53, 31–51.
Heal, G., 2005. Intertemporal welfare economics and the environment, Handbook of Environmental
Economics, Vol. 3, K.G. Mäler and J.R. Vincent (Eds.), Elsevier B.V.
Henry, C., 1974. Investment decisions under uncertainty: the “Irreversibility Effect”. The American Economic
Review 64, 1006-1012.
IPCC, 2007. Climate Change: The Physical Science Basis. Contribution of Working Group I to the Fourth
Assessment Report of the Intergovernmental Panel on Climate Change. Solomon, S., Qin, D., Manning,
M., Chen, Z., Marquis, M., Averyt, K.B., Tignor, M., Miller, H.L. (Eds.), Cambridge University Press,
Cambridge, United Kingdom and New York, NY, USA, 996 pp.
IPCC, 2012: Meeting Report of the Intergovernmental Panel on Climate Change Expert Meeting on Economic
Analysis, Costing Methods, and Ethics [Field, C.B., V. Barros, O. Edenhofer, R. Pichs-Madruga, Y. Sokona,
M.D. Mastrandrea, K.J. Mach, and C. von Stechow (eds.)]. IPCC Working Group II Technical Support Unit,
Carnegie Institution, Stanford, California, United States of America.
Knight, F.H., 1921. Risk, Uncertainty, and Profit. Boston, MA: Hart, Schaffner & Marx; Houghton Mifflin
Company.
Knutson, T.R., McBride, J.L., Chan, J., Emanuel, K., Holland, G., Landsea, C., Held, I., Kossin, J.P., Srivastava, A.K.,
Sugi, M., 2010. Tropical cyclones and climate change. Nature Geoscience 3, 157–163
Lempert, R.J., Collins, M.T., 2007. Managing the risk of uncertain thresholds responses: comparison of robust,
optimum, and precautionary approaches. Risk Analysis 27, 1009–1026.
Lempert, R.J., Groves, D.G., Popper, S.W., Bankes, S.C., 2006. A general, analytic method for generating robust
strategies and narrative scenarios. Management Science 52 (4), 514–528.
Lempert, R.J., Schlesinger, M.E., 2000. Robust strategies for abating climate change. Climatic Change 45 (3/4),
387–401.
Mestre, O. and S. Hallegatte, 2008, Predictors of Tropical Cyclone Numbers and Extreme Hurricane Intensities
over the North Atlantic using Generalized Additive and Linear Models, Journal of Climate 22(3), pp. 633–
648
Nassopoulos, H., P. Dumas, S. Hallegatte, 2012. Adaptation to an uncertain climate change: cost benefit analysis
and robust decision making for dam dimensioning. Climatic Change, doi: 10.1007/s10584-012-0423-7.
Nuissier,O., B. Joly, and A. J. V. Ducrocq, 2011: A statistical downscaling to identify the large-scale circulation
patterns associated with heavy precipitation events over southern France. Quart.J. Roy. Meteor. Soc.,
137, 1812–1827.
O’Brien, K., M. Pelling, A. Patwardhan, S. Hallegatte, A. Maskrey, T. Oki, U. Oswald-Spring, T. Wilbanks, and P.Z.
Yanda, 2012: Toward a sustainable and resilient future. In: Managing the Risks of Extreme Events and
Disasters to Advance Climate Change Adaptation [Field, C.B., V. Barros, T.F. Stocker, D. Qin, D.J. Dokken,
K.L. Ebi, M.D. Mastrandrea, K.J. Mach, G.-K. Plattner, S.K. Allen, M. Tignor, and P.M. Midgley (eds.)]. A
Special Report of Working Groups I and II of the Intergovernmental Panel on Climate Change (IPCC).

Cambridge University Press, Cambridge, UK, and New York, NY, USA, pp. 437-486.
OXERA, 2002. A Social Time Preference Rate for Use in Long-Term Discounting, A report for ODPM, DFT and
DEFRA.
Pindyck, R. S., 2002. "Optimal Timing Problems in Environmental Economics." Journal of Economic Dynamics &
Control 26: 1667-1697.
Portney, P.R. and Weyant, J.P., 1999. Discounting and Intergenerational Equity, Resources for the Future, 202 pp.
Ramsey, F. P., 1928. A Mathematical Theory of Saving. Economic Journal, 38, 543–59.
Ranger N., A. Millner, S. Dietz, S. Fankhauser, A. Lopez and G. Ruta, 2010. Adaptation in the UK: a decision-
making process, Policy Brief, Grantham Research Institute on Climate Change and the Environment and
Center for Climate Change Economics and Policy.
Redelsperger, J.L., Thorncroft, C., Diedhiou, A., Lebel, T., Parker, D.J. and J. Polcher, 2006: African Monsoon
Multidisciplinary Analysis (AMMA): An International Research Project and Field Campaign. BAMS,
Volume 87, Issue 12 (December 2006), pp. 1739–1746.
Schwartz, P., 1996. The Art of the Long View. Double Day, New York.
Slovic, P., Fischhoff, B., Lichtenstein, S., Roe, F.J.C., 1981. Perceived risk: psychological factors and social
implications [and discussion]. Proceedings of the Royal Society of London. A. Mathematical and Physical
Sciences 376, 17–34
Stern, N., et al. 2006. The Stern Review on the Economics of Climate Change. http://www.hm-treasury.gov.uk
U.K. Treasury, 2003. Green Book, Appraisal and Evaluation in Central Government, available on
http://greenbook.treasury.gov.uk/
Weitzman, M., 2001. Gamma Discounting, American Economic Review 91(1).
Weitzman, M.L., 2009. On modeling and interpreting the economics of catastrophic climate change. The Review
of Economics and Statistics 91, 1–19.
