be desirable, but may not always add much value to greenfield studies where the basic subsurface uncertainties remain poorly constrained.

A recognized problem with polynomial proxies is that they tend to yield normal distributions because the terms are added (a consequence of the central limit theorem). For many types of subsurface forecasts, the prevalence of actual skewed distributions, such as log-normal, has been documented widely. Therefore, physical proxies, especially in simple cases such as the original-oil-in-place (OOIP) equation, have some advantages in achieving more-realistic distributions. However, errors from the use of nonphysical proxies are not necessarily significant, depending on the particular problem studied (Wolff 2010).
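As a rough illustration of the additive vs. multiplicative point, the sketch below pushes the same uncertain inputs through a simple additive proxy and through the standard multiplicative OOIP formula and compares the skewness of the outcomes. The input ranges and the additive coefficients are assumptions chosen for illustration only; none of the numbers come from the paper.

```python
# Illustrative sketch (assumed ranges): additive proxies tend toward symmetric,
# near-normal outcomes, while multiplicative physical proxies such as
# OOIP = 7758 * A * h * phi * (1 - Sw) / Bo come out noticeably right-skewed.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Uncertain inputs sampled uniformly over assumed low/high ranges.
area = rng.uniform(2000.0, 6000.0, n)   # acres (assumed)
thick = rng.uniform(20.0, 80.0, n)      # net pay, ft (assumed)
phi = rng.uniform(0.15, 0.30, n)        # porosity (assumed)
sw = rng.uniform(0.20, 0.40, n)         # water saturation (assumed)
bo = rng.uniform(1.1, 1.4, n)           # formation volume factor, RB/STB (assumed)

# Physical (multiplicative) proxy: OOIP in STB.
ooip = 7758.0 * area * thick * phi * (1.0 - sw) / bo

# Additive proxy of the same inputs (coefficients arbitrary, for illustration).
additive = 3.0 * area + 100.0 * thick + 2.0e4 * phi - 1.0e4 * sw - 5.0e3 * bo

def skewness(x):
    z = (x - x.mean()) / x.std()
    return float((z ** 3).mean())

print("skewness, multiplicative OOIP proxy:", round(skewness(ooip), 2))
print("skewness, additive proxy:           ", round(skewness(additive), 2))
```

Histograms of the two outputs make the contrast plain: the product of several uncertain factors is right-skewed, while the sum of the same factors is close to symmetric.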
A question raised about computing polynomial proxies for relatively simple designs such as FPB is that there often appear to be too few equations to solve for all the coefficients of the polynomial. The explanation is that not all parameters are equally significant and that some parameters may be highly correlated or anticorrelated. Both factors reduce the dimensionality of the problem, allowing reasonable solutions to be obtained even with an apparently insufficient number of equations.
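A small numerical sketch of this situation is shown below, using randomly chosen two-level runs as a stand-in for an FPB design and an invented response. With far more polynomial coefficients than runs, a least-squares solver still returns a minimum-norm fit that reproduces every design run; whether that fit is also predictive depends on how few terms really matter, which is exactly what the blind testing discussed next is meant to check.

```python
# Sketch: a second-order proxy with more coefficients than simulation runs.
# numpy's lstsq returns a minimum-norm solution for the underdetermined system.
import numpy as np

rng = np.random.default_rng(7)
n_runs, n_factors = 12, 6
X = rng.choice([-1.0, 1.0], size=(n_runs, n_factors))   # stand-in two-level design

# Model matrix: intercept + linear terms + all pairwise interactions (22 columns).
cols = [np.ones(n_runs)] + [X[:, i] for i in range(n_factors)]
cols += [X[:, i] * X[:, j] for i in range(n_factors) for j in range(i + 1, n_factors)]
A = np.column_stack(cols)

# Synthetic response dominated by two factors and one interaction (illustrative).
y = 50 + 8 * X[:, 0] - 5 * X[:, 2] + 3 * X[:, 0] * X[:, 2]

coef, _, rank, _ = np.linalg.lstsq(A, y, rcond=None)
print("coefficients:", A.shape[1], " runs:", n_runs, " rank:", rank)
print("max misfit at the design points:", float(np.max(np.abs(A @ coef - y))))
```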
Proxy Validation
At a minimum, a set of blind-test runs, which are not used in building the proxy, should be compared with proxy predictions. A simple crossplot of proxy-predicted vs. experimental results for the points used to build the proxy can confirm only that the proxy equation was adequate to match the data used in the analysis; it does not prove that the proxy is also predictive. In general, volumetrics are more reasonably predicted with simpler proxies than are dynamic results such as recoveries, rates, and breakthrough times.
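A minimal sketch of such a blind-test check is shown below, with hypothetical run data and a deliberately simple linear proxy. The point is to compare the quality of fit on the runs used to build the proxy against the quality of prediction on the reserved runs.

```python
# Sketch of blind-test validation: hold out runs not used to build the proxy and
# compare proxy predictions against the reserved "simulation" results.
import numpy as np

def r_squared(actual, predicted):
    ss_res = np.sum((actual - predicted) ** 2)
    ss_tot = np.sum((actual - actual.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

rng = np.random.default_rng(3)
X = rng.uniform(-1, 1, size=(30, 5))      # scaled uncertainty factors (assumed)
y = 100 + 20 * X[:, 0] - 10 * X[:, 1] + 15 * X[:, 0] * X[:, 1] + rng.normal(0, 2, 30)

train, blind = np.arange(20), np.arange(20, 30)   # last 10 runs reserved as blind tests

# Linear proxy fitted on the training runs only (no interaction term, on purpose).
A = np.column_stack([np.ones(len(train)), X[train]])
coef, *_ = np.linalg.lstsq(A, y[train], rcond=None)

pred_train = A @ coef
pred_blind = np.column_stack([np.ones(len(blind)), X[blind]]) @ coef

print("R^2 on training runs:", round(r_squared(y[train], pred_train), 3))
print("R^2 on blind runs:   ", round(r_squared(y[blind], pred_blind), 3))
# A crossplot of predicted vs. actual for the blind runs is the usual visual check.
```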
Moving Toward a Common Process
Some standardization of designs would help these methods become even more accepted and widespread in companies. The reader of this paper likely is a reservoir engineer or earth scientist who, by necessity, dabbles in statistics but prefers not to make each study a research effort on mathematical methods. Another benefit of somewhat standardized processes is that management and technical reviewers can become familiar and comfortable with certain designs and will not require re-education with each project they need to approve. However, because these methodologies are still relatively new, a period of testing and exploring different techniques is still very much under way.

The literature shows the use of a wide range of methodologies. Approaches to exploring uncertainty space range from use of only the simplest PB screening designs for the entire analysis, through multistage experimental designs of increasing accuracy, to bypassing proxy methods altogether in favor of space-filling designs and advanced interpolative methods. A basic methodology extracted from multiple papers listed in Wolff (2010) can be stated as follows:
(1) Define subsurface uncertainties and their ranges.
(2) Perform a screening analysis (e.g., two-level PB or FPB), and analyze the results to identify the most-influential uncertainties.
(3) If necessary, perform a more detailed analysis (e.g., three-level D-optimal or central-composite).
(4) Create a proxy model (response surface) by use of linear or polynomial proxies, and validate it with blind tests.
(5) Perform Monte Carlo simulations to assess uncertainty and define distributions of outcomes.
(6) Build deterministic (scenario) low/mid/high models tied to the distributions.
(7) Use the deterministic models to assess development alternatives.
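A compact sketch of how steps (2), (4), and (5) might be wired together is shown below. The "simulator" is a toy stand-in function, and the factor names, design, and ranges are illustrative assumptions, not a workflow prescribed by the cited papers.

```python
# Sketch of steps (2), (4), (5): two-level design, linear proxy, Monte Carlo through
# the proxy for P90/P50/P10. All factor names, ranges, and the "simulator" are
# placeholders for illustration only.
import numpy as np
from itertools import product

rng = np.random.default_rng(11)
factors = ["ooip_scale", "kv_kh", "aquifer", "rel_perm"]   # hypothetical factors

def simulator(x):
    # Stand-in for a full-physics run: recovery factor vs. scaled (-1/+1) factors.
    return 0.30 + 0.06 * x[0] + 0.03 * x[1] + 0.02 * x[2] + 0.01 * x[3] + 0.01 * x[0] * x[1]

# (2) Two-level full factorial here for simplicity (a PB/FPB design would use fewer runs).
design = np.array(list(product([-1.0, 1.0], repeat=len(factors))))
responses = np.array([simulator(x) for x in design])

# Screening: main effects estimated from the two-level design.
effects = {f: responses[design[:, i] > 0].mean() - responses[design[:, i] < 0].mean()
           for i, f in enumerate(factors)}
print("main effects:", {k: round(v, 3) for k, v in effects.items()})

# (4) Linear proxy (response surface) by least squares.
A = np.column_stack([np.ones(len(design)), design])
coef, *_ = np.linalg.lstsq(A, responses, rcond=None)

# (5) Monte Carlo through the proxy with assumed input distributions.
samples = rng.uniform(-1, 1, size=(100_000, len(factors)))
rf = np.column_stack([np.ones(len(samples)), samples]) @ coef
p90, p50, p10 = np.percentile(rf, [10, 50, 90])   # P90 = low case, P10 = high case
print("P90/P50/P10 recovery factor:", round(p90, 3), round(p50, 3), round(p10, 3))
```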
However, variations and subtleties abound. Most studies split the analysis of the static and dynamic parameters into two stages with at least two separate experimental designs. The first stage seeks to create a number of discrete geological or static models (3, 5, 9, 27, or more are found in the literature) representing a broad range of hydrocarbons in place and connectivity (often determined by rapid analyses such as streamline simulation). The second stage then takes these static models and adds the dynamic parameters in a second experimental design. This method is particularly advantageous if the project team prefers to use higher-level designs such as D-optimal to reduce the number of uncertainties in each stage. However, it cannot account for the full range of individual interactions between all static and dynamic parameters because many of the static parameters are grouped and fixed into discrete models before the final dynamic simulations are run. This limitation becomes less significant when more discrete geological models are built that reflect more major-uncertainty combinations.
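One way to picture the two-stage split is as a cross between a handful of discrete static models and a separate design over the dynamic parameters. The sketch below simply enumerates the runs such a split generates; the model names, factor names, and counts are placeholders, not values from any of the cited studies.

```python
# Sketch of the two-stage split: a few discrete static (geological) models crossed
# with a two-level design over the dynamic parameters. Names and counts are illustrative.
from itertools import product

static_models = ["low_case", "mid_case", "high_case"]         # e.g., 3 discrete models
dynamic_factors = ["kv_kh", "aquifer_strength", "rel_perm"]   # stage-two uncertainties

# Two-level design over the dynamic factors (full factorial here for clarity).
dynamic_design = list(product([-1, 1], repeat=len(dynamic_factors)))

runs = [(s, dict(zip(dynamic_factors, levels)))
        for s in static_models for levels in dynamic_design]

print("stage-two simulation runs:", len(runs))   # 3 static models x 8 = 24 runs
print("example run:", runs[0])
# Static parameters are frozen inside each discrete model, so static-dynamic
# interactions are sampled only through the few models chosen in stage one.
```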
Steps 2 and 3 in the base methodology sometimes coincide with the static/dynamic parameter split. In many cases, however, parameter screening is performed as an additional DoE step after a set of discrete geological models has already been determined. This culling to the most-influential uncertainties again makes running a higher-level design more feasible, especially with full-field, full-physics simulation models. The risk is that some of the parameters screened from the analysis as insignificant in the development scenario that was used may become significant under other scenarios. For example, if the base scenario was a peripheral waterflood, parameters related to aquifer size and strength may drop out. If a no-injector scenario is later examined, the P10/50/90 deterministic models may not include any aquifer variation. Ideally, each scenario would have its own screening DoE performed to retain all relevant influential uncertainties.

An alternative is running a single-stage DoE that includes all static and dynamic parameters. This method can lead to a large number of parameters, but the analysis is made more tractable by use of intermediate-accuracy designs such as FPB. Such compromise designs do require careful blind testing to ensure accuracy, although proxies with mediocre blind-test results often can yield very similar statistics (P10/50/90 values) after Monte Carlo simulation when compared with higher-level designs. As a general observation, the quality of proxy required for quantitative predictive use such as optimization or history matching usually is higher than that required only for generating a distribution through Monte Carlo methods.
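The observation that rough proxies can still deliver similar P10/50/90 values is easiest to see with a small synthetic example (entirely hypothetical response and coefficients): a proxy that omits an interaction term shows visible point-wise blind-test error, yet its Monte Carlo percentiles land close to those of the fuller model.

```python
# Synthetic illustration: a proxy that misses an interaction term shows visible
# blind-test error, yet its Monte Carlo percentiles stay close to the fuller model's.
import numpy as np

rng = np.random.default_rng(5)

def full_response(x):
    # Stand-in "simulator": linear terms plus one interaction the simple proxy omits.
    return 100 + 10 * x[:, 0] + 6 * x[:, 1] + 4 * x[:, 2] + 5 * x[:, 0] * x[:, 1]

def simple_proxy(x):
    # Main-effects-only proxy, as a two-level screening design would estimate it.
    return 100 + 10 * x[:, 0] + 6 * x[:, 1] + 4 * x[:, 2]

# Blind-test error of the simple proxy at random hold-out points.
blind = rng.uniform(-1, 1, size=(50, 3))
rmse = np.sqrt(np.mean((full_response(blind) - simple_proxy(blind)) ** 2))
print("blind-test RMSE of simple proxy:", round(float(rmse), 2))

# Monte Carlo percentiles from each model.
mc = rng.uniform(-1, 1, size=(200_000, 3))
for name, f in [("full response", full_response), ("simple proxy", simple_proxy)]:
    p90, p50, p10 = np.percentile(f(mc), [10, 50, 90])
    print(f"{name}: P90={p90:.1f}  P50={p50:.1f}  P10={p10:.1f}")
```

Percentiles are integrated properties of the whole outcome distribution, so moderate point-wise proxy errors that partially cancel move them much less than they move individual predictions.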
George Box (an eminent statistician) once said: "All models are wrong, but some are useful." In all these studies, there is a continuing series of tradeoffs to be made between the effort applied and its effect on the outcome. Many studies have carried simple screening designs all the way through to detailed forecasts, with well-accepted results that are based on very few simulation runs. These studies tend to examine the input uncertainty distributions in great depth, often carefully considering partial correlations between the uncertainties. Although the quality of the proxies used in these studies may not be adequate for quantitative predictive use, it still may be adequate for generating reasonable statistics.

Other studies use complex designs to obtain highly accurate proxies that can be used quantitatively for optimization and history matching. However, many of these studies have used standardized uncertainty distributions (often discrete) with less consideration of correlations and dependencies. Higher-speed computers and automated tools are making such workflows less time-consuming, so that accurate proxies and a thorough consideration of the basic uncertainties should both be possible. Whichever emphasis is made, the models that are used should be sufficiently complex to capture the reservoir physics that significantly influences the outcome. At the same time, they should be simple enough that time and energy are not wasted on refining something that either has little influence or remains fundamentally uncertain.

In the end, probabilistic forecasts can provide answers with names like P10/50/90 that have specific statistical meaning. However, it is a meaning that must consider the assumptions made about the statistics of the basic uncertainties, most of which lack a rigorous statistical underpinning. The advantage of a rigorous process to combine these uncertainties through DoE, proxies, Monte Carlo methods, scenario modeling, and other techniques is that the process is clean and auditable, not that the probability levels are necessarily quantitatively correct. However, they are only as correct as the selection and description of the basic uncertainties.

Having broken a complex forecast into simple assumptions, we should make it part of a standard process to refine those assumptions as more data become available. Ultimately, like the example from exploration mentioned at the beginning, we hope to calibrate ourselves through detailed look-backs for continuous improvement of our forecast quality.

Acknowledgments
The author would like to thank Kaveh Dehghani, Mark Williams, and John Spokes for their support and stimulating discussions. Thank you also to Hao Cheng for our work together on the subject and for supplying the graphics for Fig. 1.

References
Bentley, M.R. and Woodhead, T.J. 1998. Uncertainty Handling Through Scenario-Based Reservoir Modeling. Paper SPE 39717 presented at the SPE Asia Pacific Conference on Integrated Modeling for Asset Management, Kuala Lumpur, 23–24 March. doi: 10.2118/39717-MS.
Chu, C. 1990. Prediction of Steamflood Performance in Heavy Oil Reservoirs Using Correlations Developed by Factorial Design Method. Paper SPE 20020 presented at the SPE California Regional Meeting, Ventura, California, USA, 4–6 April. doi: 10.2118/20020-MS.
Damsleth, E., Hage, A., and Volden, R. 1992. Maximum Information at Minimum Cost: A North Sea Field Development Study With an Experimental Design. J Pet Technol 44 (12): 1350–1356. SPE-23139-PA. doi: 10.2118/23139-PA.
Demirmen, F. 2007. Reserves Estimation: The Challenge for the Industry. Distinguished Author Series, J Pet Technol 59 (5): 80–89. SPE-103434-PA.
Landa, J.L. and Güyagüler, B. 2003. A Methodology for History Matching and the Assessment of Uncertainties Associated With Flow Prediction. Paper SPE 84465 presented at the SPE Annual Technical Conference and Exhibition, Denver, 5–8 October. doi: 10.2118/84465-MS.
Murtha, J. 1997. Monte Carlo Simulation: Its Status and Future. Distinguished Author Series, J Pet Technol 49 (4): 361–370. SPE-37932-MS. doi: 10.2118/37932-MS.
Otis, R.M. and Schneidermann, N. 1997. A process for evaluating exploration prospects. AAPG Bulletin 81 (7): 1087–1109.
Rose, P.R. 2007. Measuring what we think we have found: Advantages of probabilistic over deterministic methods for estimating oil and gas reserves and resources in exploration and production. AAPG Bulletin 91 (1): 21–29. doi: 10.1306/08030606016.
Sawyer, D.N., Cobb, W.M., Stalkup, F.I., and Braun, P.H. 1974. Factorial Design Analysis of Wet-Combustion Drive. SPE J. 14 (1): 25–34. SPE-4140-PA. doi: 10.2118/4140-PA.
Vogel, L.C. 1956. A Method for Analyzing Multiple Factor Experiments—Its Application to a Study of Gun Perforating Methods. Paper SPE 727-G presented at the Fall Meeting of the Petroleum Branch of AIME, Los Angeles, 14–17 October. doi: 10.2118/727-G.
Williams, G.J.J., Mansfield, M., MacDonald, D.G., and Bush, M.D. 2004. Top-Down Reservoir Modeling. Paper SPE 89974 presented at the SPE Annual Technical Conference and Exhibition, Houston, 26–29 September. doi: 10.2118/89974-MS.
Williams, M.A. 2006. Assessing Dynamic Reservoir Uncertainty: Integrating Experimental Design with Field Development Planning. SPE Distinguished Lecturer Series presentation given for Gulf Coast Section SPE, Houston, 23 March. http://www.spegcs.org/attachments/studygroups/11/SPE%20Mark%20Williams%20Mar_06.ppt.
Wolff, M. 2010. Probabilistic Subsurface Forecasting. Paper SPE 132957 available from SPE, Richardson, Texas.