Tebaldi et al. [2005] present a Bayesian approach to determining probability distribution functions (PDFs) of temperature change at regional scales from the output of a multi-model ensemble run under the same scenario of future anthropogenic emissions. The main characteristic of the method is the formalization of the two criteria of bias and convergence that the REA method [Giorgi and Mearns, 2002] first quantified as a way of assessing model reliability. Thus, the atmosphere-ocean general circulation models (AOGCMs) of the ensemble are combined in a way that accounts for both their performance with respect to current climate and a measure of each model's agreement with the majority of the ensemble. We apply the Bayesian model to a set of transient experiments under two SRES scenarios. We focus on predictions of precipitation change for land regions of subcontinental size. We highlight differences between the PDFs of precipitation change derived in regions where the models agree readily and perform well in simulating present-day precipitation, and those in regions where the models have large biases and/or their future projections disagree. We compare results from the two scenarios, thus assessing the consequences of the two alternative hypotheses, and present summaries based on their averaging.
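To make the bias and convergence criteria concrete, here is a minimal sketch in the spirit of the REA weighting that the Bayesian model formalizes. It is not the paper's MCMC computation, and all inputs (the observation, model output, and the variability scale eps) are hypothetical illustrative numbers.

```python
import numpy as np

# Toy REA-style weighting (Giorgi and Mearns, 2002); all numbers hypothetical.
x_obs = 2.1                                # observed present-day regional mean
x_cur = np.array([2.0, 2.6, 1.5, 2.2])     # models' present-day simulations
dx_fut = np.array([0.4, 0.9, 0.1, 0.5])    # models' projected changes
eps = 0.3                                  # natural-variability scale

bias = np.abs(x_cur - x_obs)               # bias criterion B_i

# Iterate the convergence criterion: distance to the weighted consensus.
w = np.ones_like(dx_fut)
for _ in range(20):
    consensus = np.sum(w * dx_fut) / np.sum(w)
    conv = np.abs(dx_fut - consensus)      # convergence criterion D_i
    # Reliability factor penalizes both large bias and disagreement.
    w = np.minimum(1.0, eps / np.maximum(bias, eps)) * \
        np.minimum(1.0, eps / np.maximum(conv, eps))

print("normalized weights:", np.round(w / w.sum(), 3))
print("weighted change:", round(np.sum(w * dx_fut) / np.sum(w), 3))
```

Each model's weight shrinks when its present-day bias is large or when its projection sits far from the weighted ensemble consensus, which is the qualitative behavior the Bayesian treatment encodes through posterior model precisions.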
Nonlinear regression is a useful statistical tool relating observed data to a nonlinear function of unknown parameters. When the parameter-dependent nonlinear function is computationally intensive, a straightforward regression analysis by maximum likelihood is not feasible. The method presented in this paper constructs a faster-running surrogate for such a computationally intensive nonlinear function and uses it in a related nonlinear statistical model that accounts for the uncertainty associated with the surrogate. A pivotal quantity in the Earth's climate system is the climate sensitivity: the change in global temperature due to a doubling of atmospheric CO2 concentrations. This quantity, along with other climate parameters, is estimated by applying the statistical method developed in this paper, where the computationally intensive nonlinear function is the MIT 2D climate model.
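A toy illustration of the surrogate idea, assuming a one-parameter stand-in for the expensive simulator and a Gaussian-process emulator (scikit-learn's GaussianProcessRegressor); the paper's actual application uses the MIT 2D climate model and a richer statistical model.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Hypothetical expensive simulator: scalar parameter -> scalar output.
def expensive_model(theta):
    return np.sin(3.0 * theta) + 0.5 * theta   # stand-in for a climate model

# 1. Run the simulator at a small design of parameter values.
design = np.linspace(0.0, 2.0, 8)[:, None]
runs = expensive_model(design.ravel())

# 2. Fit a Gaussian-process surrogate to the design runs.
gp = GaussianProcessRegressor(ConstantKernel() * RBF(), normalize_y=True)
gp.fit(design, runs)

# 3. Likelihood for an observation y that propagates surrogate uncertainty:
#    total variance = data noise + emulator variance at theta.
y_obs, sigma_obs = 0.9, 0.1
def neg_log_lik(theta):
    mu, sd = gp.predict(np.array([[theta]]), return_std=True)
    var = sigma_obs**2 + sd[0]**2
    return 0.5 * ((y_obs - mu[0])**2 / var + np.log(2 * np.pi * var))

grid = np.linspace(0.0, 2.0, 201)
theta_hat = grid[np.argmin([neg_log_lik(t) for t in grid])]
print("approximate MLE of theta:", round(float(theta_hat), 3))
```

The key point is in step 3: the emulator's predictive standard deviation enters the likelihood, so parameter values where the surrogate is poorly constrained are not treated as if the simulator had actually been run there.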
Journal of the American Statistical Association, 2003
Historical records of weather, such as monthly precipitation and temperatures from the last century, are an invaluable database for studying changes and variability in climate. These data also provide the starting point for understanding and modeling the relationships among climate, ecological processes, and human activities. However, these data are irregularly observed over space and time. The basic statistical problem is to create a complete data record that is consistent with the observed data and is useful to other scientific disciplines. We modify the Gaussian-Inverted Wishart spatial field model to accommodate irregular data patterns and to facilitate computations. Novel features of our implementation include the use of cross-validation to determine the relative prior weight given to the regression and geostatistical components, and the use of a space-filling subset to reduce the computations for some parameters. We feel the overall approach has merit, treading a line between computational feasibility and statistical validity. Furthermore, we are able to produce reliable measures of uncertainty for the estimates.
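The abstract does not spell out how the space-filling subset is chosen; a common, simple construction is a greedy maximin design, sketched below on hypothetical station coordinates.

```python
import numpy as np

def space_filling_subset(coords, k, seed=0):
    """Greedy maximin selection of k well-spread points from coords (n x 2).

    A simple stand-in for coverage-design algorithms; the paper's exact
    construction is not reproduced here.
    """
    rng = np.random.default_rng(seed)
    n = coords.shape[0]
    chosen = [int(rng.integers(n))]                 # arbitrary starting station
    d = np.linalg.norm(coords - coords[chosen[0]], axis=1)
    for _ in range(k - 1):
        nxt = int(np.argmax(d))                     # farthest from current set
        chosen.append(nxt)
        # Track each station's distance to its nearest selected station.
        d = np.minimum(d, np.linalg.norm(coords - coords[nxt], axis=1))
    return np.array(chosen)

stations = np.random.default_rng(1).uniform(size=(500, 2))  # fake lon/lat
subset = space_filling_subset(stations, 50)
print(subset[:10])
```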
Grace Wahba (née Goldsmith, born August 3, 1934), I. J. Schoenberg-Hilldale Professor of Statistics at the University of Wisconsin-Madison (Emerita), is a pioneer in methods for smoothing noisy data. Her research combines theoretical analysis, computation, and methodology motivated by innovative scientific applications. Best known for the development of generalized cross-validation (GCV), the connection between splines and Bayesian posterior estimates, and "Wahba's problem," she has developed methods with applications in demographic studies, machine learning, DNA microarrays, risk modeling, medical imaging, and climate prediction. Grace grew up in the Washington, DC area and New Jersey, and graduated from Montclair High School. She was educated at Cornell (B.A. 1956), the University of Maryland, College Park (M.A. 1962), and Stanford (Ph.D. 1966), and worked in industry for several years before receiving her doctorate in 1966 and settling in Madison in 1967. Although she has held several visiting appointments, she has made Madison her home for over 50 years. She is the author of Spline Models for Observational Data, which has garnered more than 8,000 citations. Grace is treasured as an academic advisor and has mentored 39 Ph.D. students, resulting in more than 330 academic descendants.
In many practical applications, spatial data are collected at areal levels (i.e., block data), and inferences and predictions about the variable at points or blocks different from those at which it has been observed typically depend on integrals of the underlying continuous spatial process. In this paper we describe a method, based on the Fourier transform, by which multiple integrals of covariance functions over irregular data regions may be numerically approximated with the same level of accuracy as traditional methods, but at a greatly reduced computational expense.
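As a rough illustration of why the Fourier transform helps: for a stationary covariance, the double integral over two blocks discretizes to a convolution-type double sum, which FFT-based convolution evaluates in O(n^2 log n) grid operations instead of O(n^4). The kernel, the blocks, and the discretization below are hypothetical, not the paper's exact scheme.

```python
import numpy as np
from scipy.signal import fftconvolve

n, h = 128, 1.0 / 128                       # grid size and spacing on [0,1]^2
x = (np.arange(n) + 0.5) * h
X, Y = np.meshgrid(x, x, indexing="ij")

A = ((X - 0.3)**2 + (Y - 0.3)**2 < 0.04).astype(float)   # block A: a disc
B = ((X > 0.6) & (Y > 0.5)).astype(float)                 # block B: a rectangle

# Stationary covariance evaluated on the grid of lags in (-1, 1)^2.
lag = np.arange(-n + 1, n) * h
LX, LY = np.meshgrid(lag, lag, indexing="ij")
C = np.exp(-np.hypot(LX, LY) / 0.2)                       # exponential kernel

# sum_{i in A} sum_{j in B} C(x_i - x_j) as one FFT-based convolution:
G = fftconvolve(C, B, mode="valid")          # G[i] = sum_j C(x_i - x_j) B[j]
avg_cov = h**4 * np.sum(A * G) / (h**2 * A.sum() * h**2 * B.sum())
print("average covariance between blocks:", round(float(avg_cov), 4))
```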
Journal of the Royal Statistical Society: Series B (Methodological)
The North American Regional Climate Change Assessment Program (NARCCAP) is constructing projections of regional climate change over the coterminous United States and Canada in order to provide climate change information at decision-relevant scales. A major goal of NARCCAP is to estimate uncertainties in regional-scale projections of future climate by using multiple regional climate models (RCMs) nested within multiple atmosphere-ocean general circulation models (AOGCMs). NARCCAP is using six nested regional climate models at 50 km resolution to dynamically downscale realizations of current climate (1971-2000) and future climate (2041-2070, following the A2 SRES emissions scenario) from four AOGCMs. Global time-slice simulations, also at 50 km resolution, will be performed using the GFDL AM2.1 and NCAR CAM3 atmospheric models forced by the AOGCM sea surface temperatures and will be compared with the results of the regional models. Results from this multiple-RCM, multiple-AOGCM suite will be statistically analyzed to investigate the cascade of uncertainty as one type of model draws information from another. All output will be made available to the climate analysis and climate impacts assessment communities through an archiving and data distribution plan. The climate impacts community will have these data at unprecedented spatial and temporal (hourly to six-hourly) resolution to support decision-relevant evaluations for public policy. As part of our evaluation of uncertainties, simulations are presently being concluded that nest the participating RCMs within reanalyses of observations. These simulations can be viewed as nesting the RCMs within a GCM that is nearly perfect (constrained by available observations), allowing us to separate errors attributable to the RCMs from those attributable to the driving AOGCMs. Results to date indicate that skill is greater in winter than in summer, and greater for temperature than for precipitation. Temperature and precipitation errors are uncorrelated from model to model; consequently, the multi-model ensemble has more robust skill than any single model.
Chapter 15, "Local Lyapunov exponents: Predictability depends on where you are," by Barbara A. Bailey. The dominant Lyapunov exponent of a dynamical system measures the average rate at which nearby trajectories of the system diverge.
Extreme precipitation can cause flooding, result in substantial damage, and have detrimental effects on ecosystems. Climate adaptation must therefore account for the greatest precipitation amounts that may be expected over a certain time span. The recurrence of extreme-to-heavy precipitation is notoriously hard to predict, yet cost-benefit estimates of mitigation and successful climate adaptation need reliable information about percentiles of daily precipitation. Here we present a new and simple formula that relates wet-day mean precipitation to heavy precipitation, providing a method for predicting and downscaling daily precipitation statistics. We examined 32,857 daily rain-gauge records from around the world, and the evaluation of the method demonstrated that wet-day precipitation percentiles can be predicted with high accuracy. Evaluations against independent data demonstrated high skill in both space and time, indicating a highly robust methodology.
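The abstract does not reproduce the formula itself. One simple model consistent with its premise treats wet-day amounts as approximately exponential with wet-day mean mu, which yields the closed-form percentile sketched below; treat the exponential assumption as our illustration, not necessarily the paper's exact relation.

```python
import numpy as np

def wet_day_percentile(mu, p):
    """p-th percentile of an exponential with mean mu: q_p = -mu * ln(1 - p).

    The exponential assumption is illustrative; the paper's exact formula
    is not reproduced in the abstract.
    """
    return -mu * np.log(1.0 - p)

mu = 6.0                                   # hypothetical wet-day mean, mm/day
for p in (0.5, 0.9, 0.95, 0.99):
    print(f"q_{p:.2f} = {wet_day_percentile(mu, p):5.1f} mm")
```

The appeal of such a relation is that the wet-day mean is far easier to estimate, predict, and downscale than the heavy tail itself.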
Journal of Computational and Graphical Statistics, 2015
A multi-resolution model is developed to predict two-dimensional spatial fields from irregularly spaced observations. The radial basis functions at each level of resolution are constructed from a Wendland compactly supported correlation function with the nodes arranged on a rectangular grid. The grid at each finer level is refined by a factor of two, and the basis functions are scaled to have a constant overlap. The coefficients associated with the basis functions at each level of resolution are distributed according to a Gaussian Markov random field (GMRF), which takes advantage of the fact that the basis is organized as a lattice. Several numerical examples and analytical results establish that this scheme gives a good approximation to standard covariance functions, such as the Matérn, and also has the flexibility to fit more complicated shapes. The other important feature of this model is that it can be applied to statistical inference for large spatial datasets because the key matrices in the computations are sparse. The computational efficiency applies both to the evaluation of the likelihood and to spatial predictions.
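A minimal sketch of the basis construction described above: Wendland functions centered on nested grids, with the support scaled so the overlap between neighboring functions stays constant across levels. The specific Wendland member, grid sizes, and overlap factor are illustrative assumptions, and the GMRF prior on the coefficients is omitted.

```python
import numpy as np

def wendland(d):
    # A Wendland-type compactly supported function (C^2 in dimension 2);
    # the paper's exact Wendland member is not specified in the abstract.
    return np.where(d < 1.0, (1 - d)**6 * (35 * d**2 + 18 * d + 3) / 3.0, 0.0)

def basis_matrix(s, level, overlap=2.5):
    """Evaluate the level-th resolution's basis functions at locations s.

    Nodes sit on a grid over [0,1]^2 that doubles in density at each level,
    and each function's support radius is a fixed multiple of the grid
    spacing, giving constant overlap between neighbors.
    """
    m = 4 * 2**level                              # nodes per side at this level
    g = np.linspace(0.0, 1.0, m)
    nodes = np.array([(u, v) for u in g for v in g])
    delta = overlap * (g[1] - g[0])               # constant-overlap radius
    d = np.linalg.norm(s[:, None, :] - nodes[None, :, :], axis=2) / delta
    return wendland(d)                            # n_locations x n_basis

s = np.random.default_rng(0).uniform(size=(10, 2))
Phi = [basis_matrix(s, lev) for lev in range(3)]  # three resolutions
print([P.shape for P in Phi])
```

Because each function vanishes outside its support radius, the basis matrices (and the GMRF precision matrices the paper places on the coefficients) are sparse, which is what makes likelihood evaluation and prediction feasible for large datasets.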
Contents. Chapter 1: Spatial Process Estimates as Smoothers. 1.1 Introduction. Spatial statistics has a diverse range of applications and refers to the class of models and methods for data collected over a region. This region might be a mineral field for a geologist, a quadrat in a forest for an ecologist, or a geographic region for an environmental scientist. A typical problem is to predict values of a ...
IEEE Transactions on Components, Packaging, and Manufacturing Technology: Part C, 1996
Within-wafer uniformity is traditionally measured by the signal-to-noise ratio (SNR), defined as the estimated standard deviation of within-wafer measurements divided by the mean of those measurements. Unfortunately, in the presence of deterministic variations of the response over the wafer (such as the bull's-eye effect of some processes), the SNR is sensitive to both the location and the number of the measurements taken. A robust metric for describing within-wafer uniformity is developed and compared with the SNR method. The new metric, termed the integration statistic (I), is shown to be robust to both the location and the number of measurements taken on the wafer, and it has lower variance than the SNR metric. The implications of this robust behavior are that fewer measurements can be taken to achieve a given accuracy in the uniformity estimate and that uniformity estimates are consistent with respect to variations in the orientation of the uniformity pattern relative to the measurement pattern.
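For reference, the SNR metric as defined above is a one-liner; the measurements below are hypothetical, and the integration statistic I is not reproduced here since the abstract does not define it.

```python
import numpy as np

# SNR uniformity metric: sample standard deviation of within-wafer
# measurements divided by their mean.  Hypothetical film-thickness data.
thickness = np.array([101.2, 99.8, 100.5, 98.9, 102.1, 100.0, 99.5, 101.7])
snr = thickness.std(ddof=1) / thickness.mean()
print(f"within-wafer nonuniformity (SNR): {snr:.4f}")
```

A deterministic radial pattern such as the bull's-eye effect makes this ratio depend on exactly where on the wafer the eight sites are placed, which is the sensitivity the integration statistic is designed to remove.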
We have derived a mathematical equation (Equation 1), where N_f is the estimated number of foci per cubic centimeter with radii larger than e, and e is the lower limit for the radii of foci or profiles that can be reliably observed in tissue sections of area A. The equation provides an unbiased estimate of the number of microscopic hepatic foci from their profiles in tissue sections. The significant feature of the formula is its recognition of the experimenter's inability to reliably identify profiles below a ...
Papers by Douglas Nychka