Publication


Featured research published by Dave Higdon.


Computational Science and Engineering | 2005

Combining Field Data and Computer Simulations for Calibration and Prediction

Dave Higdon; Marc C. Kennedy; James C. Cavendish; John A. Cafeo; Robert D. Ryne

We develop a statistical approach for characterizing uncertainty in predictions that are made with the aid of a computer simulation model. Typically, the computer simulation code models a physical system and requires a set of inputs: some known and specified, others unknown. A limited amount of field data from the true physical system is available to inform us about the unknown inputs and about the uncertainty associated with a simulation-based prediction. The approach given here allows for the following: uncertainty regarding model inputs (i.e., calibration); accounting for uncertainty due to limitations on the number of simulations that can be carried out; discrepancy between the simulation code and the actual physical system; and uncertainty in the observation process that yields the actual field data on the true physical system. The resulting analysis yields predictions and their associated uncertainties while accounting for multiple sources of uncertainty. We use a Bayesian formulation and rely on Gaussian process models to model unknown functions of the model inputs. The estimation is carried out using a Markov chain Monte Carlo method. This methodology is applied to two examples: a charged particle accelerator and a spot welding process.
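The structure described here is often written as field data = simulator output + model discrepancy + observation error. The toy sketch below (a hypothetical one-parameter simulator `eta` and synthetic data, not the paper's accelerator or welding application) shows that structure with the Gaussian-process discrepancy marginalized into the likelihood and a random-walk Metropolis sampler over the calibration parameter:

```python
# Minimal sketch of Bayesian calibration with a model-discrepancy term.
# The simulator `eta`, the priors, and all numbers are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def eta(x, theta):
    """Toy 'simulator': fast to evaluate here, expensive in practice."""
    return np.sin(theta * x)

# Synthetic field data: truth uses theta = 1.3 plus a smooth discrepancy.
x = np.linspace(0, 3, 25)
y = eta(x, 1.3) + 0.1 * x + rng.normal(0, 0.05, x.size)

def sq_exp_cov(x, scale=1.0, amp=0.1):
    """Squared-exponential covariance for the GP discrepancy delta(x)."""
    d = x[:, None] - x[None, :]
    return amp**2 * np.exp(-0.5 * (d / scale) ** 2)

sigma = 0.05                            # assumed observation-noise sd
cov = sq_exp_cov(x) + sigma**2 * np.eye(x.size)
cov_inv = np.linalg.inv(cov)
_, logdet = np.linalg.slogdet(cov)

def log_post(theta):
    """log p(theta | y) up to a constant: the GP discrepancy is
    marginalized out; flat prior on theta over (0, 3)."""
    if not 0.0 < theta < 3.0:
        return -np.inf
    r = y - eta(x, theta)               # residual = discrepancy + noise
    return -0.5 * (r @ cov_inv @ r + logdet)

# Random-walk Metropolis over the calibration parameter theta.
theta, samples = 1.0, []
lp = log_post(theta)
for _ in range(5000):
    prop = theta + rng.normal(0, 0.05)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta)

print("posterior mean of theta:", np.mean(samples[1000:]))
```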


IEEE Transactions on Signal Processing | 2002

A Bayesian approach to characterizing uncertainty in inverse problems using coarse and fine-scale information

Dave Higdon; Herbert K. H. Lee; Zhuoxin Bi

The Bayesian approach allows one to easily quantify uncertainty, at least in theory. In practice, however, the Markov chain Monte Carlo (MCMC) method can be computationally expensive, particularly in complicated inverse problems. We present a methodology for improving the speed and efficiency of an MCMC analysis by combining runs on different scales. By using a coarser scale, the chain can run faster (particularly when there is an external forward simulator involved in the likelihood evaluation) and better explore the posterior, being less likely to become stuck in local maxima. We discuss methods for linking the coarse chain back to the original fine-scale chain of interest. The resulting coupled chain can thus be run more efficiently without sacrificing the accuracy achieved at the finer scale.
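One simple way to realize the coarse-to-fine linking is a swap move between two Metropolis chains, accepted with the usual exchange ratio so that each chain preserves its own target. A toy sketch under that assumption (cheap analytic densities stand in for the expensive forward-simulator likelihoods in the paper):

```python
# Toy sketch of coupling a cheap "coarse" chain to the "fine" chain of
# interest via swap proposals. The densities below are illustrative.
import numpy as np

rng = np.random.default_rng(1)

def log_fine(x):
    """Fine-scale target: bimodal, so a lone chain can get stuck."""
    return np.logaddexp(-0.5 * ((x + 2) / 0.3) ** 2,
                        -0.5 * ((x - 2) / 0.3) ** 2)

def log_coarse(x):
    """Coarse approximation: flatter, cheap, mixes easily."""
    return -0.5 * (x / 2.5) ** 2

def mh_step(x, logp, step, rng):
    """One random-walk Metropolis update."""
    prop = x + rng.normal(0, step)
    if np.log(rng.uniform()) < logp(prop) - logp(x):
        return prop
    return x

xf, xc, fine_samples = -2.0, 0.0, []
for i in range(20000):
    xf = mh_step(xf, log_fine, 0.3, rng)    # fine chain: small steps
    xc = mh_step(xc, log_coarse, 2.0, rng)  # coarse chain: big steps
    if i % 10 == 0:
        # Swap proposal; this acceptance ratio preserves both targets.
        log_a = (log_fine(xc) + log_coarse(xf)
                 - log_fine(xf) - log_coarse(xc))
        if np.log(rng.uniform()) < log_a:
            xf, xc = xc, xf
    fine_samples.append(xf)

# Both modes of the fine target should now be visited.
print("fraction of fine samples in right mode:",
      np.mean(np.array(fine_samples) > 0))
```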


Statistical Modelling | 2005

Efficient models for correlated data via convolutions of intrinsic processes

Herbert K. H. Lee; Dave Higdon; Catherine A. Calder; Christopher H. Holloman

Gaussian processes (GPs) have proven to be useful and versatile stochastic models in a wide variety of applications, including computer experiments, environmental monitoring, hydrology, and climate modeling. A GP model is determined by its mean and covariance functions. In most cases, the mean is specified to be a constant or some other simple linear function, whereas the covariance function is governed by a few parameters. A Bayesian formulation is attractive as it allows for formal incorporation of uncertainty regarding the parameters governing the GP. However, estimation of these parameters can be problematic. Large datasets, posterior correlation, and inverse problems can all lead to difficulties in exploring the posterior distribution. Here, we propose an alternative model which is quite tractable computationally, even with large datasets or indirectly observed data, while still maintaining the flexibility and adaptiveness of traditional GP models. This model is based on convolving simple Markov random fields with a smoothing kernel. We consider applications in hydrology and aircraft prototype testing.
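A minimal one-dimensional sketch of the convolution construction (the knot spacing, kernel width, and random-walk latent field are illustrative choices, not the paper's specification):

```python
# Build a smooth process by convolving a simple latent Markov random
# field over a coarse grid of knots with a Gaussian smoothing kernel.
import numpy as np

rng = np.random.default_rng(2)

knots = np.linspace(0, 10, 15)    # coarse grid of latent sites u_j
s = np.linspace(0, 10, 200)       # locations where the process is wanted

# Latent intrinsic MRF: a first-order random walk over the knots.
z = np.cumsum(rng.normal(0, 1, knots.size))

def kernel(d, width=1.0):
    """Gaussian smoothing kernel k(d)."""
    return np.exp(-0.5 * (d / width) ** 2)

# Process value: y(s) = sum_j k(s - u_j) * z_j, a discrete convolution.
K = kernel(s[:, None] - knots[None, :])   # 200 x 15 kernel matrix
y = K @ z

# Inference then targets the 15 z_j values instead of a 200 x 200 GP
# covariance, which is what keeps large or indirectly observed datasets
# computationally tractable.
print(y.shape)  # (200,)
```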


Physical Review Letters | 2015

Uncertainty quantification for nuclear density functional theory and information content of new measurements

J. McDonnell; Nicolas Schunck; Dave Higdon; Jason Sarich; Stefan M. Wild; W. Nazarewicz

Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. The example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.
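The propagation step amounts to pushing posterior parameter draws through the emulator. The sketch below illustrates this with a one-parameter toy surrogate standing in for the Gaussian process emulator of the Skyrme functional (the parameter, the surrogate, and all numbers are invented for illustration):

```python
# Schematic of uncertainty propagation: sample parameters from the
# posterior, evaluate a cheap emulator, and report a predictive band.
import numpy as np

rng = np.random.default_rng(3)

# Stand-in for draws from the posterior computed in the analysis.
theta_posterior = rng.normal(loc=0.16, scale=0.01, size=10000)

def emulator(theta):
    """Cheap surrogate mapping a parameter to a predicted observable."""
    return 8.0 - 50.0 * (theta - 0.16) ** 2 + 5.0 * (theta - 0.16)

pred = emulator(theta_posterior)
lo, hi = np.percentile(pred, [5, 95])
print(f"predicted observable: {pred.mean():.3f} "
      f"(90% band: {lo:.3f}..{hi:.3f})")
```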


Journal of Computational and Graphical Statistics | 2006

Multiresolution Genetic Algorithms and Markov chain Monte Carlo

Christopher H. Holloman; Herbert K. H. Lee; Dave Higdon

This article proposes a multiresolution genetic algorithm that allows efficient estimation of parameters in large-dimensional models. Such models typically rely on complex numerical methods that require large amounts of computing power for estimating parameters. Unfortunately, the numerical maximization and sampling techniques used to fit such complex models often explore the parameter space slowly, resulting in unreliable estimates. Our algorithm improves this exploration by incorporating elements of simulated tempering into a genetic algorithm framework for maximization. It can also be adapted to perform Markov chain Monte Carlo sampling from a posterior distribution in a Bayesian setting, which can greatly improve mixing and exploration of the posterior compared to ordinary MCMC methods. The proposed algorithm can be used to estimate parameters in any model that can be solved on different scales, even if the data are not inherently multiscale. We address parallel implementation of the algorithms and demonstrate their use on examples from single photon emission computed tomography and groundwater hydrology.
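As a rough illustration of the multiresolution idea (not the authors' algorithm), the sketch below runs a simple mutation-and-selection GA on a coarse and a fine discretization and periodically promotes the best coarse solution to the fine population; the tempering-style exchange move is simplified here to a deterministic injection:

```python
# Toy multiresolution GA: a cheap coarse population feeds good
# candidates to an expensive fine population. Objective and scales
# are illustrative.
import numpy as np

rng = np.random.default_rng(4)

def objective(x):
    """Fitness to maximize; multimodal, so plain search stalls."""
    return np.sum(np.sin(5 * x)) - 0.1 * np.sum(x ** 2)

def refine(x_coarse):
    """Map a coarse solution (4 values) to the fine scale (16 values)."""
    return np.repeat(x_coarse, 4)

def ga_generation(pop, step, rng):
    """One mutation + truncation-selection step of a simple GA."""
    children = pop + rng.normal(0, step, pop.shape)
    both = np.vstack([pop, children])
    fit = np.array([objective(x) for x in both])
    return both[np.argsort(fit)[-len(pop):]]   # keep the fittest half

coarse = rng.normal(0, 1, (20, 4))    # cheap-to-evaluate population
fine = rng.normal(0, 1, (20, 16))     # expensive-to-evaluate population
for gen in range(200):
    coarse = ga_generation(coarse, 0.3, rng)
    fine = ga_generation(fine, 0.1, rng)
    if gen % 20 == 0:
        # Exchange move: inject the best coarse solution into the fine
        # population, replacing its worst member.
        worst = np.argmin([objective(x) for x in fine])
        fine[worst] = refine(coarse[-1])      # coarse[-1] is the fittest

print("best fine fitness:", max(objective(x) for x in fine))
```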


Journal of Physics G | 2015

Error analysis in nuclear density functional theory

Nicolas Schunck; J. McDonnell; Jason Sarich; Stefan M. Wild; Dave Higdon

Nuclear density functional theory (DFT) is the only microscopic, global approach to the structure of atomic nuclei. It is used in numerous applications, from determining the limits of stability to gaining a deep understanding of the formation of elements in the universe or the mechanisms that power stars and reactors. The predictive power of the theory depends on the amount of physics embedded in the energy density functional as well as on efficient ways to determine a small number of free parameters and solve the DFT equations. In this article, we discuss the various sources of uncertainties and errors encountered in DFT and possible methods to quantify these uncertainties in a rigorous manner.


Journal of Physics G | 2015

A Bayesian approach for parameter estimation and prediction using a computationally intensive model

Dave Higdon; J. McDonnell; Nicolas Schunck; Jason Sarich; Stefan M. Wild

Bayesian methods have been very successful in quantifying uncertainty in physics-based problems in parameter estimation and prediction. In these cases, physical measurements y are modeled as the best fit of a physics-based model η(θ), where θ denotes the uncertain model inputs.
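Written out, the setup this abstract begins to describe takes the generic Bayesian form below (the Gaussian error term and the prior are the standard ingredients of such a formulation, not necessarily the paper's exact specification):

```latex
y = \eta(\theta) + \epsilon, \qquad \epsilon \sim N(0, \Sigma),
\qquad\Longrightarrow\qquad
\pi(\theta \mid y) \propto L\big(y \mid \eta(\theta)\big)\,\pi(\theta)
```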


European Physical Journal A | 2015

Uncertainty quantification and propagation in nuclear density functional theory

Nicolas Schunck; J. McDonnell; Dave Higdon; Jason Sarich; Stefan M. Wild



Bayesian Analysis | 2010

Predicting Vertical Connectivity Within an Aquifer System

Margaret B. Short; Dave Higdon; Laura Guadagnini; Alberto Guadagnini; Daniel M. Tartakovsky



Quality Engineering | 2015

Illustrating How Science Can Be Incorporated into a Nonlinear Regression Model

Michael S. Hamada; Dave Higdon; Jeff I. Abes; Charles R. Hills; A. M. Peters


Collaboration


Dive into Dave Higdon's collaborations.

Top Co-Authors

J. McDonnell (Lawrence Livermore National Laboratory)
Jason Sarich (Argonne National Laboratory)
Nicolas Schunck (Lawrence Livermore National Laboratory)
Stefan M. Wild (Argonne National Laboratory)
Brian J. Williams (Los Alamos National Laboratory)
Salman Habib (Los Alamos National Laboratory)
Cetin Unal (Los Alamos National Laboratory)
James R. Gattiker (Los Alamos National Laboratory)