Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Brian J. Williams is active.

Publication


Featured research published by Brian J. Williams.


Reliability Engineering & System Safety | 2006

Sensitivity analysis when model outputs are functions

Katherine Campbell; Michael D. McKay; Brian J. Williams

When outputs of computational models are time series or functions of other continuous variables like distance, angle, etc., it can be that primary interest is in the general pattern or structure of the curve. In these cases, model sensitivity and uncertainty analysis focuses on the effect of model input choices and uncertainties in the overall shapes of such curves. We explore methods for characterizing a set of functions generated by a series of model runs for the purpose of exploring relationships between these functions and the model inputs.
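
A common way to make this concrete is to expand each output curve on a few principal components and then study how the component scores vary with the inputs. The sketch below illustrates that idea on a toy model; it is not the authors' code, and the toy model and all names are invented for illustration.

```python
# Sketch: characterize an ensemble of output curves with principal components,
# then relate the component scores to the model inputs (illustration only).
import numpy as np

rng = np.random.default_rng(0)

# Toy "model runs": each input x = (x1, x2) produces a curve over time t.
n_runs, n_t = 200, 100
t = np.linspace(0.0, 1.0, n_t)
X = rng.uniform(0.0, 1.0, size=(n_runs, 2))                   # sampled model inputs
curves = (X[:, [0]] * np.sin(2 * np.pi * t)                   # x1 controls amplitude
          + X[:, [1]] * t                                     # x2 controls drift
          + 0.01 * rng.standard_normal((n_runs, n_t)))

# Principal-component decomposition of the centered ensemble of curves.
mean_curve = curves.mean(axis=0)
U, s, Vt = np.linalg.svd(curves - mean_curve, full_matrices=False)
scores = U * s                                                 # run-by-component scores

# Relate the leading component scores to each input with a simple correlation,
# a crude stand-in for a full sensitivity analysis of the functional output.
for j in range(2):
    corr = [np.corrcoef(X[:, k], scores[:, j])[0, 1] for k in range(2)]
    print(f"PC{j + 1}: correlation with (x1, x2) =", np.round(corr, 2))
```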


Physical Review D | 2007

Cosmic calibration: Constraints from the matter power spectrum and the cosmic microwave background

Salman Habib; Katrin Heitmann; David Higdon; Charles Nakhleh; Brian J. Williams

Several cosmological measurements have attained significant levels of maturity and accuracy over the past decade. Continuing this trend, future observations promise measurements of the cosmic mass distribution at an accuracy level of 1% out to spatial scales with k ~ 10 h Mpc⁻¹ and even smaller, entering highly nonlinear regimes of gravitational instability. In order to interpret these observations and extract useful cosmological information from them, such as the equation of state of dark energy, very costly high precision, multiphysics simulations must be performed. We have recently implemented a new statistical framework with the aim of obtaining accurate parameter constraints from combining observations with a limited number of simulations. The key idea is the replacement of the full simulator by a fast emulator with controlled error bounds. In this paper, we provide a detailed description of the methodology and extend the framework to include joint analysis of cosmic microwave background and large-scale structure measurements. Our framework is especially well suited for upcoming large-scale structure probes of dark energy such as baryon acoustic oscillations and, especially, weak lensing, where percent level accuracy on nonlinear scales is needed.
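
The emulator idea can be sketched with a generic Gaussian process surrogate: fit it to a small set of expensive runs and use its posterior mean and standard deviation as a fast stand-in with an error estimate. The toy simulator and all names below are assumptions for illustration, not the cosmic calibration code.

```python
# Sketch: replace an expensive simulator with a fast Gaussian process emulator
# that carries its own uncertainty estimate (illustration only).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def expensive_simulator(theta):
    """Stand-in for a costly multiphysics run; returns a scalar summary."""
    return np.sin(3 * theta[0]) + 0.5 * theta[1] ** 2

rng = np.random.default_rng(1)
design = rng.uniform(0.0, 1.0, size=(30, 2))          # limited number of simulator runs
runs = np.array([expensive_simulator(th) for th in design])

emulator = GaussianProcessRegressor(
    kernel=ConstantKernel(1.0) * RBF(length_scale=[0.2, 0.2]),
    normalize_y=True,
)
emulator.fit(design, runs)

# The emulator predicts new parameter settings almost instantly, with error bars
# that can be checked against a few held-out simulator runs.
test = rng.uniform(0.0, 1.0, size=(5, 2))
mean, std = emulator.predict(test, return_std=True)
truth = np.array([expensive_simulator(th) for th in test])
print("emulator error vs. predicted std:", np.round(np.abs(mean - truth) / std, 2))
```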


Reliability Engineering & System Safety | 2011

Batch sequential design to achieve predictive maturity with calibrated computer models

Brian J. Williams; Jason L. Loeppky; Leslie M. Moore; Mason S. Macklem

Sequential experiment design strategies have been proposed for efficiently augmenting initial designs to solve many problems of interest to computer experimenters, including optimization, contour and threshold estimation, and global prediction. We focus on batch sequential design strategies for achieving maturity in global prediction of discrepancy inferred from computer model calibration. Predictive maturity focuses on adding field experiments to efficiently improve discrepancy inference. Several design criteria are extended to allow batch augmentation, including integrated and maximum mean square error, maximum entropy, and two expected improvement criteria. In addition, batch versions of maximin distance and weighted distance criteria are developed. Two batch optimization algorithms are considered: modified Fedorov exchange and a binning methodology motivated by optimizing augmented fractional factorial skeleton designs.
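
One of the distance-based criteria mentioned above, maximin distance, is straightforward to illustrate: grow the design by a batch of points that stay as far as possible from the existing runs and from each other. The greedy routine below is a simplified stand-in for the exchange and binning algorithms described in the abstract; the function and variable names are hypothetical.

```python
# Sketch: augment an existing design with a batch of points chosen greedily
# under a maximin-distance criterion (simplified illustration).
import numpy as np

def greedy_maximin_batch(existing, candidates, batch_size):
    """Pick batch_size candidates, each maximizing its minimum distance
    to the existing design plus the points already chosen for the batch."""
    design = existing.copy()
    chosen = []
    for _ in range(batch_size):
        # distance from every candidate to its nearest design point
        d = np.min(np.linalg.norm(candidates[:, None, :] - design[None, :, :], axis=2), axis=1)
        best = int(np.argmax(d))
        chosen.append(candidates[best])
        design = np.vstack([design, candidates[best]])
        candidates = np.delete(candidates, best, axis=0)
    return np.array(chosen)

rng = np.random.default_rng(2)
initial_design = rng.uniform(size=(10, 2))     # existing experiments
candidate_pool = rng.uniform(size=(500, 2))    # possible new input sites
new_batch = greedy_maximin_batch(initial_design, candidate_pool, batch_size=4)
print(new_batch.round(3))
```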


Mechanics of Advanced Materials and Structures | 2015

A Resource Allocation Framework for Experiment-Based Validation of Numerical Models

Sez Atamturktur; Joshua Hegenderfer; Brian J. Williams; Matthew C. Egeberg; R. A. Lebensohn; Cetin Unal

In experiment-based validation, uncertainties and systematic biases in model predictions are reduced by either increasing the amount of experimental evidence available for model calibration—thereby mitigating prediction uncertainty—or increasing the rigor in the definition of physics and/or engineering principles—thereby mitigating prediction bias. Hence, decision makers must regularly choose between either allocating resources for experimentation or further code development. The authors propose a decision-making framework to assist in resource allocation strictly from the perspective of predictive maturity and demonstrate the application of this framework on a nontrivial problem of predicting the plastic deformation of polycrystals.


Archive | 2010

Sequential Design of Computer Experiments for Constrained Optimization

Brian J. Williams; Thomas J. Santner; William I. Notz; Jeffrey S. Lehman

This paper proposes a sequential method of designing computer or physical experiments when the goal is to optimize one integrated signal function subject to constraints on the integral of a second response function. Such problems occur, for example, in industrial problems where the computed responses depend on two types of inputs: manufacturing variables and noise variables. In industrial settings, manufacturing variables are determined by the product designer; noise variables represent field conditions which are modeled by specifying a probability distribution for these variables. The update scheme of the proposed method selects the control portion of the next input site to maximize a posterior expected “improvement” and the environmental portion of this next input is selected to minimize the mean square prediction error of the objective function at the new control site. The method allows for dependence between the objective and constraint functions. The efficacy of the algorithm relative to the single-stage design and relative to a design assuming independent responses is illustrated. Implementation issues for the deterministic and measurement error cases are discussed as are some generalizations of the method.
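
The expected improvement step of such an update scheme is often written in closed form for a Gaussian posterior and, for constrained problems, weighted by the posterior probability that the constraint is satisfied. The snippet below is that generic constrained expected improvement calculation under those assumptions, not the paper's exact criterion; the numerical inputs are made up.

```python
# Sketch: expected improvement at a candidate site, weighted by the posterior
# probability that the constraint is met (generic illustration).
import numpy as np
from scipy.stats import norm

def constrained_expected_improvement(mu_obj, sd_obj, best_feasible,
                                     mu_con, sd_con, threshold):
    """Gaussian-posterior EI for minimization, times P(constraint <= threshold)."""
    z = (best_feasible - mu_obj) / sd_obj
    ei = sd_obj * (z * norm.cdf(z) + norm.pdf(z))
    p_feasible = norm.cdf((threshold - mu_con) / sd_con)
    return ei * p_feasible

# Posterior summaries at two candidate control sites (made-up numbers):
print(constrained_expected_improvement(1.2, 0.3, best_feasible=1.5,
                                       mu_con=0.8, sd_con=0.2, threshold=1.0))
print(constrained_expected_improvement(1.4, 0.6, best_feasible=1.5,
                                       mu_con=1.3, sd_con=0.2, threshold=1.0))
```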


Technometrics | 2010

Incorporating Covariates in Flowgraph Models: Applications to Recurrent Event Data

Aparna V. Huzurbazar; Brian J. Williams

Modeling recurrent event data is of current interest in statistics and engineering. This article proposes a framework for incorporating covariates in flowgraph models, with application to recurrent event data in systems reliability settings. A flowgraph is a generalized transition graph (GTG) originally developed to model total system waiting times for semi-Markov processes. The focus of flowgraph models is expanded by linking covariates into branch transition models, enriching the toolkit of available data analysis methods for complex stochastic systems. This article takes a Bayesian approach to the analysis of flowgraph models. Potential applications are not limited to engineering systems, but also extend to survival analysis.
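
A minimal flowgraph can be pictured as a small transition graph in which a branch probability depends on a covariate through a logistic link and each branch carries a waiting-time distribution. The simulation below illustrates only that structure, assuming invented distributions and coefficients rather than anything fitted in the paper.

```python
# Sketch: a tiny flowgraph with one branching state, where a covariate enters
# the branch-transition probability through a logistic link (illustration only).
import numpy as np

rng = np.random.default_rng(3)

def simulate_total_waiting_time(covariate, n_sim=10_000):
    """System passes from state 0 to state 1; from state 1 it either recurs
    (back to 1) or is absorbed, with recurrence probability tied to a covariate."""
    p_recur = 1.0 / (1.0 + np.exp(-(-0.5 + 1.2 * covariate)))   # logistic link
    totals = np.empty(n_sim)
    for i in range(n_sim):
        t = rng.gamma(shape=2.0, scale=1.0)          # 0 -> 1 waiting time
        while rng.random() < p_recur:                # recurrent event loop
            t += rng.exponential(scale=0.5)          # 1 -> 1 waiting time
        t += rng.exponential(scale=2.0)              # 1 -> absorbed waiting time
        totals[i] = t
    return totals

for x in (0.0, 1.0):
    print(f"covariate={x}: mean total waiting time ≈ {simulate_total_waiting_time(x).mean():.2f}")
```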


Journal of statistical theory and practice | 2011

Follow-up experimental designs for computer models and physical processes

Pritam Ranjan; Wilson W. Lu; Derek Bingham; Shane Reese; Brian J. Williams; Chuan Chih Chou; Forrest Doss; M.J. Grosskopf; James Paul Holloway

In many branches of physical science, when complex physical phenomena are either too expensive or too time-consuming to observe, deterministic computer codes are often used to simulate these processes. Nonetheless, true physical processes are also observed in some disciplines. It is preferable to integrate both the true physical process and the computer model data for better understanding of the underlying phenomena. In this paper, we develop a methodology for selecting optimal follow-up designs based on integrated mean squared error that help us capture and reduce prediction uncertainty as much as possible. We also compare the efficiency of the optimal designs with the intuitive choices for the follow-up computer and field trials.
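
An integrated mean squared error criterion can be illustrated with a Gaussian process surrogate: for each candidate follow-up site, compute the average posterior predictive variance over the input space if that site were added, then pick the site that minimizes it. The sketch below assumes a fixed squared-exponential kernel and invented inputs; it is a simplification of the methodology described above.

```python
# Sketch: pick a follow-up run that minimizes the integrated (averaged) posterior
# predictive variance of a Gaussian-process surrogate (illustration only).
import numpy as np

def rbf(A, B, ls=0.2):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls**2)

def integrated_variance(design, grid, noise=1e-6):
    """Average GP posterior variance over a reference grid for a given design
    (depends only on the input locations, not on the observed outputs)."""
    K = rbf(design, design) + noise * np.eye(len(design))
    Ks = rbf(grid, design)
    var = 1.0 - np.einsum("ij,jk,ik->i", Ks, np.linalg.inv(K), Ks)
    return var.mean()

rng = np.random.default_rng(4)
initial = rng.uniform(size=(8, 2))               # initial computer experiments
candidates = rng.uniform(size=(200, 2))          # possible follow-up sites
grid = rng.uniform(size=(1000, 2))               # integration grid over the input space

scores = [integrated_variance(np.vstack([initial, c[None, :]]), grid) for c in candidates]
best = candidates[int(np.argmin(scores))]
print("follow-up site minimizing integrated variance:", best.round(3))
```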


Journal of Climate | 2016

Quantitative Sensitivity Analysis of Physical Parameterizations for Cases of Deep Convection in the NASA GEOS-5

Derek J. Posselt; Bruce Fryxell; Andrea Molod; Brian J. Williams

Parameterization of processes that occur on length scales too small to resolve on a computational grid is a major source of uncertainty in global climate models. This study investigates the relative importance of a number of parameters used in the Goddard Earth Observing System Model, version 5 (GEOS-5), atmospheric general circulation model, focusing on cloud, convection, and boundary layer parameterizations. Latin hypercube sampling is used to generate a few hundred sets of 19 candidate physics parameters, which are subsequently used to generate ensembles of single-column model realizations of cloud content, precipitation, and radiative fluxes for four different field campaigns. A Gaussian process model is then used to create a computationally inexpensive emulator for the simulation code that can be used to determine a measure of relative parameter sensitivity by sampling the response surface for a very large number of input parameter sets. Parameter sensitivities are computed for different geog...
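
The workflow sketched in this abstract, Latin hypercube sampling of the parameters, a Gaussian process emulator of the runs, and sensitivity measures taken from the emulated response surface, can be illustrated generically as below. The toy single-column model, parameter count, and sensitivity measure are assumptions for illustration, not the GEOS-5 setup.

```python
# Sketch: Latin hypercube sampling of parameters, a Gaussian-process emulator of
# the simulation, and a crude main-effect sensitivity measure from the emulator
# (generic illustration; not the GEOS-5 configuration).
import numpy as np
from scipy.stats import qmc
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def toy_single_column_model(params):
    """Stand-in for one single-column model run returning a scalar output."""
    return np.sin(np.pi * params[0]) + 0.3 * params[1] + 0.05 * params[2]

n_params, n_runs = 3, 120
lhs = qmc.LatinHypercube(d=n_params, seed=5)
design = lhs.random(n_runs)                                   # parameter sets in [0, 1]^d
outputs = np.array([toy_single_column_model(p) for p in design])

emulator = GaussianProcessRegressor(kernel=RBF(length_scale=[0.3] * n_params),
                                    normalize_y=True).fit(design, outputs)

# Main-effect sensitivity: sweep one parameter at a time through the emulator
# while holding the others at 0.5, and record the resulting output range.
sweep = np.linspace(0.0, 1.0, 50)
for j in range(n_params):
    grid = np.full((50, n_params), 0.5)
    grid[:, j] = sweep
    pred = emulator.predict(grid)
    print(f"parameter {j}: emulated output range = {pred.max() - pred.min():.3f}")
```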


The American Statistician | 2013

Rare Event Estimation for Computer Models

Rick Picard; Brian J. Williams



Technometrics | 2013

Global Sensitivity Analysis for Mixture Experiments

Jason L. Loeppky; Brian J. Williams; Leslie M. Moore


Collaboration


Dive into Brian J. Williams's collaborations.

Top Co-Authors

Cetin Unal, Los Alamos National Laboratory
Jason L. Loeppky, University of British Columbia
François M. Hemez, Los Alamos National Laboratory
Aparna V. Huzurbazar, Los Alamos National Laboratory
Christopher J. Stull, Los Alamos National Laboratory
David Higdon, Los Alamos National Laboratory
Leslie M. Moore, Los Alamos National Laboratory
Brian Weaver, Los Alamos National Laboratory