Publication


Featured research published by Derek Bingham.


Technometrics | 1999

Minimum-aberration two-level fractional factorial split-plot designs

Derek Bingham; Randy R. Sitter

It is often impractical to perform the experimental runs of a fractional factorial in a completely random order. In these cases, restrictions on the randomization of the experimental trials are imposed and the design is said to have a split-plot structure. We rank these fractional factorial split-plot designs similarly to fractional factorials using the aberration criterion to find the minimum-aberration design. We introduce an algorithm that constructs the set of all nonisomorphic two-level fractional factorial split-plot designs more efficiently than existing methods. The algorithm can be easily modified to efficiently produce sets of all nonisomorphic fractional factorial designs, fractional factorial designs in which the number of levels is a power of a prime, and fractional factorial split-plot designs in which the number of levels is a power of a prime.
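The aberration criterion referred to above ranks designs by their word-length pattern, which can be computed directly from the defining contrast subgroup. A minimal sketch (the generators and the 2^(5-2) example below are illustrative choices, not taken from the paper):

```python
from itertools import combinations

def wordlength_pattern(generators, max_len=8):
    """Word-length pattern of a two-level fractional factorial.

    generators: defining words, each a set of factor labels, e.g.
    {'A','B','D'} for the generator D = AB.  Words combine by symmetric
    difference, since squared factors cancel in a two-level design.
    """
    words = set()
    for r in range(1, len(generators) + 1):
        for combo in combinations(generators, r):
            w = frozenset()
            for g in combo:
                w = w.symmetric_difference(g)
            words.add(w)
    # A_i = number of words of length i in the defining contrast subgroup
    return [sum(1 for w in words if len(w) == i) for i in range(3, max_len + 1)]

# 2^(5-2) design with generators D = AB and E = AC:
# defining relation I = ABD = ACE = BCDE, so A3 = 2, A4 = 1
wlp = wordlength_pattern([{'A', 'B', 'D'}, {'A', 'C', 'E'}])
```

A minimum-aberration design sequentially minimizes A3, then A4, and so on; comparing candidate designs reduces to comparing these vectors lexicographically.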


Technometrics | 2008

Sequential Experiment Design for Contour Estimation From Complex Computer Codes

Pritam Ranjan; Derek Bingham; George Michailidis

Computer simulation often is used to study complex physical and engineering processes. Although a computer simulator often can be viewed as an inexpensive way to gain insight into a system, it still can be computationally costly. Much of the recent work on the design and analysis of computer experiments has focused on scenarios where the goal is to fit a response surface or to optimize a process. In this article we develop a sequential methodology for estimating a contour from a complex computer code. The approach uses a stochastic process model as a surrogate for the computer simulator. The surrogate model and associated uncertainty are key components in a new criterion used to identify the computer trials aimed specifically at improving the contour estimate. The proposed approach is applied to exploration of a contour for a network queuing system. Issues related to practical implementation of the proposed approach also are addressed.
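A stripped-down sketch of the surrogate-guided idea: a Gaussian process stands in for the simulator, and the next run is placed where the predictive distribution is most uncertain about which side of the contour level it falls on. The fixed kernel hyperparameters, the toy one-dimensional "simulator", and the simplified selection score are illustrative stand-ins, not the article's criterion:

```python
import numpy as np

def gp_predict(X, y, Xstar, ls=0.3, noise=1e-8):
    """Zero-mean GP posterior with a squared-exponential kernel
    (length-scale ls fixed here purely for illustration)."""
    k = lambda a, b: np.exp(-0.5 * ((a[:, None] - b[None, :]) / ls) ** 2)
    K = k(X, X) + noise * np.eye(len(X))
    Ks = k(Xstar, X)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mu = Ks @ alpha                       # predictive mean
    v = np.linalg.solve(L, Ks.T)
    var = 1.0 - np.sum(v ** 2, axis=0)    # predictive variance
    return mu, np.sqrt(np.maximum(var, 1e-12))

def next_run(X, y, level, grid):
    """Pick the grid point whose prediction is most ambiguous about
    the contour: high uncertainty, mean close to the target level."""
    mu, sd = gp_predict(X, y, grid)
    score = sd - np.abs(mu - level)
    return grid[np.argmax(score)]

f = lambda x: np.sin(2 * np.pi * x)       # cheap stand-in "simulator"
X = np.array([0.05, 0.35, 0.65, 0.95])
x_new = next_run(X, f(X), level=0.0, grid=np.linspace(0, 1, 201))
```

Evaluating the simulator at `x_new`, appending the result, and repeating gives the sequential design loop.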


Journal of Quality Technology | 2001

Design Issues In Fractional Factorial Split-Plot Experiments

Derek Bingham; Randy R. Sitter

It is often impractical to perform the experimental runs of a fractional factorial in a completely random order. In these cases, restrictions on the randomization of the experimental trials are imposed and the design is said to have a split-plot structure. Similar to fractional factorials, the “goodness” of fractional factorial split-plot designs can be judged using the minimum aberration criterion. However, the split-plot nature of the design implies that not all factorial effects can be estimated with the same precision. In this paper, we discuss the impact of the randomization restrictions on the design. We show how the split-plot structure affects estimation, precision, and the use of resources. We demonstrate how these issues affect design selection in a real industrial experiment.


Technometrics | 2006

Variable selection for Gaussian process models in computer experiments

Crystal D. Linkletter; Derek Bingham; Nicolas W. Hengartner; David Higdon; Kenny Ye

In many situations, simulation of complex phenomena requires a large number of inputs and is computationally expensive. Identifying the inputs that most impact the system so that these factors can be further investigated can be a critical step in the scientific endeavor. In computer experiments, it is common to use a Gaussian spatial process to model the output of the simulator. In this article we introduce a new, simple method for identifying active factors in computer screening experiments. The approach is Bayesian and only requires the generation of a new inert variable in the analysis; however, in the spirit of frequentist hypothesis testing, the posterior distribution of the inert factor is used as a reference distribution against which the importance of the experimental factors can be assessed. The methodology is demonstrated on an application in material science, a computer experiment from the literature, and simulated examples.
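The reference-distribution idea can be mimicked in miniature outside the Gaussian-process, Bayesian setting the article actually uses: append an input known to be inert, and judge the real inputs against its apparent activity. The simulator, sample size, and crude correlation-based activity score below are illustrative substitutes only:

```python
import numpy as np

rng = np.random.default_rng(1)
n, d = 200, 4
X = rng.random((n, d))
y = np.sin(2 * np.pi * X[:, 0]) + 2 * X[:, 1]   # inputs 2 and 3 are inactive

# Append an inert variable: since it cannot be active, its apparent
# "activity" gives a reference level against which real inputs are judged.
Xa = np.column_stack([X, rng.random(n)])
score = [abs(np.corrcoef(Xa[:, j], y)[0, 1]) for j in range(d + 1)]
active = [j for j in range(d) if score[j] > score[d]]
```

In the article the activity measure is the posterior distribution of a Gaussian-process sensitivity parameter rather than a simple correlation, which is what allows nonlinear and interaction effects to be detected.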


Technometrics | 2003

Fractional Factorial Split-Plot Designs for Robust Parameter Experiments

Derek Bingham; Randy R. Sitter

For robust parameter designs, it has been noted that performing the experiment as a split plot often provides cost savings and increased efficiency. Thus experiments are often performed using fractional factorial split-plot designs. We consider how one should best choose such designs for robust parameter experiments and what is meant by maximum resolution and by minimum aberration in this context under various experimental settings.


The Annals of Applied Statistics | 2011

Efficient emulators of computer experiments using compactly supported correlation functions, with an application to cosmology

Cari G. Kaufman; Derek Bingham; Salman Habib; Katrin Heitmann; Joshua A. Frieman

Statistical emulators of computer simulators have proven to be useful in a variety of applications. The widely adopted model for emulator building, using a Gaussian process model with strictly positive correlation function, is computationally intractable when the number of simulator evaluations is large. We propose a new model that uses a combination of low-order regression terms and compactly supported correlation functions to recreate the desired predictive behavior of the emulator at a fraction of the computational cost. Following the usual approach of taking the correlation to be a product of correlations in each input dimension, we show how to impose restrictions on the ranges of the correlations, giving sparsity, while also allowing the ranges to trade off against one another, thereby giving good predictive performance. We illustrate the method using data from a computer simulator of photometric redshift with 20,000 simulator evaluations and 80,000 predictions.
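One standard compactly supported choice is the Bohman correlation: beyond its range parameter it is exactly zero, so large correlation matrices become sparse and sparse linear algebra applies. The grid and range value below are arbitrary illustrations:

```python
import numpy as np

def bohman(d, theta):
    """Bohman correlation: positive definite, equal to 1 at d = 0, and
    exactly zero for distances at or beyond the range theta."""
    t = np.minimum(d / theta, 1.0)
    return np.where(d < theta,
                    (1 - t) * np.cos(np.pi * t) + np.sin(np.pi * t) / np.pi,
                    0.0)

x = np.linspace(0, 1, 200)
D = np.abs(x[:, None] - x[None, :])
R = bohman(D, theta=0.1)          # range theta chosen for illustration
sparsity = np.mean(R == 0.0)      # fraction of exact zeros in the matrix
```

With a strictly positive correlation such as the squared exponential, every entry of `R` would be nonzero; here most entries are exactly zero, which is the source of the computational savings.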


Technometrics | 2006

Latin Hyperrectangle Sampling for Computer Experiments

David Mease; Derek Bingham

Latin hypercube sampling (LHS) is a popular method for evaluating the expectation of functions in computer experiments. However, when the expectation of interest is taken with respect to a nonuniform distribution, the usual transformation to the probability space can cause relatively smooth functions to become extremely variable in areas of low probability. Consequently, the equal probability cells inherent in hypercube methods often tend to sample an insufficient proportion of the total points in these areas. This article introduces Latin hyperrectangle sampling (LHRS), a generalization of LHS that allows for nonequal cell probabilities, to address this problem. A number of examples are given illustrating the improvement of the proposed methodology over LHS with respect to the variance of the resulting estimators. Extensions to orthogonal array-based LHS, stratified LHS, and scrambled nets are also described.
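Standard LHS, which LHRS generalizes, stratifies each input margin into equal-probability cells. A minimal sketch (the test function and sizes are arbitrary choices for illustration):

```python
import numpy as np

def latin_hypercube(n, d, rng):
    """Basic LHS on [0,1]^d: each margin gets exactly one point in each
    of n equal-probability cells, with cell order shuffled per margin."""
    sample = np.empty((n, d))
    for j in range(d):
        sample[:, j] = (rng.permutation(n) + rng.random(n)) / n
    return sample

rng = np.random.default_rng(0)
f = lambda x: x.sum(axis=1)      # additive test function with E[f] = 1
est_lhs = [f(latin_hypercube(50, 2, rng)).mean() for _ in range(200)]
est_iid = [f(rng.random((50, 2))).mean() for _ in range(200)]
```

For additive functions the variance of the LHS estimator is far below that of plain Monte Carlo; LHRS keeps this marginal stratification but sizes the cells to match a nonuniform target distribution instead of making them equal.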


Technometrics | 2013

Prediction and Computer Model Calibration Using Outputs From Multifidelity Simulators

Joslin Goh; Derek Bingham; James Paul Holloway; M.J. Grosskopf; C. C. Kuranz; Erica M. Rutter

Computer simulators are widely used to describe and explore physical processes. In some cases, several simulators are available, each with a different degree of fidelity, for this task. In this work, we combine field observations and model runs from deterministic multifidelity computer simulators to build a predictive model for the real process. The resulting model can be used to perform sensitivity analysis for the system, solve inverse problems, and make predictions. Our approach is Bayesian and is illustrated through a simple example, as well as a real application in predictive science at the Center for Radiative Shock Hydrodynamics at the University of Michigan. The Matlab code that is used for the analyses is available from the online supplementary materials.


Technometrics | 2007

Incorporating Prior Information in Optimal Design for Model Selection

Derek Bingham; Hugh A. Chipman

An important use of experimental designs is in screening, in which experimenters seek to identify significant effects (both main effects and potentially interactions) from a large set of candidate effects. This article goes further than identification of effects, introducing a design criterion that seeks to maximize the ability to discriminate between models. Motivated by the work of Meyer, Steinberg, and Box, the Bayesian criterion is based on the Hellinger distance between predictive distributions under competing models. A bound for the criterion is obtained, greatly improving interpretability. The set of all possible models to compare is huge, and not all models are equally plausible. This challenge is addressed through prior distributions on the space of models that indicate preference for intuitively appealing models, such as those with few effects, more low-order than high-order effects, and inheritance structure between active main effects and interactions. Techniques for evaluating the criterion and searching for optimal designs are presented. The effectiveness of the criterion is illustrated with a number of examples that consider regular and nonregular designs, robust designs, and scenarios with partial prior knowledge of which effects are significant.
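When competing models yield Gaussian predictive distributions, the squared Hellinger distance between two predictions has a closed form. The sketch below shows only that building block (the article's criterion aggregates such distances over pairs of competing models and candidate designs):

```python
import numpy as np

def hellinger2_normal(mu1, s1, mu2, s2):
    """Squared Hellinger distance between N(mu1, s1^2) and N(mu2, s2^2).

    Equals 1 minus the Bhattacharyya coefficient; values near 1 mean
    the two predictive distributions are easy to discriminate.
    """
    v = s1 ** 2 + s2 ** 2
    bc = np.sqrt(2 * s1 * s2 / v) * np.exp(-((mu1 - mu2) ** 2) / (4 * v))
    return 1.0 - bc

# Identical predictive distributions are indistinguishable ...
same = hellinger2_normal(0.0, 1.0, 0.0, 1.0)
# ... and the distance grows as the predictive means separate.
near, far = hellinger2_normal(0, 1, 1, 1), hellinger2_normal(0, 1, 4, 1)
```

A design that drives such distances up for many model pairs is one under which the eventual data are informative about which model generated them.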


Technometrics | 2008

Bayesian Inference for Multivariate Ordinal Data Using Parameter Expansion

Earl Lawrence; Derek Bingham; Chuanhai Liu; Vijayan N. Nair

Multivariate ordinal data arise in many applications. This article proposes a new, efficient method for Bayesian inference for multivariate probit models using Markov chain Monte Carlo techniques. The key idea is the novel use of parameter expansion to sample correlation matrices. A nice feature of the approach is that inference is performed using straightforward Gibbs sampling. Bayesian methods for model selection are also discussed. Our approach is motivated by a study of how women make decisions on taking medication to reduce the risk of breast cancer. Furthermore, we compare and contrast the performance of our approach with other methods.

Collaboration


Dive into Derek Bingham's collaboration.

Top Co-Authors

R. P. Drake

University of Michigan


Ofir Harari

Simon Fraser University
