Peter Z. G. Qian
University of Wisconsin-Madison
Publications
Featured research published by Peter Z. G. Qian.
Technometrics | 2008
Peter Z. G. Qian; C. F. Jeff Wu
Standard practice when analyzing data from different types of experiments is to treat data from each type separately. By borrowing strength across multiple sources, an integrated analysis can produce better results. Careful adjustments must be made to incorporate the systematic differences among various experiments. Toward this end, some Bayesian hierarchical Gaussian process models are proposed. The heterogeneity among different sources is accounted for by performing flexible location and scale adjustments. The approach tends to produce predictions closer to those from the high-accuracy experiment. The Bayesian computations are aided by the use of Markov chain Monte Carlo and sample average approximation algorithms. The proposed method is illustrated with two examples, one with detailed and approximate finite element simulations for mechanical material design and the other with physical and computer experiments for modeling a food processor.
Technometrics | 2008
Peter Z. G. Qian; Huaiqing Wu; C. F. Jeff Wu
Modeling experiments with qualitative and quantitative factors is an important issue in computer modeling. We propose a framework for building Gaussian process models that incorporate both types of factors. The key to the development of these new models is an approach for constructing correlation functions with qualitative and quantitative factors. An iterative estimation procedure is developed for the proposed models. Modern optimization techniques are used in the estimation to ensure the validity of the constructed correlation functions. The proposed method is illustrated with an example involving a known function and a real example for modeling the thermal distribution of a data center.
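The core idea of such correlation functions can be illustrated with a minimal sketch (our illustration, not the paper's exact estimator): a product of a Gaussian kernel over the quantitative inputs and a cross-level correlation matrix for a qualitative factor. The names `qq_correlation`, `theta`, and `tau` are assumptions for this example.

```python
import math

def qq_correlation(x1, z1, x2, z2, theta, tau):
    """Correlation between two runs with quantitative inputs x and a
    qualitative level z: Gaussian kernel times a cross-level correlation.
    theta: positive length-scale weights; tau: symmetric positive-definite
    matrix of cross-level correlations with unit diagonal (illustrative)."""
    quant = math.exp(-sum(t * (a - b) ** 2 for t, a, b in zip(theta, x1, x2)))
    return quant * tau[z1][z2]

# Illustrative 2-level qualitative factor with cross-level correlation 0.6.
tau = [[1.0, 0.6], [0.6, 1.0]]
r_same = qq_correlation([0.2, 0.5], 0, [0.2, 0.5], 0, [1.0, 1.0], tau)  # 1.0
r_diff = qq_correlation([0.2, 0.5], 0, [0.2, 0.5], 1, [1.0, 1.0], tau)  # 0.6
```

For the resulting kernel to be valid, `tau` must itself be a positive-definite correlation matrix; how to parameterize and estimate it is exactly the challenge these models address.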
Journal of the American Statistical Association | 2012
Peter Z. G. Qian
This article proposes a method for constructing a new type of space-filling design, called a sliced Latin hypercube design, intended for running computer experiments. Such a design is a special Latin hypercube design that can be partitioned into slices of smaller Latin hypercube designs. The constructed designs are well suited to the collective evaluation of computer models and ensembles of multiple computer models. The proposed construction method is easy to implement, capable of accommodating any number of factors, and flexible in run size. Examples are given to illustrate the method. Sampling properties of the constructed designs are examined. Numerical illustration is provided to corroborate the derived theoretical results.
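A minimal sketch of one column-wise construction in this spirit (our simplified illustration, not necessarily the article's exact algorithm): within each coarse level, fine levels are dealt out randomly across slices, then permuted across rows within each slice, so each slice collapses to a small Latin hypercube while the whole array is a large one.

```python
import random

def sliced_lhd(m, t, d, seed=None):
    """Sketch of a sliced Latin hypercube design: t slices of m runs each
    (n = m*t total runs) in d dimensions. Returns a list of t slices;
    each slice is an m x d array of levels in 1..n."""
    rng = random.Random(seed)
    n = m * t
    slices = [[[0] * d for _ in range(m)] for _ in range(t)]
    for j in range(d):
        # Step 1: within each coarse level i, deal the t fine levels
        # i*t+1, ..., (i+1)*t randomly, one to each slice.
        assigned = [[0] * m for _ in range(t)]
        for i in range(m):
            fine = list(range(i * t + 1, (i + 1) * t + 1))
            rng.shuffle(fine)
            for s in range(t):
                assigned[s][i] = fine[s]
        # Step 2: within each slice, permute the m numbers across rows.
        for s in range(t):
            rng.shuffle(assigned[s])
            for r in range(m):
                slices[s][r][j] = assigned[s][r]
    return slices
```

By construction, each column of the combined design uses every level 1..n exactly once, and collapsing a slice via ceil(x/t) yields a permutation of 1..m, i.e., a small Latin hypercube design.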
Annals of Statistics | 2009
Peter Z. G. Qian; Mingyao Ai; C. F. Jeff Wu
New types of designs called nested space-filling designs have been proposed for conducting multiple computer experiments with different levels of accuracy. In this article, we develop several approaches to constructing such designs. The development of these methods also leads to the introduction of several new discrete mathematics concepts, including nested orthogonal arrays and nested difference matrices.
Technometrics | 2011
Qiang Zhou; Peter Z. G. Qian; Shiyu Zhou
We propose a flexible yet computationally efficient approach for building Gaussian process models for computer experiments with both qualitative and quantitative factors. This approach uses the hypersphere parameterization to model the correlations of the qualitative factors, thus avoiding the need to directly solve optimization problems with positive-definiteness constraints. The effectiveness of the proposed method is successfully illustrated by several examples.
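The hypersphere parameterization can be sketched as follows (a minimal illustration, assuming the standard trigonometric form T = L Lᵀ with L lower triangular built from angles in (0, π)); the function name and input layout are our choices for this example.

```python
import math

def hypersphere_correlation(angles):
    """Build a valid correlation matrix for the k levels of a qualitative
    factor from unconstrained angles in (0, pi): T = L L^T with L lower
    triangular, so T is positive definite with unit diagonal by construction.
    angles[i] holds the i+1 angles defining row i+1 of L (0-based rows)."""
    k = len(angles) + 1  # number of levels
    L = [[0.0] * k for _ in range(k)]
    L[0][0] = 1.0
    for i in range(1, k):
        prod_sin = 1.0
        for j in range(i):
            L[i][j] = math.cos(angles[i - 1][j]) * prod_sin
            prod_sin *= math.sin(angles[i - 1][j])
        L[i][i] = prod_sin
    # T = L L^T: each row of L has unit norm, so diag(T) = 1.
    return [[sum(L[a][c] * L[b][c] for c in range(k)) for b in range(k)]
            for a in range(k)]
```

Optimizing over the angles is an unconstrained (box-bounded) problem, which is the computational advantage the paper exploits over directly constraining T to be positive definite.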
Annals of Statistics | 2011
Ben Haaland; Peter Z. G. Qian
Large-scale computer experiments are becoming increasingly important in science. A multi-step procedure for modeling such experiments is introduced, which builds an accurate interpolator in stages. In practice, the procedure shows substantial improvements in overall accuracy, but its theoretical properties are not well established. We introduce the terms nominal and numeric error and decompose the overall error of an interpolator into nominal and numeric portions. Bounds on the numeric and nominal error are developed to show theoretically that substantial gains in overall accuracy can be attained with the multi-step approach.
Technometrics | 2013
Shifeng Xiong; Peter Z. G. Qian; C. F. Jeff Wu
A growing trend in engineering and science is to use multiple computer codes with different levels of accuracy to study the same complex system. We propose a framework for sequential design and analysis of a pair of high-accuracy and low-accuracy computer codes. It first runs the two codes with a pair of nested Latin hypercube designs (NLHDs). Data from the initial experiment are used to fit a prediction model. If the accuracy of the fitted model is less than a prespecified threshold, the two codes are evaluated again with input values chosen in an elaborate fashion so that their expanded scenario sets still form a pair of NLHDs. The nested relationship between the two scenario sets makes it easier to model and calibrate the difference between the two sources. If necessary, this augmentation process can be repeated a number of times until the prediction model based on all available data has reasonable accuracy. The effectiveness of the proposed method is illustrated with several examples. Matlab codes are provided in the online supplement to this article.
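The nested structure at the heart of this procedure can be sketched with a simple per-column construction (our illustration under standard assumptions, not the article's exact NLHD algorithm): each coarse level of the small design is expanded to a random fine level of the large design, and the remaining fine levels fill the extra runs.

```python
import random

def nested_lhd(m, c, d, seed=None):
    """Sketch of a nested Latin hypercube design: a small LHD with m runs
    sits inside a large LHD with n = c*m runs, in d dimensions.
    Returns an n x d array of levels in 1..n whose first m rows,
    collapsed via ceil(x/c), form an LHD with m levels."""
    rng = random.Random(seed)
    n = c * m
    large = [[0] * d for _ in range(n)]
    for j in range(d):
        # Small design: a random permutation of the m coarse levels.
        coarse = list(range(m))
        rng.shuffle(coarse)
        used = set()
        for r in range(m):
            # Expand coarse level i to a random fine level in i*c+1..(i+1)*c.
            fine = coarse[r] * c + rng.randrange(c) + 1
            large[r][j] = fine
            used.add(fine)
        # Fill the remaining n - m rows with the unused fine levels.
        rest = [x for x in range(1, n + 1) if x not in used]
        rng.shuffle(rest)
        for r in range(m, n):
            large[r][j] = rest[r - m]
    return large
```

Here the first m rows serve as the scenario set for the high-accuracy code and all n rows for the low-accuracy code; the nesting is what makes it straightforward to model the difference between the two sources at shared input values.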
ACS Nano | 2010
Fei Wang; Youngdeok Hwang; Peter Z. G. Qian; Xudong Wang
Precise control of nanomaterial morphology is critical to the development of advanced nanodevices with various functionalities. In this paper, we developed an efficient and effective statistics-guided approach to accurately characterizing the lengths, diameters, orientations, and densities of nanowires. Our approach has been successfully tested on a zinc oxide nanowire sample grown by hydrothermal methods. This approach has three key components. First, we introduced a novel geometric model to recover the true lengths and orientations of nanowires from their projective scanning electron microscope images, where a statistical resampling method is used to mitigate the practical difficulty of relocating the same sets of nanowires at multiple projecting angles. Second, we developed a sequential uniform sampling method for efficiently acquiring representative samples in characterizing diameters and growth density. Third, we proposed a statistical imputation method to incorporate the uncertainty in the determination of nanowire diameters arising from nonspherical cross sections. This approach enables precise characterization of several fundamental aspects of nanowire morphology, serving as an example of overcoming nanoscale characterization challenges by novel statistical means. It might open new opportunities in advancing nanotechnology and might also lead to the standardization of nanocharacterization in many aspects.
Discrete Mathematics | 2008
Rahul Mukerjee; Peter Z. G. Qian; C. F. Jeff Wu
A nested orthogonal array is an OA(N,k,s,g) which contains an OA(M,k,r,g) as a subarray. Here r < s and M < N.
Technometrics | 2016
Youngdeok Hwang; Xu He; Peter Z. G. Qian
We propose an approach for constructing a new type of design, called a sliced orthogonal array-based Latin hypercube design. This approach exploits a slicing structure of orthogonal arrays with strength two and makes use of sliced random permutations. Such a design achieves one- and two-dimensional uniformity and can be divided into smaller Latin hypercube designs with one-dimensional uniformity. Sampling properties of the proposed designs are derived. Examples are given for illustrating the construction method and corroborating the derived theoretical results. Potential applications of the constructed designs include uncertainty quantification of computer models, computer models with qualitative and quantitative factors, cross-validation and efficient allocation of computing resources. Supplementary materials for this article are available online.