William A. Brenneman
Procter & Gamble
Publications
Featured research published by William A. Brenneman.
Technometrics | 2001
William A. Brenneman; Vijayan N. Nair
There has been considerable interest recently in the use of statistically designed experiments to identify both location and dispersion effects for quality improvement. Analysis of dispersion effects usually requires replications that can be expensive or time consuming. Several recent articles have considered identification of both location and dispersion effects from unreplicated fractional factorial experiments. In this article, we provide a systematic study of various methods that are commonly used or have been proposed recently. Both theoretical and simulation results are used to characterize the properties of these methods. Although all methods suffer from some degree of bias, some have serious problems in that the bias remains large even as the design run size increases to infinity. Based on these analyses, we propose iterative strategies for model selection and estimation of the dispersion effects. A real example and simulations are used to illustrate the results.
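A minimal sketch of the underlying idea, not the article's specific iterative procedure: estimate location effects from an unreplicated two-level design, then screen for dispersion effects by regressing the log of the squared residuals on the same contrast columns (in the spirit of log-linear variance methods). The design size and response values below are hypothetical.

```python
# Sketch: location and dispersion screening in an unreplicated 2^3 factorial.
# Illustrative only; not the iterative strategy proposed in the article.
import itertools
import numpy as np

# Full 2^3 design in coded (-1, +1) units; response values are made up.
X = np.array(list(itertools.product([-1, 1], repeat=3)), dtype=float)
y = np.array([8.2, 9.1, 7.9, 12.3, 8.5, 9.4, 8.1, 12.8])
M = np.column_stack([np.ones(len(y)), X])   # intercept + main-effect contrasts

# Location effects: least-squares fit of y on the contrasts.
beta, *_ = np.linalg.lstsq(M, y, rcond=None)
resid = y - M @ beta

# Dispersion screening: regress log squared residuals on the same contrasts.
z = np.log(resid**2 + 1e-12)                # small offset guards against log(0)
gamma, *_ = np.linalg.lstsq(M, z, rcond=None)

print("location effects:", beta[1:].round(3))
print("dispersion effects (log-variance scale):", gamma[1:].round(3))
```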
Journal of Quality Technology | 2005
William R. Myers; William A. Brenneman; Raymond H. Myers
Robust Parameter Design (RPD) has been used extensively in industrial experiments since its introduction by Genichi Taguchi. RPD has been studied and applied, in most cases, assuming a linear model under standard assumptions. More recently, RPD has been considered in a generalized linear model (GLM) setting. In this paper, we apply a general dual-response approach when using RPD with a GLM. We motivate the need for exploring both the process mean and process variance by discussing situations in which a compromise between the two is necessary. Several examples are provided to further motivate the dual-response approach in the GLM setting.
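As a hedged illustration of the dual-response idea, not the paper's own procedure: with a hypothetical log-link GLM containing a control-by-noise interaction, one can compute the process mean and variance over the noise distribution and pick the control setting that trades off bias from target against variance. All coefficients and the target below are made-up values.

```python
# Sketch: dual-response compromise for RPD with a hypothetical log-link GLM.
import numpy as np

b = np.array([1.0, 0.4, 0.3, -0.5])   # intercept, control x, noise z, x*z (assumed)
z_vals = np.array([-1.0, 1.0])        # two-level noise variable
z_prob = np.array([0.5, 0.5])         # assumed noise distribution
target = 3.0                          # hypothetical mean target

def mean_var_over_noise(x):
    mu = np.exp(b[0] + b[1]*x + b[2]*z_vals + b[3]*x*z_vals)  # log link
    m = np.sum(z_prob * mu)                  # process mean over the noise
    v = np.sum(z_prob * (mu - m)**2)         # process variance over the noise
    return m, v

# Compromise criterion: squared bias from target plus process variance.
grid = np.linspace(-1, 1, 201)
scores = [(x, *mean_var_over_noise(x)) for x in grid]
best = min(scores, key=lambda t: (t[1] - target)**2 + t[2])
print(f"recommended x = {best[0]:.2f}, mean = {best[1]:.2f}, var = {best[2]:.3f}")
```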
Technometrics | 2015
Shan Ba; William R. Myers; William A. Brenneman
Sliced Latin hypercube designs (SLHDs) have important applications in designing computer experiments with continuous and categorical factors. However, a randomly generated SLHD can be poor in terms of space-filling, and based on the existing construction method that generates the SLHD column by column using sliced permutation matrices, it is also difficult to search for the optimal SLHD. In this article, we develop a new construction approach that first generates a small Latin hypercube design in each slice and then arranges them together to form the SLHD. The new approach is intuitive and can be easily adapted to generate orthogonal SLHDs and orthogonal array-based SLHDs. More importantly, it enables us to develop general algorithms that can search for the optimal SLHD efficiently.
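A minimal sketch of the slice-wise construction idea: build a small Latin hypercube column in each slice, then expand each small level into distinct fine levels across slices so the combined design is itself a Latin hypercube. This is a random construction for illustration only; it performs none of the space-filling optimization the article develops.

```python
# Sketch: random sliced Latin hypercube design (t slices, m runs each, d factors).
import numpy as np

def sliced_lhd(t, m, d, seed=0):
    """Return a (t*m, d) array of points in [0, 1); each slice collapses to an LHD."""
    rng = np.random.default_rng(seed)
    n = t * m
    design = np.empty((n, d))
    for j in range(d):
        col = np.empty(n)
        small = [rng.permutation(m) for _ in range(t)]    # small LHD column per slice
        assign = [rng.permutation(t) for _ in range(m)]   # fine sub-levels per small level
        for s in range(t):
            for i in range(m):
                level = small[s][i]                       # small-LHD level in slice s
                fine = level * t + assign[level][s]       # distinct fine level overall
                col[s * m + i] = (fine + rng.random()) / n
        design[:, j] = col
    return design

D = sliced_lhd(t=3, m=4, d=2)
print(D.round(3))   # 12 points; rows 0-3, 4-7, 8-11 are the three slices
```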
Quality and Reliability Engineering International | 2006
Timothy J. Robinson; William A. Brenneman; William R. Myers
When categorical noise variables are present in the Robust Parameter Design (RPD) context, it is possible to reduce process variance not only by manipulating the levels of the control factors but also by adjusting the proportions associated with the levels of the categorical noise factor(s). When no adjustment factors exist, or when the adjustment factors are unable to bring the process mean close to target, a popular approach for determining optimal operating conditions is to find the levels of the control factors that minimize the estimated mean squared error of the response. Although this approach is effective, engineers may have a difficult time translating mean squared error into quality. We propose the use of a parts per million defective objective function. Furthermore, we point out that in many situations the levels of the control factors are not equally desirable due to cost and/or time issues. We have termed these types of factors non-uniform control factors. We propose the use of desirability functions to determine optimal operating conditions when non-uniform control factors are present and illustrate this methodology with an example from industry.
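A small sketch of what a parts-per-million defective objective can look like, assuming a normal response within each noise category and mixing over the category proportions. The fitted mean and dispersion models, the proportions, and the specification limits below are all hypothetical.

```python
# Sketch: ppm defective objective with a two-level categorical noise factor.
import numpy as np
from scipy.stats import norm

p_noise = np.array([0.6, 0.4])        # assumed proportions of the noise categories
LSL, USL = 90.0, 110.0                # hypothetical specification limits

def mean_sd(x, z):
    """Hypothetical fitted models at control setting x, noise category z."""
    mu = 100.0 + 3.0*x - 4.0*x*z      # control-by-noise interaction
    sd = np.exp(0.8 + 0.5*x)          # log-linear dispersion model
    return mu, sd

def ppm_defective(x):
    ppm = 0.0
    for z, p in zip([0.0, 1.0], p_noise):
        mu, sd = mean_sd(x, z)
        out = norm.cdf(LSL, mu, sd) + norm.sf(USL, mu, sd)  # P(outside spec | z)
        ppm += p * 1e6 * out          # mix over the noise categories
    return ppm

grid = np.linspace(-1, 1, 201)
best = min(grid, key=ppm_defective)
print(f"best control setting x = {best:.2f}, ppm = {ppm_defective(best):.0f}")
```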
Journal of Quality Technology | 2003
William A. Brenneman; William R. Myers
Robust parameter design has been extensively studied and applied in industrial experiments over the past twenty years. The purpose of robust parameter design is to design a process or product that is robust to uncontrollable changes in the noise variables. In many applications the noise variables are continuous, for which several assumptions, often difficult to verify in practice, are necessary to estimate the response variance. In this paper, we consider the case where the noise variable is categorical in nature. We discuss the impact that the assumptions for continuous and categorical noise variables have on the robust settings and on the overall process variance estimate. A designed experiment from industry is presented to illustrate the results.
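When the noise variable is categorical with known (or assumed) category proportions, the process variance follows directly from the law of total variance, with no distributional assumptions on the noise beyond those proportions. A minimal sketch under a hypothetical fitted model:

```python
# Sketch: process mean and variance with a categorical noise variable,
# via the law of total variance. The fitted model and proportions are hypothetical.
import numpy as np

p = np.array([0.5, 0.3, 0.2])          # assumed category proportions

def fitted_mean(x, z):
    """Hypothetical fitted response at control setting x, noise category z."""
    return np.array([10.0, 12.0, 9.0])[z] + (2.0 - 1.5*z) * x

sigma2 = 0.4                            # residual variance from the (assumed) fit

def process_mean_var(x):
    mu_z = np.array([fitted_mean(x, z) for z in range(3)])
    m = np.sum(p * mu_z)                        # E_z[E(Y | z)]
    v = sigma2 + np.sum(p * (mu_z - m)**2)      # E_z[Var] + Var_z[E]
    return m, v

for x in (-1.0, 0.0, 1.0):
    m, v = process_mean_var(x)
    print(f"x={x:+.1f}: mean={m:.2f}, process variance={v:.3f}")
```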
Technometrics | 2012
Crystal Linkletter; Pritam Ranjan; C. Devon Lin; Derek Bingham; William A. Brenneman; Richard A. Lockhart; Thomas M. Loughin
For consumer protection, many governments perform random inspections on goods sold by weight or volume to ensure consistency between actual and labeled net contents. To pass inspection, random samples must jointly comply with restrictions placed on the individual sampled items and on the sample average. In this article, we consider the current United States National Institute of Standards and Technology joint acceptance criteria. Motivated by a problem from a real manufacturing process, we provide an approximation for the probability of sample acceptance that is applicable to processes with one or more known sources of variation via a random effects model. This approach also allows assessment of the item sampling scheme. We use examples and simulations to assess the quality and accuracy of the approximation and illustrate how the methodology can be used to fine-tune process parameters for a prespecified probability of sample acceptance. Simulations are also used to estimate variance components.
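A Monte Carlo sketch of the kind of joint acceptance probability involved: every sampled item must exceed an individual limit and the sample mean must be at or above the label, with a batch random effect supplying a second source of variation. The limits and variance components below are made up; consult the actual NIST criteria for the real rule.

```python
# Sketch: Monte Carlo estimate of P(sample passes a joint acceptance rule)
# under a one-way random effects model. All numbers are illustrative.
import numpy as np

rng = np.random.default_rng(1)
label, mav = 500.0, 490.0                 # labeled content and individual limit (assumed)
mu, sd_batch, sd_item = 503.0, 1.5, 3.0   # process mean and variance components (assumed)
n_items, n_sims = 12, 100_000

batch = rng.normal(0.0, sd_batch, size=(n_sims, 1))             # batch random effect
items = mu + batch + rng.normal(0.0, sd_item, size=(n_sims, n_items))

# Joint rule: all items above the individual limit AND the mean at/above label.
accept = (items.min(axis=1) >= mav) & (items.mean(axis=1) >= label)
print(f"estimated P(acceptance) = {accept.mean():.4f}")
```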
Technometrics | 2011
Lulu Kang; V. Roshan Joseph; William A. Brenneman
In mixture-of-mixtures experiments, major components are defined as components that are themselves mixtures of other components, called minor components. Sometimes the components are divided into categories, where each category is treated as a major component and the components within it become minor components. The special structure of the mixture-of-mixtures experiment makes the design and modeling approaches different from those of a typical mixture experiment. In this article, we propose a new model, called the major–minor model, to overcome some of the limitations of the commonly used multiple-Scheffé model. We also provide a strategy for designing experiments that are much smaller in size than those based on existing methods. We then apply the proposed design and modeling approach to a mixture-of-mixtures experiment conducted to formulate a new potato crisp. This article has supplementary material online.
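The structural constraint is easy to see in code: major-component proportions sum to one, minor proportions within each major component also sum to one, and the full-component amounts are their products. The sketch below generates random points with this structure for illustration only; it is not the article's design-construction strategy, and the formulation structure is hypothetical.

```python
# Sketch: random points with the mixture-of-mixtures structure.
import numpy as np

rng = np.random.default_rng(2)
minors_per_major = [3, 2, 4]            # hypothetical numbers of minor components

def mom_point():
    major = rng.dirichlet(np.ones(len(minors_per_major)))   # majors sum to 1
    point = []
    for c, k in zip(major, minors_per_major):
        minor = rng.dirichlet(np.ones(k))                    # minors sum to 1 within
        point.extend(c * minor)                              # full-component amounts
    return np.array(point)

x = mom_point()
print(x.round(3), "total =", x.sum().round(6))   # total is 1 by construction
```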
Infection Control and Hospital Epidemiology | 2013
Joseph A. Fortuna; William A. Brenneman; Sandra Storli; David Birnbaum; Kay L. Brown
We believe that the current practice in healthcare-associated infection (HAI) reporting of using estimation approaches, rather than quality control approaches, for data supply chain validation might be supportable for research. However, it should not be the standard recommended practice for program managers, who need to ensure that their data supply chain produces reliably high-quality data over time to maximize protection of the public's health. We therefore strongly recommend an immediate alignment of the state-of-the-art validation and acceptance-sampling methods widely used in industry with the needs and logistics of collecting, sampling, evaluating, and reporting HAI data. The American public deserves to have its best technology and its "A Team" on the field in this effort; involving the ASQ, its Healthcare Division, its quality scientists, and its intellectual capital could dramatically improve the HAI quality reporting system. Our people and our healthcare system deserve no less!
Quality Engineering | 2012
William A. Brenneman; Michael D. Joner
A high-level business need was addressed via the development of a solution for setting appropriate targets for product filling processes. This required the calculation of the probability of meeting corporate and regulatory requirements under more realistic assumptions. Throughout the process, both technical and nontechnical approaches were required. The patented solution is now embedded in production processes worldwide through a Web-based application that helps the company save time, resources, and money.
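A toy sketch of the simplest version of the target-setting calculation, assuming a normal fill distribution with known standard deviation and a single probability requirement on individual items. The numbers are illustrative; the patented solution addresses the joint corporate and regulatory criteria under more realistic assumptions.

```python
# Sketch: smallest fill target meeting P(item >= label) >= p_req,
# assuming Normal(target, sd). All values below are hypothetical.
from scipy.stats import norm

label = 500.0     # labeled net content (g)
sd = 3.0          # fill standard deviation, assumed known from process data
p_req = 0.995     # required probability an individual item meets the label

target = label + norm.ppf(p_req) * sd
print(f"fill target = {target:.2f} g")
```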
Technometrics | 2014
William A. Brenneman
The commentator discusses the general importance of simulation studies and how they can be used to solve very difficult problems quickly.