
Publications


Featured research published by Connie M. Borror.


Journal of Quality Technology | 2004

Response Surface Methodology: A Retrospective and Literature Survey

Raymond H. Myers; Douglas C. Montgomery; G. Geoffrey Vining; Connie M. Borror; Scott M. Kowalski

Response surface methodology (RSM) is a collection of statistical design and numerical optimization techniques used to optimize processes and product designs. The original work in this area dates from the 1950s, and the methodology has been widely used, especially in the chemical and process industries. The last 15 years have seen the widespread application of RSM and many new developments. In this review paper we focus on RSM activities since 1989. We discuss current areas of research and mention some areas for future research.
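
To make the idea concrete: the core RSM workflow fits a low-order polynomial to designed-experiment data and then optimizes the fitted surface. The sketch below fits a full second-order model to illustrative (made-up) data from a two-factor central composite design and locates the stationary point of the fitted surface; it is a minimal illustration of the technique, not code from the paper.

```python
import numpy as np

# Hypothetical two-factor experiment: coded levels from a central composite
# design and a simulated yield response (illustrative values only).
X1 = np.array([-1, -1, 1, 1, -1.414, 1.414, 0, 0, 0, 0, 0])
X2 = np.array([-1, 1, -1, 1, 0, 0, -1.414, 1.414, 0, 0, 0])
y = np.array([76, 79, 80, 85, 74, 83, 77, 81, 89, 88, 90])

# Design matrix for the full second-order model:
# y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
X = np.column_stack([np.ones_like(X1), X1, X2, X1**2, X2**2, X1 * X2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
b0, b1, b2, b11, b22, b12 = b

# Stationary point: solve the gradient equations 2*B*x + g = 0, where B holds
# the quadratic coefficients and g the linear ones.
B = np.array([[b11, b12 / 2], [b12 / 2, b22]])
g = np.array([b1, b2])
x_s = np.linalg.solve(-2 * B, g)
print("fitted coefficients:", np.round(b, 3))
print("stationary point (coded units):", np.round(x_s, 3))
```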


Journal of Quality Technology | 2003

A Review of Methods for Measurement Systems Capability Analysis

Richard K. Burdick; Connie M. Borror; Douglas C. Montgomery

We review methods for conducting and analyzing measurement systems capability studies, focusing on the analysis of variance approach. These studies are designed experiments involving crossed and possibly nested factors. The analysis of variance is an attractive method for analyzing the results of these experiments because it permits efficient point and interval estimation of the variance components associated with the sources of variability in the experiment. In this paper we demonstrate computations for the standard two-factor design, describe aspects of designing the experiment, and provide references for situations where the standard two-factor design is not applicable.
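
For intuition, here is a minimal sketch of the ANOVA approach for the balanced two-factor crossed design (parts crossed with operators, with replication) that the paper treats as the standard case. The data are simulated and the factor sizes are chosen only for illustration; the variance-component estimators are the usual method-of-moments ones.

```python
import numpy as np

rng = np.random.default_rng(1)
p, o, r = 5, 3, 2                      # parts, operators, replicates (illustrative)
# Simulated measurements: part-to-part variation dominates, as in a capable gauge.
part = rng.normal(0, 2.0, size=(p, 1, 1))
oper = rng.normal(0, 0.5, size=(1, o, 1))
y = 10 + part + oper + rng.normal(0, 0.3, size=(p, o, r))

ybar = y.mean()
yb_p = y.mean(axis=(1, 2))             # part means
yb_o = y.mean(axis=(0, 2))             # operator means
yb_po = y.mean(axis=2)                 # cell means

# Sums of squares for the balanced two-factor crossed design.
SS_P = o * r * ((yb_p - ybar) ** 2).sum()
SS_O = p * r * ((yb_o - ybar) ** 2).sum()
SS_PO = r * ((yb_po - yb_p[:, None] - yb_o[None, :] + ybar) ** 2).sum()
SS_E = ((y - yb_po[:, :, None]) ** 2).sum()

MS_P, MS_O = SS_P / (p - 1), SS_O / (o - 1)
MS_PO = SS_PO / ((p - 1) * (o - 1))
MS_E = SS_E / (p * o * (r - 1))

# ANOVA (method-of-moments) point estimates of the variance components.
var_part = max((MS_P - MS_PO) / (o * r), 0.0)
var_oper = max((MS_O - MS_PO) / (p * r), 0.0)
var_po = max((MS_PO - MS_E) / r, 0.0)
print(f"part: {var_part:.3f}  operator: {var_oper:.3f}  "
      f"interaction: {var_po:.3f}  repeatability: {MS_E:.3f}")
```

The point estimates are clamped at zero because the method-of-moments differences can go negative; the interval-estimation methods the paper reviews address exactly this kind of difficulty.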


Journal of Quality Technology | 1999

Robustness of the EWMA Control Chart to Non-Normality

Connie M. Borror; Douglas C. Montgomery; George C. Runger

Rational subgroups of size n = 1 are frequently encountered in process monitoring and control, and the Shewhart control chart for individuals is often used in these situations. It is well known that the in-control average run length (ARL) of this chart is 370.4 when the process data are normally distributed…
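
A small simulation conveys the robustness question the paper studies: run an EWMA chart on in-control data drawn from a normal and from a heavy-tailed distribution and compare the estimated in-control ARLs. The chart parameters (lambda = 0.1, L = 2.7) and the t(4) alternative below are illustrative choices, not the paper's exact settings.

```python
import numpy as np

def ewma_run_length(draw, lam=0.1, L=2.7, mu0=0.0, sigma=1.0, max_n=100_000):
    """Run one EWMA chart on in-control data and return the run length."""
    z, var_factor = mu0, lam / (2 - lam)
    for i in range(1, max_n + 1):
        z = lam * draw() + (1 - lam) * z
        # Exact (time-varying) control limits for the EWMA statistic.
        width = L * sigma * np.sqrt(var_factor * (1 - (1 - lam) ** (2 * i)))
        if abs(z - mu0) > width:
            return i
    return max_n

rng = np.random.default_rng(7)
# Heavy-tailed in-control data: t with 4 df, rescaled to unit variance.
t4 = lambda: rng.standard_t(4) / np.sqrt(2.0)
normal = lambda: rng.standard_normal()

for name, draw in [("normal", normal), ("t(4)", t4)]:
    arls = [ewma_run_length(draw) for _ in range(500)]
    print(f"{name:7s} estimated in-control ARL ~ {np.mean(arls):.0f}")
```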


IEEE Transactions on Reliability | 2004

Robustness of the Markov-chain model for cyber-attack detection

Nong Ye; Yebin Zhang; Connie M. Borror

Cyber-attack detection is used to identify cyber-attacks while they are acting on a computer and network system to compromise the security (e.g., availability, integrity, and confidentiality) of the system. This paper presents a cyber-attack detection technique based on anomaly detection and discusses the robustness of the modeling technique employed. In this technique, a Markov-chain model represents a profile of computer-event transitions in a normal/usual operating condition of a computer and network system (a norm profile). The Markov-chain model of the norm profile is generated from historic data of the system's normal activities. The observed activities of the system are analyzed to infer the probability that the Markov-chain model of the norm profile supports the observed activities. The lower the probability the observed activities receive from the Markov-chain model of the norm profile, the more likely the observed activities are anomalies resulting from cyber-attacks, and vice versa. This paper presents the learning and inference algorithms of this anomaly-detection technique based on the Markov-chain model of a norm profile, and examines its performance using the audit data of UNIX-based host machines running the Solaris operating system.

The robustness of the Markov-chain model for cyber-attack detection is examined through discussion and application. When the Markov-chain technique or other stochastic-process techniques are used to model the sequential ordering of events, the quality of the activity data plays an important role in intrusion-detection performance. The Markov-chain technique is not robust to noise in the data (the mixture level of normal activities and intrusive activities) and produces desirable performance only at a low noise level. This study also shows that the performance of the Markov-chain technique is not always robust to the window size: as the window size increases, the amount of noise in the window generally increases as well. Overall, this study provides some support for the idea that the Markov-chain technique might not be as robust as other intrusion-detection methods, such as the chi-square distance test technique, although it can produce better performance than the chi-square distance test when the noise level of the data is low, as with the Mill & Pascal data in this study.
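
The essence of the norm-profile approach can be sketched in a few lines: estimate a first-order transition matrix from normal-activity events, then score observation windows by their average transition log-likelihood, with low scores flagging likely attacks. The event alphabet, smoothing constant, and window size below are assumptions made for illustration, not the paper's configuration.

```python
import numpy as np

def fit_norm_profile(events, n_states, alpha=1.0):
    """Estimate transition probabilities from normal-activity event data,
    with additive smoothing so unseen transitions keep nonzero probability."""
    counts = np.full((n_states, n_states), alpha)
    for a, b in zip(events, events[1:]):
        counts[a, b] += 1
    return counts / counts.sum(axis=1, keepdims=True)

def window_log_likelihood(window, P):
    """Average log-probability of the observed transitions under the norm
    profile; lower values indicate more anomalous activity."""
    logps = [np.log(P[a, b]) for a, b in zip(window, window[1:])]
    return float(np.mean(logps))

rng = np.random.default_rng(0)
# Hypothetical audit-event stream: 6 event types, with normal behavior
# cycling through a few dominant transitions most of the time.
normal = [(i % 6) if rng.random() < 0.9 else rng.integers(6)
          for i in range(5000)]
P = fit_norm_profile(normal, n_states=6)

normal_window = normal[:50]
attack_window = list(rng.integers(0, 6, size=50))   # structureless activity
print("normal window score:", round(window_log_likelihood(normal_window, P), 3))
print("attack window score:", round(window_log_likelihood(attack_window, P), 3))
```

Mixing intrusive events into the training stream degrades the estimated transition matrix, which is the noise-sensitivity issue the abstract describes.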


Archive | 2005

Design and analysis of gauge R&R studies: making decisions with confidence intervals in random and mixed ANOVA models

Richard K. Burdick; Connie M. Borror; Douglas C. Montgomery

Preface
1. Introduction
2. Balanced One-Factor Random Models
3. Balanced Two-Factor Crossed Random Models with Interaction
4. Design of Gauge R&R Experiments
5. Balanced Two-Factor Crossed Random Models with No Interaction
6. Balanced Two-Factor Crossed Mixed Models
7. Unbalanced One- and Two-Factor Models
8. Strategies for Constructing Intervals with ANOVA Models
Appendix A. The Analysis of Variance
Appendix B. MLS and GCI Methods
Appendix C. Tables of F-values
Bibliography
Index


Journal of Quality Technology | 1998

Poisson EWMA Control Charts

Connie M. Borror; Charles W. Champ; Steven E. Rigdon

An exponentially weighted moving average control chart for monitoring Poisson data is introduced. The charting procedure is evaluated using a Markov chain approximation, and its average run length is compared to that of other procedures for Poisson data…
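
The Markov chain approximation mentioned in the abstract works by discretizing the in-control region of the EWMA statistic into states and solving a linear system for the expected time to absorption (a signal). A minimal sketch follows, with parameter values assumed purely for illustration; it uses scipy for the Poisson distribution function.

```python
import numpy as np
from scipy.stats import poisson

def poisson_ewma_arl(mu=4.0, lam=0.2, A=2.8, m=200):
    """In-control ARL of a Poisson EWMA chart via the Markov-chain
    approximation: discretize the in-control region into m states and
    solve (I - Q) ARL = 1 over the transient states."""
    half = A * np.sqrt(lam * mu / (2 - lam))
    lcl, ucl = mu - half, mu + half
    edges = np.linspace(lcl, ucl, m + 1)
    mid = (edges[:-1] + edges[1:]) / 2

    Q = np.zeros((m, m))
    for i in range(m):
        # z' = lam*x + (1-lam)*mid[i] lands in state j when the integer
        # count x falls in an interval; the Poisson cdf gives that mass.
        lo = (edges[:-1] - (1 - lam) * mid[i]) / lam
        hi = (edges[1:] - (1 - lam) * mid[i]) / lam
        Q[i] = poisson.cdf(np.ceil(hi) - 1, mu) - poisson.cdf(np.ceil(lo) - 1, mu)

    arl = np.linalg.solve(np.eye(m) - Q, np.ones(m))
    start = np.argmin(np.abs(mid - mu))     # chart starts at z0 = mu
    return arl[start]

print(f"approximate in-control ARL: {poisson_ewma_arl():.0f}")
```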


Journal of Quality Technology | 2003

Genetic Algorithms for the Construction of D-Optimal Designs

Alejandro Heredia-Langner; W. M. Carlyle; Douglas C. Montgomery; Connie M. Borror; George C. Runger

Computer-generated designs are useful for situations where standard factorial, fractional factorial, or response surface designs cannot easily be employed. Alphabetically-optimal designs are the most widely used type of computer-generated design, and of these, the D-optimal (or D-efficient) class is extremely popular. D-optimal designs are usually constructed by algorithms that sequentially add and delete points from a potential design, drawing on a candidate set of points spaced over the region of interest. We present a technique for generating D-efficient designs using genetic algorithms (GAs). This approach eliminates the need to explicitly consider a candidate set of experimental points, and it can be used in highly constrained regions while maintaining a level of performance comparable to more traditional design construction techniques.
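
A bare-bones version of the candidate-set-free idea: encode each candidate design directly as an n-by-k matrix of real factor settings, and let a genetic algorithm maximize the D-criterion |X'X|. Everything below (model, region, GA settings) is an illustrative assumption and far simpler than the operators developed in the paper.

```python
import numpy as np

rng = np.random.default_rng(3)
n_runs, k = 6, 2          # runs and factors (illustrative)

def model_matrix(D):
    """First-order model with intercept: columns [1, x1, x2]."""
    return np.column_stack([np.ones(len(D)), D])

def d_criterion(D):
    return np.linalg.det(model_matrix(D).T @ model_matrix(D))

def evolve(pop_size=40, generations=300, sigma=0.15):
    # Each chromosome is an entire n_runs x k design on [-1, 1]^k;
    # no candidate set of points is ever enumerated.
    pop = rng.uniform(-1, 1, size=(pop_size, n_runs, k))
    for _ in range(generations):
        fit = np.array([d_criterion(d) for d in pop])
        order = np.argsort(fit)[::-1]
        parents = pop[order[: pop_size // 2]]       # elitist selection
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = parents[rng.integers(len(parents), size=2)]
            mask = rng.random((n_runs, 1)) < 0.5    # run-wise crossover
            child = np.where(mask, a, b)
            child += rng.normal(0, sigma, child.shape)  # Gaussian mutation
            children.append(np.clip(child, -1, 1))
        pop = np.concatenate([parents, np.array(children)])
    best = max(pop, key=d_criterion)
    return best, d_criterion(best)

design, det_val = evolve()
print("best |X'X| found:", round(det_val, 2))
print(np.round(design, 2))
```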


Journal of Quality Technology | 2004

Model-robust optimal designs: A genetic algorithm approach

Alejandro Heredia-Langner; Douglas C. Montgomery; W. Matthew Carlyle; Connie M. Borror

A model-robust design is an experimental array that has high efficiency with respect to a particular optimization criterion for every member of a set of candidate models that are of interest to the experimenter. We present a technique to construct model-robust alphabetically-optimal designs using genetic algorithms. The technique is useful in situations where computer-generated designs are most likely to be employed, particularly experiments with mixtures and response surface experiments in constrained regions. Examples illustrating the procedure are provided.
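
The model-robust criterion can be illustrated by scoring a design against its worst case over a small candidate set of models; a search procedure such as the GA sketched above would then maximize this score. The candidate models and the scaled D-criterion used below are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def expand(D, model):
    """Build the model matrix for one candidate model, given as a list of
    term functions applied to the design D."""
    return np.column_stack([np.ones(len(D))] + [f(D) for f in model])

# Candidate models over two factors: first-order, +interaction, full quadratic.
x1 = lambda D: D[:, 0]
x2 = lambda D: D[:, 1]
x1x2 = lambda D: D[:, 0] * D[:, 1]
models = [
    [x1, x2],
    [x1, x2, x1x2],
    [x1, x2, x1x2, lambda D: D[:, 0] ** 2, lambda D: D[:, 1] ** 2],
]

def robust_score(D):
    """Worst-case scaled D-criterion det(X'X)^(1/p)/n across the candidate
    set, so one poorly served model drags the whole design's score down."""
    scores = []
    for model in models:
        X = expand(D, model)
        p = X.shape[1]
        scores.append(np.linalg.det(X.T @ X) ** (1 / p) / len(D))
    return min(scores)

rng = np.random.default_rng(5)
# Compare a random 9-run design to the 3^2 factorial on the same region.
random_design = rng.uniform(-1, 1, size=(9, 2))
factorial = np.array([[a, b] for a in (-1, 0, 1) for b in (-1, 0, 1)])
print("random design :", round(robust_score(random_design), 3))
print("3^2 factorial :", round(robust_score(factorial), 3))
```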


Journal of Quality Technology | 2002

Evaluation of Statistical Designs for Experiments Involving Noise Variables

Connie M. Borror; Douglas C. Montgomery; Raymond H. Myers

In process robustness studies, it is desirable to simultaneously minimize the influence of noise factors on the system and determine the levels of controllable factors that optimize the overall response or outcome. This paper presents a methodology for evaluating designed experiments that involve both controllable and uncontrollable, or noise, factors. Two variance expressions are developed for evaluating competing experimental design strategies. The maximum, average, and minimum scaled prediction error variances resulting from the fitted models are displayed visually on variance dispersion graphs. The scaled prediction error variances account for mean model errors as well as variation transmitted to the process by the noise variables.
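
The quantities plotted on a variance dispersion graph can be sketched directly: evaluate the scaled prediction variance N f(x)'(X'X)^-1 f(x) around circles of increasing radius and record the maximum, average, and minimum at each radius. The design and model below are illustrative, and this simple version omits the noise-variable transmission terms the paper builds into its variance expressions.

```python
import numpy as np

def spv(x, X):
    """Scaled prediction variance N * f(x)' (X'X)^-1 f(x) for a
    first-order model with interaction in two factors."""
    f = np.array([1, x[0], x[1], x[0] * x[1]])
    XtX_inv = np.linalg.inv(X.T @ X)
    return len(X) * f @ XtX_inv @ f

# 2^2 factorial with two center runs (illustrative design).
D = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1], [0, 0], [0, 0]])
X = np.column_stack([np.ones(len(D)), D, D[:, 0] * D[:, 1]])

# Evaluate SPV around circles of increasing radius, as on a
# variance dispersion graph: max/avg/min at each radius.
for radius in (0.25, 0.5, 0.75, 1.0):
    theta = np.linspace(0, 2 * np.pi, 200, endpoint=False)
    vals = [spv((radius * np.cos(t), radius * np.sin(t)), X) for t in theta]
    print(f"r={radius:.2f}  max={max(vals):.2f}  "
          f"avg={np.mean(vals):.2f}  min={min(vals):.2f}")
```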


International Journal of Production Research | 2003

Robustness of the time between events CUSUM

Connie M. Borror; J. Bert Keats; Douglas C. Montgomery

Many companies have set parts per million (PPM) or parts per million opportunities (PPMO) goals in their quest for continuous improvement. The time-between-events (TBE) CUSUM has been suggested for monitoring the number of good units or the number of opportunities that occur between discoveries of consecutive bad units. We focus on the robustness of the TBE CUSUM. Robustness here refers to the procedure's ability to make proper decisions regarding a shift in the mean defect rate when, in fact, the time between events is not exponentially distributed. We examine and report average run lengths (ARLs) under both Weibull and lognormal time-between-events distributions. Our results indicate that the TBE CUSUM is extremely robust for a wide variety of parameter values for both the Weibull and lognormal distributions. In practice, these results imply that users of the TBE CUSUM procedure need not be concerned about departures from the exponential TBE distribution. Practical implementation of the TBE CUSUM procedure is also discussed.
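
The robustness experiment can be mimicked with a short simulation: run a lower-sided time-between-events CUSUM designed for exponential data, but feed it Weibull and lognormal times rescaled to the same mean, and compare the in-control ARLs. The reference value k and decision interval h below are illustrative, not the paper's settings.

```python
import numpy as np

def tbe_cusum_run_length(draw, k=0.5, h=2.0, max_n=1_000_000):
    """Lower-sided time-between-events CUSUM: unusually short gaps between
    bad units push the statistic up; signal when it crosses h."""
    s = 0.0
    for i in range(1, max_n + 1):
        s = max(0.0, s + k - draw())
        if s > h:
            return i
    return max_n

rng = np.random.default_rng(11)
samplers = {
    "exponential(1)": lambda: rng.exponential(1.0),
    # Weibull rescaled to mean 1 so only the shape departs from exponential.
    "weibull(shape=2)": lambda: rng.weibull(2.0) / 0.8862,
    # Lognormal with mu = -sigma^2/2 so its mean is also 1.
    "lognormal": lambda: rng.lognormal(-0.125, 0.5),
}
for name, draw in samplers.items():
    arls = [tbe_cusum_run_length(draw) for _ in range(300)]
    print(f"{name:17s} in-control ARL ~ {np.mean(arls):.0f}")
```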

Collaboration


Dive into Connie M. Borror's collaboration.

Top Co-Authors

Rong Pan

Arizona State University

Nong Ye

Arizona State University
