Publication


Featured research published by Theodore T. Allen.


Quality and Reliability Engineering International | 2006

Six Sigma Literature: A Review and Agenda for Future Research

James E. Brady; Theodore T. Allen

Like quality management in general, Six Sigma has penetrated most sectors of today's business world. Although Six Sigma originated in industry, it has inspired a considerable amount of academic literature. This paper reviews this literature, describing its trends, sources, and findings. The paper also seeks to synthesize the literature, with an emphasis on establishing its relationship to quality management theory and topics for future research. In doing so, there is an attempt to answer the following fundamental questions: (i) What is Six Sigma? (ii) What are its impacts on operational performance? (iii) What roles can academics usefully play in relation to Six Sigma?


International Journal of Machine Tools & Manufacture | 2000

The use of FEA and design of experiments to establish design guidelines for simple hydroformed parts

Muammer Koç; Theodore T. Allen; Suwat Jiratheranat; Taylan Altan

We present models to predict the protrusion height of “Tee-shaped” hydroformed parts, both because this information is of direct relevance to engineers attempting to build such parts and because it illustrates an advantageous process for developing design guidelines for tube hydroforming (THF) in general. A newly proposed design of experiments technique, the Low Cost Response Surface Method (LCRSM), was used to facilitate the economical prediction and optimization of this height as a function of geometrical parameters, subject to thinning of the wall thickness at the protrusion region. The same methodology is also proposed for the economical investigation of other geometries and conditions. As a result of this investigation, not only were the known and expected parameter-effect trends verified, but numerical values within a practical range of parameters at certain conditions were also obtained. In addition, interactions between factors were revealed as predicted. Moreover, this information was gained from a substantially reduced number of finite element analysis (FEA) simulations via LCRSM compared with standard response surface method (RSM) or factorial techniques, avoiding costly physical experimentation.
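As a rough illustration of the response-surface idea behind this approach (the LCRSM designs and the paper's FEA data are not reproduced here), the sketch below fits a generic second-order model to a handful of invented FEA results and uses it to predict protrusion height:

```python
# Hypothetical illustration only: the paper's Low Cost Response Surface Method (LCRSM)
# designs are not reproduced here. This fits a generic second-order response surface
# to a few made-up FEA results relating two geometric ratios to protrusion height.
import numpy as np

# Invented design points: (diameter ratio d/D, corner-radius-to-thickness ratio r/t)
X = np.array([[0.6, 3.0], [0.6, 6.0], [0.8, 3.0], [0.8, 6.0], [0.7, 4.5], [0.7, 3.0]])
h = np.array([14.2, 16.8, 18.1, 21.5, 18.0, 16.3])  # protrusion heights (mm), invented

def quad_terms(x):
    """Second-order model terms: 1, x1, x2, x1*x2, x1^2, x2^2."""
    x1, x2 = x[:, 0], x[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])

beta, *_ = np.linalg.lstsq(quad_terms(X), h, rcond=None)  # least-squares fit
x_new = np.array([[0.75, 5.0]])
print("predicted protrusion height (mm):", quad_terms(x_new) @ beta)
```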


Technometrics | 2003

Supersaturated Designs That Maximize the Probability of Identifying Active Factors

Theodore T. Allen; Mikhail Bernshteyn

Supersaturated designs and associated analysis methods have been proposed by several authors to identify active factors in situations in which only a very limited number of experimental runs is available. We use simulation to evaluate the abilities of the existing methods to achieve model identification–related objectives. The results motivate a new class of supersaturated designs, derived from simulation optimization, that maximize the probability that stepwise regression will identify the important main effects. Because the proposed designs depend on specific assumptions, we also investigate the sensitivity of the performances of the alternative supersaturated designs to these assumptions.
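A minimal sketch of the kind of simulation study described, assuming a random ±1 supersaturated design and a simple greedy forward-selection routine rather than the paper's optimized designs and analysis settings:

```python
# Sketch only: estimate how often forward stepwise regression recovers the active
# main effects under a supersaturated design. The design here is random +/-1 columns,
# not the simulation-optimized designs proposed in the paper.
import numpy as np

rng = np.random.default_rng(0)
n_runs, n_factors, n_active, n_select = 12, 20, 3, 3

def forward_select(X, y, k):
    """Greedy forward selection of k columns by residual sum of squares."""
    chosen = []
    for _ in range(k):
        best, best_rss = None, np.inf
        for j in range(X.shape[1]):
            if j in chosen:
                continue
            cols = X[:, chosen + [j]]
            beta, *_ = np.linalg.lstsq(cols, y, rcond=None)
            rss = np.sum((y - cols @ beta) ** 2)
            if rss < best_rss:
                best, best_rss = j, rss
        chosen.append(best)
    return chosen

hits, n_sim = 0, 200
for _ in range(n_sim):
    D = rng.choice([-1.0, 1.0], size=(n_runs, n_factors))   # 20 factors in only 12 runs
    active = rng.choice(n_factors, size=n_active, replace=False)
    y = D[:, active] @ np.full(n_active, 3.0) + rng.normal(size=n_runs)
    if set(active) <= set(forward_select(D, y, n_select)):
        hits += 1
print("estimated probability all active factors identified:", hits / n_sim)
```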


Journal of the Royal Statistical Society: Series C (Applied Statistics) | 2003

An experimental design criterion for minimizing meta-model prediction errors applied to die casting process design

Theodore T. Allen; Liyang Yu; John Schmitz

We propose the expected integrated mean-squared error (EIMSE) experimental design criterion and show how we used it to design experiments to meet the needs of researchers in die casting engineering. This criterion expresses in a direct way the researchers’ goal to minimize the expected meta-model prediction errors, taking into account the effects of both random experimental errors and errors deriving from our uncertainty about the true model form. Because we needed to make assumptions about the prior distribution of model coefficients to estimate the EIMSE, we performed a sensitivity analysis to verify that the relative prediction performance of the design generated was largely insensitive to our assumptions. Also, we discuss briefly the general advantages of EIMSE optimal designs, including lower expected bias errors compared with popular response surface designs and substantially lower variance errors than certain Box–Draper all-bias designs.
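In notation assumed here rather than taken from the paper, a criterion of this type can be sketched as the expected integrated squared prediction error, which separates into variance and bias contributions:

```latex
% Sketch only: notation assumed, not taken from the paper.
% \hat{y}(\mathbf{x}) is the fitted meta-model, y(\mathbf{x}) the true response,
% R the prediction region; the expectation is over random experimental errors and
% the prior placed on the coefficients of the unknown true model.
\mathrm{EIMSE}
  = \mathbb{E}\left[ \int_{R} \bigl( \hat{y}(\mathbf{x}) - y(\mathbf{x}) \bigr)^{2} \, d\mathbf{x} \right]
  = \underbrace{\int_{R} \operatorname{Var}\bigl[\hat{y}(\mathbf{x})\bigr] \, d\mathbf{x}}_{\text{variance error}}
  + \underbrace{\mathbb{E}\left[ \int_{R} \bigl( \mathbb{E}[\hat{y}(\mathbf{x})] - y(\mathbf{x}) \bigr)^{2} \, d\mathbf{x} \right]}_{\text{bias error}}
```

Minimizing a quantity of this form over candidate designs trades off variance error from random experimental noise against bias error from fitting a possibly misspecified meta-model.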


Journal of Quality Technology | 2003

Constructing Meta-Models for Computer Experiments

Theodore T. Allen; Mikhail Bernshteyn; Khalil Kabiri-Bamoradian

We used three test functions to compare all combinations of five experimental design classes with either second-order response surface (RS) or kriging modeling methods. The findings included the following: 1) conclusions about which method performed best, even for a single case study, greatly depended on the specific experimental designs used to represent each class of designs; 2) unavoidable bias errors constituted the largest source of prediction errors when regression modeling was used with designs generated to address bias errors; and 3) estimation errors, which could be attributed to the use of the likelihood estimation objective, dominated prediction errors in kriging modeling. We tentatively conclude that, for cases in which the number of runs is comparable to the number of terms in a quadratic polynomial model, similar prediction errors can be expected from both kriging and regression modeling procedures as long as regression is used in combination with experimental designs generated to address bias errors.
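A rough, self-contained illustration of such a comparison, using a made-up test function and scikit-learn models rather than the paper's designs, test functions, or kriging implementation:

```python
# Rough illustration of the comparison described (not the paper's study): fit a
# second-order response surface and a kriging-style Gaussian process to the same
# small computer experiment and compare prediction error on held-out points.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def test_fn(X):          # stand-in deterministic "computer experiment"
    return np.sin(3 * X[:, 0]) + X[:, 1] ** 2

rng = np.random.default_rng(1)
X_train = rng.uniform(-1, 1, size=(10, 2))      # 10 runs, comparable to 6 quadratic terms
y_train = test_fn(X_train)
X_test = rng.uniform(-1, 1, size=(500, 2))
y_test = test_fn(X_test)

# Second-order polynomial response surface
rs = LinearRegression().fit(PolynomialFeatures(2).fit_transform(X_train), y_train)
rs_pred = rs.predict(PolynomialFeatures(2).fit_transform(X_test))

# Kriging-style Gaussian process with a squared-exponential correlation
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), normalize_y=True)
gp_pred = gp.fit(X_train, y_train).predict(X_test)

for name, pred in [("response surface", rs_pred), ("kriging/GP", gp_pred)]:
    print(name, "RMSE:", np.sqrt(np.mean((pred - y_test) ** 2)))
```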


IIE Transactions | 2013

The call for equity: simulation optimization models to minimize the range of waiting times

Muer Yang; Theodore T. Allen; Michael J. Fry; W. David Kelton

Providing equal access to public service resources is a fundamental goal of democratic societies. Growing research interest in public services (e.g., health care, humanitarian relief, elections) has increased the importance of considering objective functions related to equity. This article studies discrete resource allocation problems where the decision maker is concerned with maintaining equity between some defined subgroups of a customer population and where non-closed-form functions of equity are allowed. Simulation optimization techniques are used to develop rigorous algorithms to allocate resources equitably among these subgroups. The presented solutions are associated with probabilistic bounds on solution quality. A full-factorial experimental design demonstrates that the proposed algorithm outperforms competing heuristics and is robust over various inequity metrics. Additionally, the algorithm is applied to a case study of allocating voting machines to election precincts in Franklin County, Ohio. [Supplementary material is available for this article. Go to the publisher’s online edition of IIE Transactions for the Appendices to the article.]
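The paper's algorithms are not reproduced here; as a hedged sketch of the underlying allocation problem, the following greedy heuristic repeatedly gives the next machine to the precinct with the worst expected M/M/c waiting time, which tends to shrink the range of waits across precincts:

```python
# Hedged sketch, not the paper's simulation-optimization algorithm: a greedy heuristic
# for allocating voting machines using steady-state M/M/c expected waits (Erlang C).
import math

def mmc_wait(lam, mu, c):
    """Expected queue wait for arrival rate lam, service rate mu, and c servers."""
    rho = lam / (c * mu)
    if rho >= 1.0:
        return float("inf")
    a = lam / mu
    p0_inv = sum(a**k / math.factorial(k) for k in range(c)) + a**c / (math.factorial(c) * (1 - rho))
    prob_wait = (a**c / (math.factorial(c) * (1 - rho))) / p0_inv
    return prob_wait / (c * mu - lam)

# Invented precinct data: expected voter arrivals per hour; 12 voters/hour per machine
arrivals = [160, 95, 220, 130]
mu, total_machines = 12.0, 60
alloc = [1] * len(arrivals)                       # every precinct gets at least one machine
for _ in range(total_machines - sum(alloc)):
    waits = [mmc_wait(l, mu, c) for l, c in zip(arrivals, alloc)]
    alloc[waits.index(max(waits))] += 1           # next machine goes to the worst-off precinct

final_waits = [mmc_wait(l, mu, c) for l, c in zip(arrivals, alloc)]
print("allocation:", alloc)
print("range of expected waits (hours):", max(final_waits) - min(final_waits))
```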


Chance | 2006

Mitigating Voter Waiting Times

Theodore T. Allen; Mikhail Bernshteyn

Many around the world are aware that lines in Franklin County, Ohio, likely deterred thousands of would-be voters in 2004. Moreover, a report commissioned by the Democratic National Committee showed that African Americans who voted waited much longer than others. Therefore, it seems likely that a disproportionate number of African Americans were deterred from voting. What caused the long lines in some voting precincts and no lines in others? How can such lines be avoided in the future? Clearly, purchasing more voting equipment will help, but how much more equipment? And how should it be allocated? In many places in the United States, the allocation of voting machines to precincts is done at the county level. Typically, county administrators have access to thousands of machines costing several thousand dollars each. Counties also pay set-up and operation costs for the machines that are comparable to the purchase costs. In some cases, county administrators allocate the machines based on a precise formula or algorithm. In other cases, they use experience and expert judgment. In any case, they allocate the machines with the goal of avoiding long waiting times. The 2004 election put voting systems to the test because an unprecedented number of people voted. This situation placed added scrutiny on the methods for voting machine allocation and motivated the scientific study of voting machine allocation written about here. The goals of the study include establishing theoretical models relating machine allocations to waiting times; studying historical practices and their consequences using statistical theory, with Franklin County, Ohio, as a case study; proposing a new method for machine allocation; and using theory and historical data to clarify potential advantages of the new method.


Quality Engineering | 2009

Improving the Hospital Discharge Process with Six Sigma Methods

Theodore T. Allen; Shih-Hsien Tseng; Kerry Swanson; Mary Ann McClay

This article describes the application of a five-phase Six Sigma define, measure, analyze, improve, and control (DMAIC) approach to streamline patient discharge at a community hospital. Within the context of the five phases, the team applied statistical process control (SPC) charting, process mapping, Pareto charting, and cause-and-effect matrices to make decisions. The findings suggested that focusing on physician preparation for discharge order writing would have the greatest impact. A significant reduction in the average discharge time from 3.3 to 2.8 h was realized (p = 0.06), and missing chart data was reduced by 62%.
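For illustration only (the study's data and exact analysis are not given in the abstract), a before/after reduction of this kind is often assessed with a two-sample test; a minimal sketch on synthetic discharge times:

```python
# Illustrative only: the article reports a reduction from 3.3 h to 2.8 h with p = 0.06.
# This sketch shows one common way such a comparison could be assessed, using a
# two-sample Welch t-test on synthetic discharge times, not the study's data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
before = rng.normal(loc=3.3, scale=1.0, size=40)   # synthetic pre-improvement times (h)
after = rng.normal(loc=2.8, scale=1.0, size=40)    # synthetic post-improvement times (h)

t_stat, p_value = stats.ttest_ind(before, after, equal_var=False, alternative="greater")
print(f"mean before = {before.mean():.2f} h, mean after = {after.mean():.2f} h, p = {p_value:.3f}")
```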


Journal of Product & Brand Management | 2004

Using focus group data to set new product prices

Theodore T. Allen; Kristen M. Maybin

Describes a method for setting new product prices at time of launch using data from focus groups and illustrates its application to recommending the price of toothpaste gum for product launch. The proposed method is based on optimization of the expected utility estimated from participant responses to four simple questions. The method is designed to address uncertainties from sampling error and consumer overconfidence. In addition to the formal optimization‐based procedure, introduces plotting techniques to facilitate subjective decision making. Also, compares the proposed approach with the popular “Van Westendorp method” and demonstrates plausible conditions under which the Van Westendorp recommendations would result in prices below those that maximize expected profit.
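A minimal sketch of the general idea of setting a price by expected-profit maximization, with invented focus-group numbers and a hypothetical unit cost; it does not reproduce the paper's four-question procedure or its treatment of sampling error and consumer overconfidence:

```python
# Hedged sketch, not the paper's method: pick the launch price that maximizes expected
# profit estimated from stated willingness to buy at candidate prices (numbers invented).
import numpy as np

prices = np.array([1.99, 2.49, 2.99, 3.49, 3.99])   # candidate launch prices ($)
would_buy = np.array([18, 15, 11, 6, 3])             # participants saying "yes" at each price
n_participants = 20
unit_cost = 1.10                                      # hypothetical variable cost per unit ($)

buy_prob = would_buy / n_participants                 # crude purchase-probability estimate
expected_profit = (prices - unit_cost) * buy_prob     # expected profit per potential customer
best = int(np.argmax(expected_profit))
print(f"recommended price: ${prices[best]:.2f} "
      f"(expected profit per potential customer ${expected_profit[best]:.2f})")
```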


Journal of Manufacturing Systems | 2001

A method for robust process design based on direct minimization of expected loss applied to arc welding

Theodore T. Allen; Waraporn Ittiwattana; Richard W. Richardson; Gary P. Maul

Robust process design seeks to maximize process performance, taking into account uncertainty in the “noise” factors that cannot be controlled. A methodology for robust process design is presented that is based on direct minimization of the expected loss. The proposed methods are compared with alternatives, including methods based on Taguchi's signal-to-noise ratios. Several formulations of the expected loss are explored, including formulations that account for losses from parts inside the specification limits and that permit the use of deterministic optimization methods. The method is illustrated through its application to the design of robotic gas metal arc welding (GMAW) parameter settings.
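A hedged sketch of direct expected-loss minimization; the welding response model, loss function, and factor bounds below are invented stand-ins, not the paper's formulations:

```python
# Sketch only: estimate expected quadratic loss by Monte Carlo over a noise factor and
# minimize it over the control settings. The process model here is invented.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(7)
TARGET = 5.0                                  # target response (arbitrary units)
noise = rng.normal(0.0, 0.3, size=2000)       # Monte Carlo draws of an uncontrolled noise factor

def response(x, z):
    """Invented process model: output depends on control settings x and noise z."""
    voltage, speed = x
    return 1.0 * voltage - 1.5 * speed + 0.4 * voltage * z + z

def expected_loss(x):
    """Expected quadratic loss about the target, averaged over the noise distribution."""
    y = response(x, noise)
    return float(np.mean((y - TARGET) ** 2))

result = minimize(expected_loss, x0=np.array([6.0, 1.0]),
                  bounds=[(4.0, 8.0), (0.5, 2.0)], method="L-BFGS-B")
print("robust settings (voltage, speed):", result.x, " expected loss:", result.fun)
```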
