Publication


Featured research published by W. Matthew Carlyle.


Interfaces | 2006

Defending Critical Infrastructure

Gerald G. Brown; W. Matthew Carlyle; Javier Salmerón; R. Kevin Wood

We apply new bilevel and trilevel optimization models to make critical infrastructure more resilient against terrorist attacks. Each model features an intelligent attacker (terrorists) and a defender (us), information transparency, and sequential actions by attacker and defender. We illustrate with examples of the US Strategic Petroleum Reserve, the US Border Patrol at Yuma, Arizona, and an electrical transmission system. We conclude by reporting insights gained from the modeling experience and many “red-team” exercises. Each exercise gathers open-source data on a real-world infrastructure system, develops an appropriate bilevel or trilevel model, and uses these to identify vulnerabilities in the system or to plan an optimal defense.
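The defender-attacker structure described above can be illustrated with a toy sketch: a bilevel model in which the defender hardens k components and an attacker, observing that choice, strikes the most valuable exposed one. Component names and values are hypothetical, and brute-force enumeration stands in for the decomposition methods such models require at realistic scale.

```python
from itertools import combinations

# Hypothetical infrastructure components and the damage if each is attacked.
values = {"pump_A": 10, "pump_B": 7, "pipeline": 9, "depot": 4}

def worst_case_damage(hardened):
    # Inner problem: the attacker strikes the most valuable exposed component.
    exposed = [v for name, v in values.items() if name not in hardened]
    return max(exposed, default=0)

def best_defense(k):
    # Outer problem: the defender minimizes the attacker's best response.
    return min(combinations(values, k), key=worst_case_damage)

defense = best_defense(2)
print(sorted(defense), worst_case_damage(defense))
```

With k = 2 the defender hardens the two most valuable components, leaving a worst-case loss of 7; the sequential, full-information structure is what makes this bilevel rather than a one-shot optimization.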


Journal of Quality Technology | 2000

Optimization problems and methods in quality control and improvement

W. Matthew Carlyle; Douglas C. Montgomery; George C. Runger

The connection between optimization methods and statistics dates back at least to the early part of the 19th century and encompasses many aspects of applied and theoretical statistics, including hypothesis testing, parameter estimation, model selection, design of experiments, and process control. This paper is an overview of some of the more frequently encountered optimization problems in statistics, with a focus on quality control and improvement. Descriptions of a variety of optimization procedures are given, including direct search methods, mathematical programming algorithms such as the generalized reduced gradient method, and heuristic approaches such as simulated annealing and genetic algorithms. We hope both to stimulate more interaction between the statistics and optimization methodology communities and to create more awareness of the important role that optimization methods play in quality control and improvement.
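As a flavor of the heuristic approaches surveyed, here is a minimal simulated-annealing sketch minimizing a one-dimensional quadratic "quality loss"; the loss function, target, and cooling schedule are illustrative assumptions, not anything from the paper.

```python
import math, random

random.seed(0)

def loss(x):
    # Hypothetical quality loss: squared deviation from a target setting of 7.
    return (x - 7) ** 2

x, T = 0, 10.0
while T > 1e-3:
    cand = x + random.choice([-1, 1])  # propose a neighboring setting
    delta = loss(cand) - loss(x)
    # Accept improving moves always; worsening moves with prob e^(-delta/T),
    # which lets the search escape local minima early on.
    if delta < 0 or random.random() < math.exp(-delta / T):
        x = cand
    T *= 0.99  # geometric cooling schedule

print(x)
```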


Computers & Operations Research | 2005

Minimizing total weighted tardiness on a single batch process machine with incompatible job families

Imelda C. Perez; John W. Fowler; W. Matthew Carlyle

The diffusion step in semiconductor wafer fabrication is very time consuming, compared to other steps in the process, and performance in this area has a significant impact on overall factory performance. Diffusion furnaces are able to process multiple lots of similar wafers at a time, and are therefore appropriately modeled as batch processing machines with incompatible job families. Due to the importance of on-time delivery in semiconductor manufacturing, we focus on minimizing the total weighted tardiness in this environment. The resulting problem is NP-hard, and we decompose it into two sequential decision problems: assigning lots to batches followed by sequencing the batches. We develop several heuristics for these subproblems and test their performance.
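A minimal sketch of the two-stage decomposition described above, under assumed data: lots are grouped into same-family batches, batches are sequenced by an earliest-due-date rule, and total weighted tardiness is evaluated. The instance, capacity, and greedy rules here are hypothetical and far simpler than the paper's heuristics.

```python
# Hypothetical lots: (family, processing_time, due_date, weight).
jobs = [
    ("A", 4, 6, 2), ("A", 4, 9, 1), ("B", 5, 5, 3), ("B", 5, 14, 1),
]
CAPACITY = 2  # assumed furnace capacity, in lots per batch

# Stage 1: assign lots to batches; only same-family lots may share a batch.
batches = {}
for fam, p, d, w in jobs:
    batches.setdefault(fam, []).append((p, d, w))
assert all(len(js) <= CAPACITY for js in batches.values())

# A batch takes the family processing time; its due date is the earliest
# due date among its lots (a common surrogate).
batch_list = [(fam, js[0][0], min(d for _, d, _ in js), js)
              for fam, js in batches.items()]

# Stage 2: sequence batches by earliest due date.
batch_list.sort(key=lambda b: b[2])

# Evaluate total weighted tardiness of the schedule.
t = twt = 0
for fam, p, due, js in batch_list:
    t += p
    twt += sum(w * max(0, t - d) for _, d, w in js)
print(twt)
```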


Operations Research | 2005

A Two-Sided Optimization for Theater Ballistic Missile Defense

Gerald G. Brown; W. Matthew Carlyle; Douglas Diehl; Jeffrey E. Kline; R. Kevin Wood

We describe JOINT DEFENDER, a new two-sided optimization model for planning the pre-positioning of defensive missile interceptors to counter an attack threat. In our basic model, a defender pre-positions ballistic missile defense platforms to minimize the worst-case damage an attacker can achieve; we assume that the attacker will be aware of defensive pre-positioning decisions, and that both sides have complete information as to target values, attacking-missile launch sites, weapon system capabilities, etc. Other model variants investigate the value of secrecy by restricting the attacker's and/or defender's access to information. For a realistic scenario, we can evaluate a completely transparent exchange in a few minutes on a laptop computer, and can plan near-optimal secret defenses in seconds. JOINT DEFENDER's mathematical foundation and its computational efficiency complement current missile-defense planning tools that use heuristics or supercomputing. The model can also provide unique insight into the value of secrecy and deception to either side. We demonstrate with two hypothetical North Korean scenarios.


Operations Research | 2009

Interdicting a Nuclear-Weapons Project

Gerald G. Brown; W. Matthew Carlyle; Robert C. Harney; Eric Skroch; R. Kevin Wood

A “proliferator” seeks to complete a first small batch of fission weapons as quickly as possible, whereas an “interdictor” wishes to delay that completion for as long as possible. We develop and solve a max-min model that identifies resource-limited interdiction actions that maximally delay completion time of the proliferator's weapons project, given that the proliferator will observe any such actions and adjust his plans to minimize that time. The model incorporates a detailed project-management (critical path method) submodel, and standard optimization software solves the model in a few minutes on a personal computer. We exploit off-the-shelf project-management software to manage a database, control the optimization, and display results. Using a range of levels for interdiction effort, we analyze a published case study that models three alternate uranium-enrichment technologies. The task of “cascade loading” appears in all technologies and turns out to be an inherent fragility for the proliferator at all levels of interdiction effort. Such insights enable policy makers to quantify the effects of interdiction options at their disposal, be they diplomatic, economic, or military.
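The critical path method submodel at the model's core computes project completion as the longest path through the task network. A minimal sketch with a hypothetical four-task network (durations, and all task names other than "cascade loading", are invented for illustration):

```python
# Hypothetical task network: task -> (duration, predecessors).
tasks = {
    "design_cascade": (3, []),
    "procure_centrifuges": (5, []),
    "cascade_loading": (4, ["design_cascade", "procure_centrifuges"]),
    "enrichment_run": (6, ["cascade_loading"]),
}

finish = {}
def earliest_finish(t):
    # Longest-path recursion: a task finishes its duration after all
    # of its predecessors have finished.
    if t not in finish:
        dur, preds = tasks[t]
        finish[t] = dur + max((earliest_finish(p) for p in preds), default=0)
    return finish[t]

completion = max(earliest_finish(t) for t in tasks)
print(completion)
```

Interdiction lengthens a task's duration; re-solving shows whether the delay propagates to project completion. A task that lies on every path, like the cascade-loading step in the abstract, delays completion one-for-one, which is why it is an attractive interdiction target.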


Journal of Quality Technology | 2004

Model-robust optimal designs: A genetic algorithm approach

Alejandro Heredia-Langner; Douglas C. Montgomery; W. Matthew Carlyle; Connie M. Borror

A model-robust design is an experimental array that has high efficiency with respect to a particular optimization criterion for every member of a set of candidate models that are of interest to the experimenter. We present a technique to construct model-robust alphabetically-optimal designs using genetic algorithms. The technique is useful in situations where computer-generated designs are most likely to be employed, particularly experiments with mixtures and response surface experiments in constrained regions. Examples illustrating the procedure are provided.
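A minimal sketch of the core idea, not the paper's algorithm: a genetic algorithm searching for a 6-run design that maximizes the D-criterion det(X'X) over a small candidate grid. The model, grid, and GA parameters are all illustrative assumptions.

```python
import itertools, random
import numpy as np

random.seed(1)
cands = list(itertools.product([-1.0, 0.0, 1.0], repeat=2))  # 3x3 candidate grid
N_RUNS = 6

def model_matrix(design):
    # Assumed model: intercept, two main effects, and their interaction.
    return np.array([[1.0, x1, x2, x1 * x2] for x1, x2 in design])

def d_crit(idx):
    # D-criterion: determinant of the information matrix X'X.
    X = model_matrix([cands[i] for i in idx])
    return float(np.linalg.det(X.T @ X))

# Each chromosome is a list of candidate indices (repeated runs allowed).
pop = [[random.randrange(len(cands)) for _ in range(N_RUNS)] for _ in range(20)]
for _ in range(60):
    pop.sort(key=d_crit, reverse=True)
    parents = pop[:10]                       # elitist truncation selection
    children = []
    for _ in range(10):
        a, b = random.sample(parents, 2)
        child = a[:3] + b[3:]                # one-point crossover
        child[random.randrange(N_RUNS)] = random.randrange(len(cands))  # mutation
        children.append(child)
    pop = parents + children

best = max(pop, key=d_crit)
print(d_crit(best))
```

A model-robust version of this search would score each chromosome against every candidate model and optimize the worst (or weighted) efficiency rather than a single criterion.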


Operations Research | 2002

D-Optimal Sequential Experiments for Generating a Simulation-Based Cycle Time-Throughput Curve

Sungmin Park; John W. Fowler; Gerald T. Mackulak; J. Bert Keats; W. Matthew Carlyle

A cycle time-throughput curve quantifies the relationship of average cycle time to throughput rates in a manufacturing system. Moreover, it indicates the asymptotic capacity of a system. Such a curve is used to characterize system performance over a range of start rates. Simulation is a fundamental method for generating such curves since simulation can handle the complexity of real systems with acceptable precision and accuracy. A simulation-based cycle time-throughput curve requires a large amount of simulation output data; the precision and accuracy of a simulated curve may be poor if there is insufficient simulation data. To overcome these problems, sequential simulation experiments based on a nonlinear D-optimal design are suggested. Using the nonlinear shape of the curve, such a design pinpoints p starting design points, and then sequentially ranks the remaining n − p candidate design points, where n is the total number of possible design points being considered. A model of a semiconductor wafer fabrication facility is used to validate the approach. The sequences of experimental runs generated can be used as references for simulation experimenters.
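The nonlinear shape the design exploits can be seen in the simplest queueing stand-in for a factory, an M/M/1 station, where mean cycle time is 1/(μ − λ) and blows up as throughput λ approaches capacity μ. The paper simulates a full wafer fab; this closed-form curve, with a hypothetical capacity, only illustrates the shape.

```python
MU = 10.0  # hypothetical station capacity, lots per hour

def cycle_time(start_rate):
    # M/M/1 mean time in system; diverges as start_rate -> MU.
    assert 0 <= start_rate < MU, "throughput cannot reach capacity"
    return 1.0 / (MU - start_rate)

for lam in (5.0, 8.0, 9.0, 9.9):
    print(f"throughput {lam:4.1f} -> cycle time {cycle_time(lam):6.2f} h")
```

The steep, asymptotic right end of this curve is exactly where simulation estimates are noisiest, which motivates concentrating sequential design points there.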


Journal of Quality Technology | 2003

Analysis of supersaturated designs

Don R. Holcomb; Douglas C. Montgomery; W. Matthew Carlyle

Supersaturated designs offer a potentially useful way to investigate many factors with very few experimental runs. These designs are used to investigate m factors with n experimental runs, where m > n−1. We evaluate several methods for analyzing a broad range of supersaturated designs and provide a basic explanation of these procedures. We show that the contrasts of a supersaturated design follow a permuted multivariate hypergeometric distribution, which may be approximated with a normal distribution. The analysis methods presented are based on methods for unreplicated fractional factorial designs. Two contrast-based analysis methods are presented, and the assumptions of the underlying model are described for a wide range of supersaturated designs.
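A minimal sketch of the contrast computation these analysis methods build on, with a hypothetical m = 5 factor, n = 4 run design (so m > n − 1). With so few runs, some columns are exact negatives of others, the full aliasing that makes supersaturated analysis nontrivial.

```python
import numpy as np

# Hypothetical supersaturated design: 4 runs, 5 two-level factors.
# Note columns 3 and 4 (and 2 and 5) are exact negatives: full aliasing.
X = np.array([
    [ 1,  1,  1, -1, -1],
    [ 1, -1, -1,  1,  1],
    [-1,  1, -1,  1, -1],
    [-1, -1,  1, -1,  1],
])
y = np.array([8.0, 6.0, 3.0, 1.0])  # hypothetical responses

# The contrast for factor j is the dot product of its column with y.
contrasts = X.T @ y
print(contrasts)  # a large |c_j| flags a potentially active factor
```

Aliased columns necessarily produce contrasts that are negatives of each other, so identifying which factor is truly active requires the distributional arguments the paper develops rather than the usual orthogonal-design analysis.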


Quality Engineering | 2004

Response Surface Modeling and Optimization in Multiresponse Experiments Using Seemingly Unrelated Regressions

Harry K. Shah; Douglas C. Montgomery; W. Matthew Carlyle

Response surface methodology (RSM) is widely used for optimizing manufacturing processes and product designs. Most applications of RSM involve several response variables. In a typical RSM study, the experimenter will fit an empirical model, such as a second-order model, to each response and use these models to determine settings on the design variables that produce optimal or at least acceptable values for the responses. In most multiple-response RSM problems, the experimenter fits a model to each response using ordinary least squares (OLS). This article illustrates another estimation technique useful in multiple-response RSM problems, seemingly unrelated regressions (SUR). This technique can be very useful when response variables in a multiple-response RSM problem are correlated. Essentially, SUR produces more precise estimates of the model parameters than OLS when responses are correlated. This improved precision of estimation can lead to a more precise estimate of the optimum operating conditions on the process.
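A hedged sketch of the SUR idea as feasible generalized least squares, not the article's code: fit each response by OLS, estimate the cross-response error covariance from the residuals, then re-estimate all coefficients jointly. The data are simulated with correlated errors; all numbers are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
X1 = np.column_stack([np.ones(n), x1])   # regressors, response 1
X2 = np.column_stack([np.ones(n), x2])   # regressors, response 2
# Errors correlated across the two responses (rho = 0.8).
e = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.8], [0.8, 1.0]], size=n)
y1 = 2 + 3 * x1 + e[:, 0]
y2 = 1 - 2 * x2 + e[:, 1]

def ols(X, y):
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Stage 1: equation-by-equation OLS residuals give a covariance estimate.
r1 = y1 - X1 @ ols(X1, y1)
r2 = y2 - X2 @ ols(X2, y2)
S = np.cov(np.vstack([r1, r2]))          # 2x2 cross-equation covariance

# Stage 2: GLS on the stacked system with Omega = S (kron) I_n.
X = np.block([[X1, np.zeros_like(X2)],
              [np.zeros_like(X1), X2]])
y = np.concatenate([y1, y2])
Oinv = np.kron(np.linalg.inv(S), np.eye(n))
b = np.linalg.solve(X.T @ Oinv @ X, X.T @ Oinv @ y)
print(b)  # approx [2, 3, 1, -2]: [intercept1, slope1, intercept2, slope2]
```

When the regressor sets differ across equations, as here, the joint GLS step exploits the error correlation and yields tighter estimates than equation-by-equation OLS.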


Decision Sciences | 2003

Quantitative Comparison of Approximate Solution Sets for Bi-criteria Optimization Problems

W. Matthew Carlyle; John W. Fowler; Esma Senturk Gel; Bosun Kim

We present the Integrated Preference Functional (IPF) for comparing the quality of proposed sets of near-Pareto-optimal solutions to bi-criteria optimization problems. Evaluating the quality of such solution sets is one of the key issues in developing and comparing heuristics for multiple objective combinatorial optimization problems. The IPF is a set functional that, given a weight density function provided by a decision maker and a discrete set of solutions for a particular problem, assigns a numerical value to that solution set. This value can be used to compare the quality of different sets of solutions, and therefore provides a robust, quantitative approach for comparing different heuristic, a posteriori solution procedures for difficult multiple objective optimization problems. We provide specific examples of decision maker preference functions and illustrate the calculation of the resulting IPF for specific solution sets and a simple family of combined objectives.
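As I read the abstract, the functional integrates, over the decision maker's weight density, the best weighted objective value achievable within the solution set. A minimal numeric sketch with a uniform weight density and hypothetical solution sets (the paper computes the functional exactly rather than by quadrature):

```python
def ipf(solutions, steps=10_000):
    # Midpoint-rule integration over a uniform weight density on [0, 1]:
    # at each weight w, take the best (minimum) weighted objective in the set.
    total = 0.0
    for i in range(steps):
        w = (i + 0.5) / steps
        total += min(w * f1 + (1 - w) * f2 for f1, f2 in solutions)
    return total / steps

# Two hypothetical approximations of a Pareto front (minimize both criteria);
# set_a adds a compromise point that set_b lacks.
set_a = [(1.0, 9.0), (4.0, 4.0), (9.0, 1.0)]
set_b = [(1.0, 9.0), (9.0, 1.0)]

print(ipf(set_a), ipf(set_b))  # lower is better; set_a scores better here
```

The single number lets two heuristics' entire solution sets be ranked against a stated preference model, instead of comparing fronts point by point.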

Collaboration


Dive into W. Matthew Carlyle's collaborations.

Top Co-Authors

Gerald G. Brown
Naval Postgraduate School

John W. Fowler
Arizona State University

R. Kevin Wood
Naval Postgraduate School

Bosun Kim
Arizona State University

Nedialko B. Dimitrov
University of Texas at Austin