Mary E. Kurz
Clemson University
Publications
Featured research published by Mary E. Kurz.
European Journal of Operational Research | 2004
Mary E. Kurz; Ronald G. Askin
This paper examines scheduling in flexible flow lines with sequence-dependent setup times to minimize makespan. This type of manufacturing environment is found in industries such as printed circuit board and automobile manufacturing. An integer program that incorporates these aspects of the problem is formulated and discussed. Because of the difficulty of solving the IP directly, several heuristics are developed, based on greedy methods, flow line methods, the Insertion Heuristic for the Traveling Salesman Problem, and the Random Keys Genetic Algorithm. Problem data are generated to evaluate the heuristics, with characteristics chosen to reflect those used by previous researchers. A lower bound is created to evaluate the heuristics and is itself evaluated. An application of the Random Keys Genetic Algorithm is found to be very effective for the problems examined. Conclusions are drawn and areas for future research are identified.
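A minimal sketch of the random-keys decoding idea referenced above, under simplifying assumptions: a single machine stands in for the multi-stage flow line, and the processing and setup data are hypothetical. Bean-style random keys are sorted to obtain a job sequence, which is then evaluated for makespan with sequence-dependent setups.

```python
import random

def decode_random_keys(keys):
    """Sort jobs by their random keys (ascending) to obtain a permutation."""
    return sorted(range(len(keys)), key=lambda j: keys[j])

def makespan(sequence, proc, setup):
    """Completion time of the last job on a single machine with
    sequence-dependent setup times (illustrative simplification of a flow line)."""
    t, prev = 0.0, None
    for j in sequence:
        t += (setup[prev][j] if prev is not None else 0.0) + proc[j]
        prev = j
    return t

# Toy instance: 4 jobs with hypothetical processing and setup times.
proc = [3.0, 2.0, 4.0, 1.0]
setup = {i: {j: 0.5 for j in range(4)} for i in range(4)}

# Evaluate a small random-key population and report the best decoded sequence.
population = [[random.random() for _ in range(4)] for _ in range(20)]
best = min(population, key=lambda k: makespan(decode_random_keys(k), proc, setup))
print(decode_random_keys(best), makespan(decode_random_keys(best), proc, setup))
```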
International Journal of Production Economics | 2003
Mary E. Kurz; Ronald G. Askin
This paper explores scheduling flexible flow lines with sequence-dependent setup times. Three major types of heuristics are explored. Insertion heuristics (based on insertion heuristics for the traveling salesman problem) attempt to simultaneously equalize workload on all processors at a stage and minimize total or single-stage flowtimes. Johnson's algorithm for two-stage flow shops and its heuristic extensions to m-machine flow shops are modified for parallel processors and the flexible flow-line environment. A set of naive greedy heuristics is investigated for comparison purposes. The performance of the heuristics is compared on a set of test problems. Results indicate the range of conditions under which each method performs well.
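For reference, the classic two-machine Johnson's rule that the paper modifies for parallel processors; this sketch shows only the textbook rule with hypothetical processing times, not the flexible flow-line extension.

```python
def johnsons_rule(p1, p2):
    """Johnson's rule for the two-machine flow shop (minimize makespan):
    jobs faster on machine 1 go first in increasing p1; the rest go last
    in decreasing p2."""
    jobs = range(len(p1))
    front = sorted((j for j in jobs if p1[j] <= p2[j]), key=lambda j: p1[j])
    back = sorted((j for j in jobs if p1[j] > p2[j]), key=lambda j: p2[j], reverse=True)
    return front + back

# Toy instance with hypothetical processing times on machines 1 and 2.
p1 = [3, 5, 1, 6, 7]
p2 = [6, 2, 2, 6, 5]
print(johnsons_rule(p1, p2))  # [2, 0, 3, 4, 1]
```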
International Journal of Production Research | 2012
Eray Cakici; Scott J. Mason; Mary E. Kurz
We study the problem of minimising the total weighted tardiness and total distribution costs in an integrated production and distribution environment. Orders are received by a manufacturer, processed on a single production line, and delivered to customers by capacitated vehicles. Each order (job) is associated with a customer, weight (priority), processing time, due time, and size (volume or storage space required in the transportation unit). A mathematical model is presented in which a number of weighted linear combinations of the objectives are used to aggregate both objectives into a single objective. Because even the single objective problem is NP-hard, different heuristics based on a genetic algorithm (GA) are developed to further approximate a Pareto-optimal set of solutions for our multi-objective problem.
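A minimal sketch of aggregating the two objectives with a weighted linear combination, as described above; the completion times, due times, priorities, and delivery cost are hypothetical, and the weighting parameter `alpha` is an illustrative name rather than the paper's notation.

```python
def weighted_tardiness(completion, due, weight):
    """Total weighted tardiness for a set of jobs."""
    return sum(w * max(0.0, c - d) for c, d, w in zip(completion, due, weight))

def scalarized(completion, due, weight, dist_cost, alpha):
    """Weighted linear combination of the two objectives, alpha in [0, 1]."""
    return alpha * weighted_tardiness(completion, due, weight) + (1 - alpha) * dist_cost

# Hypothetical schedule: completion times, due times, priorities, and a delivery cost.
print(scalarized([5, 9, 14], [6, 8, 12], [2, 1, 3], dist_cost=40.0, alpha=0.7))
```

Sweeping `alpha` over several values yields the family of single-objective problems whose solutions approximate the Pareto set.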
International Journal of Production Research | 2008
Mary E. Kurz; Scott J. Mason
Semiconductor wafer fabrication facilities (“wafer fabs”) strive to maximize on-time delivery performance for customer orders. Effectively scheduling jobs on critical or bottleneck equipment in the wafer fab can promote on-time deliveries. One type of critical fab equipment is a diffusion oven, which processes multiple wafer lots (jobs) simultaneously in batches. We present a new polynomial-time Batch Improvement Algorithm for scheduling a batch-processing machine to maximize on-time delivery performance (minimize total weighted tardiness) when job arrivals are dynamic. The proposed algorithm's performance is compared to previous research efforts under varying problem conditions. Experimental studies demonstrate the effectiveness of the Batch Improvement Algorithm.
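The Batch Improvement Algorithm itself is specific to the paper, but the objective it improves can be sketched: total weighted tardiness of a fixed batch sequence on one batch machine with dynamic job arrivals. The constant batch processing time and the instance data below are simplifying assumptions.

```python
def batch_twt(batches, arrival, proc_time, due, weight):
    """Total weighted tardiness of a fixed batch sequence on one batch machine.
    A batch starts when the machine is free and all of its jobs have arrived;
    every job in a batch completes when the batch completes."""
    t = 0.0
    twt = 0.0
    for batch in batches:
        start = max(t, max(arrival[j] for j in batch))
        t = start + proc_time  # constant batch processing time (simplifying assumption)
        twt += sum(weight[j] * max(0.0, t - due[j]) for j in batch)
    return twt

# Hypothetical instance: two batches of jobs {0, 1} and {2}.
arrival = [0, 2, 5]
due = [7, 9, 12]
weight = [1, 3, 2]
print(batch_twt([[0, 1], [2]], arrival, proc_time=6.0, due=due, weight=weight))  # 5.0
```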
IIE Transactions on Healthcare Systems Engineering | 2011
Sunarin Chanta; Maria E. Mayorga; Mary E. Kurz; Laura A. McLay
Equity is an important consideration in public services such as Emergency Medical Service (EMS) systems. In such systems not only equitability but also performance depends on the spatial distribution of facilities and resources. This paper proposes the minimum p-envy facility location model, which aims to find optimal locations for facilities in order to balance customers’ perceptions of equity in receiving service. The model is developed and evaluated through the lens of EMS systems, where ambulances are located at facilities (stations) with the objective of minimizing the sum of “envy” among all demand zones (customer points) with respect to an ordered set of p operating stations, weighted by the proportion of demand in each zone. The problem is formulated as an integer program, with priority weights assigned according to the probability that an ambulance is available, which is estimated using the hypercube model. Because of the computational effort required to obtain solutions using commercially available software, a tabu search is developed to solve the problem. A case study using real-world data is presented. The performance of the proposed model is tested and compared to other location models such as the p-center and maximal covering location problem (MCLP).
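A generic tabu-search skeleton of the kind described, searching over swap moves that close one open station and open another; the demand-weighted envy objective is paper-specific, so a hypothetical placeholder objective is passed in as a stand-in.

```python
import itertools

def tabu_search(candidates, p, objective, iterations=200, tenure=7):
    """Generic tabu search over p-subsets of candidate stations using swap moves.
    `objective` stands in for the paper's demand-weighted envy function."""
    current = set(candidates[:p])
    best, best_val = set(current), objective(current)
    tabu = {}  # swap -> iteration until which it is forbidden
    for it in range(iterations):
        moves = []
        for out_f, in_f in itertools.product(current, set(candidates) - current):
            if tabu.get((out_f, in_f), -1) >= it:
                continue
            cand = (current - {out_f}) | {in_f}
            moves.append((objective(cand), out_f, in_f, cand))
        if not moves:
            break
        val, out_f, in_f, cand = min(moves, key=lambda m: m[0])
        current = cand
        tabu[(in_f, out_f)] = it + tenure  # forbid undoing the swap for `tenure` iterations
        if val < best_val:
            best, best_val = set(cand), val
    return best, best_val

# Toy usage with a hypothetical stand-in objective (demand-weighted distance).
stations = list(range(10))
obj = lambda chosen: sum(min(abs(zone - s) for s in chosen) for zone in range(30))
print(tabu_search(stations, p=3, objective=obj))
```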
Computers & Industrial Engineering | 2014
AliReza Madadi; Mary E. Kurz; Kevin Taaffe; Julia L. Sharp; Scott J. Mason
We focus on supply disruptions that result in the production of tainted materials. We design a supply network to prevent the risk of sending tainted material to customers. Statistical analysis is conducted to identify factors for predicting facility selection. We consider the new idea of facility inspection, a recent FDA requirement in some supply chains. The presented model enables practitioners to select the most qualified suppliers. In this paper, we investigate supply network design for a supply chain with unreliable supply, with application in the pharmaceutical industry. We consider two types of decision-making policies: (1) a risk-neutral decision-making policy that is based on a cost-minimization approach and (2) a risk-averse policy wherein, rather than selecting facilities and identifying the pertinent supplier-consumer assignments that minimize the expected cost, the decision-maker uses a Conditional Value-at-Risk (CVaR) approach to measure and quantify risk and to define what comprises a worst-case scenario. The CVaR methodology allows the decision-maker to specify to what extent worst-case scenarios should be avoided and the corresponding costs associated with such a policy. After introducing the underlying optimization models, we present computational and statistical analyses to compare the results of the risk-averse and risk-neutral policies. In addition, we provide several managerial insights.
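A minimal sketch of the Conditional Value-at-Risk measure used by the risk-averse policy, computed from discrete scenario costs and probabilities; the scenarios below are hypothetical, not taken from the paper's pharmaceutical case.

```python
def cvar(costs, probs, alpha=0.95):
    """Conditional Value-at-Risk: expected cost over the worst (1 - alpha)
    probability mass of the scenario cost distribution."""
    scenarios = sorted(zip(costs, probs), reverse=True)  # worst scenarios first
    tail = 1.0 - alpha
    total, remaining = 0.0, tail
    for c, p in scenarios:
        take = min(p, remaining)
        total += c * take
        remaining -= take
        if remaining <= 1e-12:
            break
    return total / tail

# Hypothetical scenario costs (network cost under different disruption outcomes).
costs = [100.0, 120.0, 150.0, 400.0]
probs = [0.40, 0.35, 0.20, 0.05]
print(cvar(costs, probs, alpha=0.90))  # 275.0: mean cost of the worst 10% of outcomes
```

The larger `alpha` is, the more weight the design places on avoiding the worst-case scenarios, at the price of a higher expected cost.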
Scandinavian Conference on Information Systems | 2007
Scott J. Mason; Mary E. Kurz; Michele E. Pfund; John W. Fowler; Letitia M. Pohl
We examine a complex, multi-objective semiconductor manufacturing scheduling problem involving two batch processing steps linked by a timer constraint. This constraint requires that any job completing the first processing step must be started on the succeeding second machine within some allowable time window; otherwise, the job must repeat its processing on the first step. We present a random-keys implementation of NSGA-II (the non-dominated sorting genetic algorithm) for our problem of interest and investigate the efficacy of different batching policies in terms of the number of approximate efficient solutions produced by NSGA-II over a wide range of experimental problem instances. Experimental results suggest a full batch policy can produce superior solutions compared to greedy batching policies under the experimental conditions examined.
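At the core of NSGA-II is the Pareto-dominance comparison used to extract approximate efficient solutions; a minimal sketch, with hypothetical bi-objective values standing in for the scheduling criteria.

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def nondominated(points):
    """Return the approximate efficient (nondominated) set of objective vectors."""
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical bi-objective values (e.g. two tardiness-related measures).
points = [(3, 9), (4, 4), (6, 2), (5, 5), (7, 1)]
print(nondominated(points))  # [(3, 9), (4, 4), (6, 2), (7, 1)]
```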
Computers & Operations Research | 2010
Mark H. McElreath; Maria E. Mayorga; Mary E. Kurz
The assortment planning problem involves choosing an optimal product line, as defined by a set of products with specific attributes, to offer consumers. Under a locational choice model in which products are differentiated both horizontally (by variety attributes) and vertically (by quality attributes), an optimal assortment, whose attributes have only been partially characterized, may consist of multiple quality levels. Using previous analytical results, we approximate the optimal assortment for make-to-order and static substitution environments. We test the appropriateness and compare the performance of three metaheuristic methods. These metaheuristics can easily be modified to accommodate different consumer preference distribution assumptions.
International Journal of Production Research | 2012
Funda Samanlioglu; William G. Ferrell; Mary E. Kurz
In this paper, a preference-based, interactive memetic random-key genetic algorithm (PIMRKGA) is developed and used to find (weakly) Pareto optimal solutions to manufacturing and production problems that can be modelled as a symmetric multi-objective travelling salesman problem. Since there are a large number of solutions to these kinds of problems, to reduce the computational effort and to provide more desirable and meaningful solutions to the decision maker, this research focuses on using interactive input from the user to explore the most desirable parts of the efficient frontier instead of trying to reproduce the entire frontier. Here, users define their preferences by selecting among five classes of objective functions and by specifying weighting coefficients, bounds, and optional upper bounds on indifference tradeoffs. This structure is married with the memetic algorithm – a random-key genetic algorithm hybridised by local search. The resulting methodology is an iterative process that continues until the decision maker is satisfied with the solution. The paper concludes with case studies utilising different scenarios to illustrate possible manufacturing and production related implementations of the methodology.
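A minimal sketch of the local-search component that "memetic" denotes: a first-improvement 2-opt move applied to a decoded tour under a single scalarised objective. The distance matrix is hypothetical, and the preference-based multi-objective machinery of the PIMRKGA is not shown.

```python
def tour_length(tour, dist):
    """Length of a closed tour under a symmetric distance matrix."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def two_opt(tour, dist):
    """First-improvement 2-opt local search, the kind of move a memetic
    (hybridised) genetic algorithm might apply to each decoded offspring."""
    improved = True
    while improved:
        improved = False
        for i in range(1, len(tour) - 1):
            for k in range(i + 1, len(tour)):
                cand = tour[:i] + tour[i:k + 1][::-1] + tour[k + 1:]
                if tour_length(cand, dist) < tour_length(tour, dist):
                    tour, improved = cand, True
    return tour

# Toy symmetric instance (hypothetical distances). In the memetic GA the tour
# would come from decoding a random-key chromosome, and the objective from a
# user-weighted scalarisation of the multiple criteria.
dist = [[0, 2, 9, 10],
        [2, 0, 6, 4],
        [9, 6, 0, 3],
        [10, 4, 3, 0]]
print(two_opt([0, 2, 1, 3], dist))  # [0, 1, 3, 2]
```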
ASME 2015 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference, IDETC/CIE 2015 | 2015
Keith Phelan; Crystal Wilson; Joshua D. Summers; Mary E. Kurz
The purpose of this research is to conduct a user study to determine the effect of numerous data-representation variables on the ability to answer questions about the system being represented. This research will be used in the development of a computer-based visualization tool to support configuration change management. The researchers hypothesized that the graph geometry and the order of the questions being asked would not affect the results, while the color of the graph and the information being represented would affect the number of correct responses. The results showed an increase in response accuracy for the answerable questions when the amount of information displayed in the data representation was minimized. None of the other factors was shown to have a significant effect on the accuracy of the responses. The most significant limitation of this study was the possibility of different users putting different levels of effort into answering the questions.