Kenneth R. Chelst
Wayne State University
Publications
Featured research published by Kenneth R. Chelst.
Operations Research | 1998
Jonathan P. Caulkins; Edward H. Kaplan; Peter Lurie; Thomas O'Connor; Sung-Ho Ahn; Kenneth R. Chelst; Sampath Rajagopalan
Businesses frequently have to decide which of their existing equipment to replace, taking into account future changes in capacity requirements. The significance of this decision becomes clear when one notes that expenditure on new plant and equipment is a significant proportion of the GDP in the United States. The equipment replacement literature has focused on the replacement issue, usually ignoring aspects such as future demand changes and economies of scale. On the other hand, the capacity expansion literature has focused on the expansion of equipment capacity to meet demand growth, considering economies of scale but ignoring the replacement aspect. This paper attempts to unify the two streams of research by developing a general model that considers replacement of capacity as well as expansion and disposal, together with scale economy effects. Even special cases of the problems discussed here, such as the parallel machine replacement problem, have been considered difficult so far. However, we show that the problem can be solved efficiently by formulating it in a novel, disaggregate manner and using a dual-based solution procedure that exploits the structure of the problem. We also provide computational results to affirm that optimal or near-optimal solutions to large, realistic problems can be determined efficiently. We demonstrate the robustness of this approach by showing how other realistic features such as quantity discounts in purchases, alternative technology types or suppliers, and multiple equipment types can be incorporated.
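As a concrete illustration of the replacement side only, the sketch below solves a toy single-machine replacement problem by dynamic programming. All costs and the horizon are hypothetical, and it omits demand growth, economies of scale, disposal, and multiple equipment types, which the paper's disaggregate, dual-based formulation is designed to handle.

def replacement_plan(horizon, purchase, operating, salvage):
    """Minimum cost of keeping one machine in service for `horizon` periods.

    purchase        -- cost of buying a new machine
    operating[age]  -- one-period operating cost of a machine of that age
    salvage[age]    -- resale value of a machine of that age
    """
    from functools import lru_cache

    @lru_cache(maxsize=None)
    def best(t, age):
        if t == horizon:                      # end of horizon: sell the machine
            return -salvage[age]
        keep = operating[age] + best(t + 1, age + 1)
        swap = -salvage[age] + purchase + operating[0] + best(t + 1, 1)
        return min(keep, swap)

    # Buy a new machine at time 0, then decide keep/replace each period.
    return purchase + operating[0] + best(1, 1)

# Illustrative data: operating cost rises and salvage value falls with age.
operating = [2, 3, 5, 8, 12, 17, 25]
salvage = [8, 6, 4, 3, 2, 1, 0]
print(replacement_plan(horizon=6, purchase=10, operating=operating, salvage=salvage))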
IEEE Transactions on Engineering Management | 1999
Ali A. Yassine; Kenneth R. Chelst; Donald R. Falkenburg
This paper quantifies key issues with regard to concurrent engineering through the use of risk and decision analysis techniques that enable us to better understand, structure, and manage the design process. In concurrent engineering, the information structure of a design process does not usually imply the execution patterns of the corresponding design tasks. On the contrary, this gap between the information structure and execution patterns is the essence of concurrent engineering and its basic advantage over traditional sequential design. In this paper, we relate the structure of information flow in a design process to three different execution strategies: sequential, partial overlapping, and concurrent. The risks of excessive task iterations or redesigns associated with each execution pattern are probabilistically modeled. Risk and decision analysis methodology is used to determine the best execution strategy and the optimal overlapping policy for a set of activities given their information structure. Applying this theoretical framework to a real-world design application of an automotive cylinder block suggested a potential 18% reduction in development cycle time.
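A toy expected-value comparison in the spirit of this trade-off is sketched below: overlapping two coupled tasks shortens the nominal schedule but exposes the downstream task to rework if the upstream result changes. All durations, probabilities, and rework fractions are hypothetical, and the calculation is far simpler than the paper's risk and decision analysis framework.

def sequential_duration(upstream, downstream):
    # Downstream waits for frozen upstream output: no rework risk.
    return upstream + downstream

def overlapped_duration(upstream, downstream, overlap, p_change, rework_frac):
    # Downstream starts `overlap` time units before upstream finishes.
    # With probability p_change the upstream result changes and a fraction
    # rework_frac of the overlapped downstream work must be redone.
    return (upstream + downstream - overlap) + p_change * rework_frac * overlap

upstream, downstream = 10.0, 8.0
print("sequential:", sequential_duration(upstream, downstream))
for overlap in (2, 4, 6, 8):
    t = overlapped_duration(upstream, downstream, overlap, p_change=0.4, rework_frac=0.75)
    print(f"overlap={overlap}: expected duration {t:.1f}")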
Operations Research | 1979
Kenneth R. Chelst; James P. Jarvis
We describe an extension of Larson's hypercube queuing model that enables it to calculate the probability distributions of travel times. In addition, we discuss the general need for this capability in emergency service models as a prerequisite for developing second-generation models that may relate direct outcome measures, such as damage, loss of life, and arrests, to managerial decisions.
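A minimal sketch of the underlying idea: once a queuing model supplies the probability that unit i answers a call at demand node j, the full travel-time distribution, not just its mean, follows directly from the travel-time matrix. The dispatch probabilities and travel times below are made up, and the hypercube computation itself is not reproduced.

from collections import defaultdict

def travel_time_distribution(dispatch_prob, travel_time):
    """dispatch_prob[i][j]: probability unit i is sent to node j (sums to 1 overall);
    travel_time[i][j]: travel time from unit i's location to node j."""
    dist = defaultdict(float)
    for i, row in enumerate(dispatch_prob):
        for j, p in enumerate(row):
            dist[travel_time[i][j]] += p
    return dict(sorted(dist.items()))

# Two units, three demand nodes; probabilities and times are illustrative.
dispatch_prob = [[0.20, 0.15, 0.10],
                 [0.05, 0.25, 0.25]]
travel_time = [[2, 5, 9],
               [7, 3, 4]]                      # minutes
dist = travel_time_distribution(dispatch_prob, travel_time)
print(dist)
print("P(travel time <= 5 minutes):", round(sum(p for t, p in dist.items() if t <= 5), 2))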
Interfaces | 2001
Kenneth R. Chelst; John Sidelko; Alex Przebienda; Jeffrey Lockledge; Dimitrios Mihailidis
The prototype vehicles that Ford Motor Company uses to verify new designs are a major annual investment. A team of engineering managers studying for master's degrees in a Wayne State University program taught at Ford adapted a classroom set-covering example to begin development of the prototype optimization model (POM). Ford uses the POM and its related expert systems to budget, plan, and manage prototype test fleets and to maintain testing integrity, reducing annual prototype costs by more than $250 million. POM's first use, on the European Transit vehicle, reduced costs by an estimated $12 million. The model dramatically shortened the planning process, established global procedures, and created a common structure for dialogue between budgeting and engineering.
Interfaces | 1998
Jonathan P. Caulkins; Edward H. Kaplan; Peter Lurie; Thomas O'Connor; Sung-Ho Ahn; Kenneth R. Chelst
Journal of the Operational Research Society | 2002
J Lockledge; Dimitrios Mihailidis; John Sidelko; Kenneth R. Chelst
Product development in the automotive industry is a complex process that involves extensive testing of components, subsystems, systems, and full vehicles. A fleet of unique, individually manufactured vehicles must be built and scheduled among different major system activities to be used in comprehensive testing programs. In this paper we present a multi-stage mathematical programming model, set covering plus scheduling, that has been used to restructure the development of the prototype fleet and the assignment of tests to specific vehicles. A basic version of the model implemented on a complex vehicle program produced a 25% reduction in fleet size compared to the forecast originally made by the company. In addition, the model was the driver for the restructuring of the prototype planning process. In presenting this model, we describe: (a) the model development process, including structuring of the input and output to meet customer needs, (b) the model structure, (c) keys to implementation success, and (d) the system's overall impact on the prototype planning process.
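A greedy sketch of the set-covering stage alone is given below: choose a small set of prototype configurations so that every required test can be hosted by at least one chosen vehicle. The configurations and tests are hypothetical, and the actual model is a multi-stage mathematical program (set covering plus scheduling) solved with an optimizer rather than a greedy heuristic.

def greedy_fleet(tests, can_run):
    """can_run[config]: set of tests that prototype configuration can host."""
    uncovered = set(tests)
    fleet = []
    while uncovered:
        # Choose the configuration covering the most still-uncovered tests.
        best = max(can_run, key=lambda c: len(can_run[c] & uncovered))
        if not can_run[best] & uncovered:
            raise ValueError(f"no configuration can host: {sorted(uncovered)}")
        fleet.append(best)
        uncovered -= can_run[best]
    return fleet

tests = ["crash", "emissions", "durability", "NVH", "brakes", "climate"]
can_run = {
    "base_4cyl":    {"emissions", "NVH", "climate"},
    "loaded_v6":    {"crash", "durability", "brakes"},
    "towing_pkg":   {"durability", "brakes", "climate"},
    "cold_weather": {"climate", "NVH"},
}
print(greedy_fleet(tests, can_run))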
Queueing Systems | 1990
Kenneth R. Chelst
European Journal of Operational Research | 2009
Bimal Nepal; Gregg Lassan; Baba Drow; Kenneth R. Chelst
The sourcing decisions for microcontrollers in the automotive industry are complex to manage, largely due to the increasing complexity of product requirements, multiple suppliers, and the nature of microcontroller pricing structures. This paper presents a set-covering model that allows the user to select the most economical set of microcontrollers that meets all the critical product requirements while minimizing the total cost. The optimization process is carried out in two phases. The first phase constructs a buildable combination matrix by mapping the critical product requirements against the microcontroller specifications. In the second phase, the model makes an optimal assignment of microcontrollers to each feasible or buildable product by exploiting the economies of scale offered by large microcontroller volumes. Lot size constraints are used to handle the step function in the microcontrollers' pricing structure. A case study from Visteon Corporation is used to demonstrate the application of the model. Pilot implementation of the model shows a potential saving of nearly two million dollars over a four-year planning horizon.
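The two-phase idea can be illustrated with a brute-force toy version: phase one builds the buildable matrix (which microcontrollers satisfy which product's requirements), and phase two searches over assignments, pooling volumes so that step-function tier prices reward consolidation. Parts, features, volumes, and prices below are hypothetical, and the paper's model uses lot-size constraints and scales well beyond brute force.

from itertools import product as cartesian

def unit_price(volume, tiers):
    """tiers: [(min_volume, unit_price), ...] sorted by min_volume ascending."""
    price = tiers[0][1]
    for min_vol, p in tiers:
        if volume >= min_vol:
            price = p
    return price

def optimal_sourcing(products, parts):
    """products: {name: (required_features, annual_volume)}
    parts: {name: {"features": set_of_features, "tiers": tier_schedule}}"""
    # Phase 1: buildable combination matrix -- which parts can build which product.
    buildable = {prod: [p for p in parts if needs <= parts[p]["features"]]
                 for prod, (needs, _) in products.items()}
    if any(not options for options in buildable.values()):
        raise ValueError("some product has no feasible microcontroller")

    # Phase 2: try every assignment, pooling volumes to reach cheaper price tiers.
    prods = list(products)
    best_cost, best_assign = float("inf"), None
    for choice in cartesian(*(buildable[prod] for prod in prods)):
        volume = {}
        for prod, part in zip(prods, choice):
            volume[part] = volume.get(part, 0) + products[prod][1]
        cost = sum(v * unit_price(v, parts[p]["tiers"]) for p, v in volume.items())
        if cost < best_cost:
            best_cost, best_assign = cost, dict(zip(prods, choice))
    return best_cost, best_assign

parts = {
    "MCU_A": {"features": {"CAN", "LIN", "ADC"},        "tiers": [(0, 4.00), (50_000, 3.40)]},
    "MCU_B": {"features": {"CAN", "LIN", "ADC", "PWM"}, "tiers": [(0, 5.20), (80_000, 3.10)]},
}
products = {
    "door_module":     ({"LIN", "ADC"},        30_000),
    "body_controller": ({"CAN", "PWM", "ADC"}, 60_000),
}
print(optimal_sourcing(products, parts))
# Pooling both products on MCU_B reaches its 80,000-unit tier and beats splitting.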
Journal of Criminal Justice | 1984
Kenneth Weiner; Kenneth R. Chelst; William Hart
Introductory survey texts in operations research and management science have one or two chapters on decision analysis. They vary little from text to text in the topics covered, focusing largely on the analytic aspects of decision analysis. Their authors ignore key elements of decision analysis and the barriers to its use. As a result, they provide little insight into how decision analysis is used in the real world as part of an intuition-building process and as a communication tool. In a critical review of their presentations, I identify crucial topics that are missing or reviewed only cursorily and emphasize the need for a structured approach that incorporates probabilistic concepts and trade-offs among objectives. I present an annotated list of references and a new outline for a six-hour overview of decision analysis (not including multiple objectives) as part of a semester-long OR/MS survey course.
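For readers unfamiliar with the analytic core being critiqued, the following is a minimal expected-monetary-value rollback of a two-option decision tree, roughly the level at which the survey chapters stop. The payoffs and probabilities are invented, and the point of the critique is that practical decision analysis involves much more than this calculation.

def rollback(node):
    """node: ('decision', {label: child}), ('chance', [(prob, child), ...]), or a payoff."""
    if isinstance(node, tuple) and node[0] == "decision":
        return max(rollback(child) for child in node[1].values())
    if isinstance(node, tuple) and node[0] == "chance":
        return sum(p * rollback(child) for p, child in node[1])
    return node                                # leaf payoff

tree = ("decision", {
    "launch new product": ("chance", [(0.3, 900), (0.7, -200)]),   # risky payoff
    "license the design": 150,                                     # certain payoff
})
print("best expected monetary value:", rollback(tree))   # 150: licensing wins here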
Computers & Operations Research | 1977
Kenneth R. Chelst
Cities with under 100,000 in population expend a significant portion of their budgets on emergency services. One option that a number of these cities have considered for improving service and cutting costs is training personnel to handle both police and fire roles. In this paper we describe a hierarchy of models that we have used to assess the performance viability of a merger as well as to design specific deployment plans. The modeling environment is more complex than a traditional police or fire system. We need to model the response pattern of four or more patrol units along with the simultaneous dispatch of fire equipment from one or more fire stations. The major contribution of the paper is the manner in which a series of models is linked together to forecast a wide range of performance measures under differing dispatch assumptions. We use a queueing model of police patrol to calculate steady state probabilities and expected delays without preemption. We then model two types of preemptive dispatch strategies utilized in responding initially to a major fire by superimposing a binomial distribution on the basic queueing model. There is also a travel time simulation model to calculate conditional expected response time statistics. The queueing models and the travel time simulation are then combined to estimate unconditional expected values. Lastly, we describe a simulation model used to address transient performance issues that are of concern during a major fire.
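As an illustration of the patrol-queueing building block, the sketch below computes steady-state congestion measures for c patrol units using a standard M/M/c (Erlang C) calculation. The arrival and service rates are hypothetical, and the preemptive fire-dispatch strategies and travel-time simulation described above are layered on top of this kind of model rather than reproduced here.

from math import factorial

def mmc_delay(arrival_rate, service_rate, servers):
    """Return (P(wait > 0), expected wait) for an M/M/c queue (Erlang C)."""
    a = arrival_rate / service_rate                 # offered load
    rho = a / servers                               # utilization
    if rho >= 1:
        raise ValueError("system is unstable: utilization >= 1")
    p0 = 1.0 / (sum(a**k / factorial(k) for k in range(servers))
                + a**servers / (factorial(servers) * (1 - rho)))
    p_wait = a**servers / (factorial(servers) * (1 - rho)) * p0
    expected_wait = p_wait / (servers * service_rate - arrival_rate)
    return p_wait, expected_wait

# 4 patrol units, 3.0 calls per hour, 45-minute average service time (illustrative).
p_wait, w = mmc_delay(arrival_rate=3.0, service_rate=60 / 45, servers=4)
print(f"P(all patrol units busy) = {p_wait:.3f}, expected dispatch delay = {60 * w:.1f} minutes")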