Publication


Featured research published by Wade D. Cook.


European Journal of Operational Research | 2009

Data envelopment analysis (DEA) – Thirty years on

Wade D. Cook; Lawrence M. Seiford

This paper provides a sketch of some of the major research thrusts in data envelopment analysis (DEA) over the three decades since the appearance of the seminal work of Charnes et al. (1978) [Charnes, A., Cooper, W.W., Rhodes, E.L., 1978. Measuring the efficiency of decision making units. European Journal of Operational Research 2, 429-444]. The focus herein is primarily on methodological developments, and in no manner does the paper address the many excellent applications that have appeared during that period. Specifically, attention is primarily paid to (1) the various models for measuring efficiency, (2) approaches to incorporating restrictions on multipliers, (3) considerations regarding the status of variables, and (4) modeling of data variation.
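The original Charnes-Cooper-Rhodes (CCR) ratio model cited above can be illustrated as a small linear program in its multiplier form. The sketch below is a hypothetical helper (the function name, the use of NumPy/SciPy, and the toy data are assumptions, not material from the paper):

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR (CRS) multiplier model for DMU o.

    X is an (n, m) array of inputs and Y an (n, s) array of outputs for
    n decision making units. Maximizes u.y_o subject to v.x_o = 1 and
    u.y_j - v.x_j <= 0 for every DMU j, with u, v >= 0.
    """
    n, m = X.shape
    s = Y.shape[1]
    # Decision variables are [u (s output weights), v (m input weights)].
    c = np.concatenate([-Y[o], np.zeros(m)])   # linprog minimizes, so negate u.y_o
    A_ub = np.hstack([Y, -X])                  # u.y_j - v.x_j <= 0 for all j
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(s), X[o]]).reshape(1, -1)  # v.x_o = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (s + m))
    return -res.fun
```

With one input and one output, the DMU with the best output-to-input ratio defines the frontier and scores 1, while every other DMU scores strictly below 1.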


European Journal of Operational Research | 2009

Additive efficiency decomposition in two-stage DEA

Yao Chen; Wade D. Cook; Ning Li; Joe Zhu

Kao and Hwang (2008) [Kao, C., Hwang, S.-N., 2008. Efficiency decomposition in two-stage data envelopment analysis: An application to non-life insurance companies in Taiwan. European Journal of Operational Research 185 (1), 418-429] develop a data envelopment analysis (DEA) approach for measuring efficiency of decision processes which can be divided into two stages. The first stage uses inputs to generate outputs which become the inputs to the second stage. The first stage outputs are referred to as intermediate measures. The second stage then uses these intermediate measures to produce outputs. Kao and Hwang represent the efficiency of the overall process as the product of the efficiencies of the two stages. A major limitation of this model is its applicability to only constant returns to scale (CRS) situations. The current paper develops an additive efficiency decomposition approach wherein the overall efficiency is expressed as a (weighted) sum of the efficiencies of the individual stages. This approach can be applied under both CRS and variable returns to scale (VRS) assumptions. The case of Taiwanese non-life insurance companies is revisited using this newly developed approach.
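Schematically, the additive decomposition described in this abstract takes the form (the notation here is a sketch, not necessarily the paper's own):

```latex
\theta = w_1\,\theta_1 + w_2\,\theta_2, \qquad w_1 + w_2 = 1, \quad w_1, w_2 \ge 0,
```

where \(\theta_1\) and \(\theta_2\) are the DEA ratio efficiencies of the first and second stages, and the weights \(w_1, w_2\) reflect the relative importance of each stage (for instance, its share of the total resources consumed by the process). Because the objective is a weighted sum rather than a product, the model can be applied under both CRS and VRS assumptions, as the abstract notes.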


IIE Transactions | 1991

Controlling Factor Weights in Data Envelopment Analysis

Yaakov Roll; Wade D. Cook; Boaz Golany

Data Envelopment Analysis (DEA) is a mathematical programming approach to assessing relative efficiencies within a group of Decision Making Units (DMUs). An important outcome of such an analysis is a set of virtual multipliers or weights accorded to each (input or output) factor taken into account. These sets of weights are, typically, different for each of the participating DMUs. A version of the DEA model is offered where bounds are imposed on weights, thus reducing the variation in the importance accorded to the same factor by the various DMUs. Techniques for locating appropriate bounds are suggested and the notion of a common set of weights is examined. Possible interpretations to differences in efficiency ratings obtained with the various models developed are discussed.


Annals of Operations Research | 2006

DEA models for supply chain efficiency evaluation

Liang Liang; Feng Yang; Wade D. Cook; Joe Zhu

An appropriate performance measurement system is an important requirement for the effective management of a supply chain. Two hurdles are present in measuring the performance of a supply chain and its members. One is the existence of multiple measures that characterize the performance of chain members, and for which data must be acquired; the other is the existence of conflicts between the members of the chain with respect to specific measures. Conventional data envelopment analysis (DEA) cannot be employed directly to measure the performance of a supply chain and its members, because of the existence of the intermediate measures connecting the supply chain members. In this paper it is shown that a supply chain can be deemed efficient while its members may be inefficient in DEA terms. The current study develops several DEA-based approaches for characterizing and measuring supply chain efficiency when intermediate measures are incorporated into the performance evaluation. The models are illustrated in a seller-buyer supply chain context, when the relationship between the seller and buyer is treated first as one of leader-follower, and second as one that is cooperative. In the leader-follower structure, the leader is first evaluated, and then the follower is evaluated using information related to the leader's efficiency. In the cooperative structure, the joint efficiency, which is modelled as the average of the seller's and buyer's efficiency scores, is maximized, and both supply chain members are evaluated simultaneously. Non-linear programming problems are developed to solve these new supply chain efficiency models. It is shown that these DEA-based non-linear programs can be treated as parametric linear programming problems, and best solutions can be obtained via a heuristic technique. The approaches are demonstrated with a numerical example.


Journal of Productivity Analysis | 2000

Multicomponent Efficiency Measurement and Shared Inputs in Data Envelopment Analysis: An Application to Sales and Service Performance in Bank Branches

Wade D. Cook; Moez Hababou; Hans J. H. Tuenter

In most applications of DEA presented in the literature, the models presented are designed to obtain a single measure of efficiency. In many instances, however, the decision making units involved may perform several different and clearly identifiable functions, or can be separated into different components. In such situations, inputs, in particular resources, are often shared among those functions. This sharing phenomenon will commonly present the technical difficulty of how to disaggregate an overall measure into component parts. In the present paper, we extend the usual DEA structure to one that determines a best resource split to optimize the aggregate efficiency score. The particular application area investigated is that involving the sales and service functions within the branches of a bank. An illustrative application of the methodology to a sample of branches from a major Canadian bank is given.


European Journal of Operational Research | 2010

Network DEA: Additive efficiency decomposition

Wade D. Cook; Joe Zhu; Gongbing Bi; Feng Yang

In conventional DEA analysis, DMUs are generally treated as a black-box in the sense that internal structures are ignored, and the performance of a DMU is assumed to be a function of a set of chosen inputs and outputs. A significant body of work has been directed at problem settings where the DMU is characterized by a multistage process; supply chains and many manufacturing processes take this form. Recent DEA literature on serial processes has tended to concentrate on closed systems, that is, where the outputs from one stage become the inputs to the next stage, and where no other inputs enter the process at any intermediate stage. The current paper examines the more general problem of an open multistage process. Here, some outputs from a given stage may leave the system while others become inputs to the next stage. As well, new inputs can enter at any stage. We then extend the methodology to examine general network structures. We represent the overall efficiency of such a structure as an additive weighted average of the efficiencies of the individual components or stages that make up that structure. The model therefore allows one to evaluate not only the overall performance of the network, but also to see how that performance decomposes into measures for the individual components of the network. We illustrate the model using two data sets.


Operations Research Letters | 1987

Consistent weights for judgements matrices of the relative importance of alternatives

Jonathan Barzilai; Wade D. Cook; Boaz Golany

We prove that the only solution satisfying consistency axioms for the problem of retrieving weights from inconsistent judgements matrices whose entries are the relative importance ratios of alternatives is the geometric mean.
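The geometric-mean solution can be stated in a few lines of code. The helper below is an illustrative sketch (the function name and the normalisation step are assumptions), not code from the paper:

```python
import numpy as np

def geometric_mean_weights(A):
    """Retrieve weights from a pairwise judgement matrix A, where A[i, j]
    estimates the importance ratio w_i / w_j. Each weight is the geometric
    mean of its row, normalised here to sum to one."""
    n = A.shape[0]
    w = np.prod(A, axis=1) ** (1.0 / n)
    return w / w.sum()
```

For a perfectly consistent matrix built from a known weight vector, the procedure recovers that vector exactly; for an inconsistent matrix it yields the compromise the axioms single out.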


Operations Research | 2008

The DEA Game Cross-Efficiency Model and Its Nash Equilibrium

Liang Liang; Jie Wu; Wade D. Cook; Joe Zhu

In this paper, we examine the cross-efficiency concept in data envelopment analysis (DEA). Cross efficiency links one decision-making unit's (DMU's) performance with others and has the appeal that scores arise from peer evaluation. However, a number of the current cross-efficiency approaches are flawed because they use scores that are arbitrary in that they depend on a particular set of optimal DEA weights generated by the computer code in use at the time. One set of optimal DEA weights (possibly out of many alternate optima) may improve the cross efficiency of some DMUs, but at the expense of others. While models have been developed that incorporate secondary goals aimed at being more selective in the choice of optimal multipliers, the alternate optima issue remains. In cases where there is competition among DMUs, this situation may be seen as undesirable and unfair. To address this issue, this paper generalizes the original DEA cross-efficiency concept to game cross efficiency. Specifically, each DMU is viewed as a player that seeks to maximize its own efficiency, under the condition that the cross efficiency of each of the other DMUs does not deteriorate. The average game cross-efficiency score is obtained when the DMUs' own maximized efficiency scores are averaged. To implement the DEA game cross-efficiency model, an algorithm for deriving the best (game cross-efficiency) scores is presented. We show that the optimal game cross-efficiency scores constitute a Nash equilibrium point.
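The conventional (non-game) cross-efficiency computation that this paper generalises can be sketched as follows. This is a hypothetical illustration using SciPy's LP solver; note that the weight set the solver returns for each DMU is precisely one choice among the possible alternate optima the abstract warns about:

```python
import numpy as np
from scipy.optimize import linprog

def cross_efficiency(X, Y):
    """Conventional DEA cross-efficiency for n DMUs with inputs X (n, m)
    and outputs Y (n, s): solve the CRS multiplier model for each DMU d,
    score every DMU with DMU d's optimal weights, then average."""
    n, m = X.shape
    s = Y.shape[1]
    E = np.zeros((n, n))
    for d in range(n):
        res = linprog(np.concatenate([-Y[d], np.zeros(m)]),   # maximize u.y_d
                      A_ub=np.hstack([Y, -X]), b_ub=np.zeros(n),
                      A_eq=np.concatenate([np.zeros(s), X[d]]).reshape(1, -1),
                      b_eq=[1.0], bounds=[(0, None)] * (s + m))
        u, v = res.x[:s], res.x[s:]
        E[d] = (Y @ u) / (X @ v)     # row d: every DMU under DMU d's weights
    return E.mean(axis=0)            # peer-evaluated average per DMU
```

Averaging down each column gives every DMU a score based on all peers' weight choices rather than only its own most favourable weights.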


European Journal of Operational Research | 2006

Distance-based and ad hoc consensus models in ordinal preference ranking

Wade D. Cook

This paper examines the problem of aggregating ordinal preferences on a set of alternatives into a consensus. This problem has been the subject of study for more than two centuries and many procedures have been developed to create a compromise or consensus. We examine a variety of structures for preference specification, and in each case review the related models for deriving a consensus. Two classes of consensus models are discussed, namely ad hoc methods, evolving primarily from parliamentary settings over the past 200 years, and distance or axiomatic-based methods. We demonstrate the levels of complexity of the various distance-based models by presenting the related mathematical programming formulations for them. We also present conditions for equivalence, that is, for yielding the same consensus ranking for some of the methods. Finally, we discuss various extensions of the basic ordinal ranking structures, paying specific attention to partial ranking, voting member weighted consensus, ranking with intensity of preference, and rank correlation methods, as alternative approaches to deriving a consensus. Suggestions for future research directions are given.
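One of the distance-based models surveyed here, minimisation of total pairwise disagreement (a Kemeny-type distance), can be sketched with a brute-force search. The function below is an illustrative toy, not a formulation from the paper, and exhaustive search is feasible only for a handful of alternatives:

```python
from itertools import permutations

def consensus_ranking(rankings):
    """Return the ordering of alternatives minimising the total number of
    pairwise disagreements (Kendall tau distance) with the voters'
    rankings, each given as a list from most to least preferred."""
    items = sorted(rankings[0])

    def disagreements(order, ranking):
        pos = {x: i for i, x in enumerate(ranking)}
        # Count pairs that the candidate order and this voter rank oppositely.
        return sum(1
                   for i in range(len(order))
                   for j in range(i + 1, len(order))
                   if pos[order[i]] > pos[order[j]])

    return min(permutations(items),
               key=lambda order: sum(disagreements(order, r) for r in rankings))
```

When a ranking agreeing with every pairwise majority exists, this criterion selects it; when the pairwise majorities cycle, the distance criterion still picks a well-defined minimiser, which is what makes the axiomatic approach attractive.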


Omega: The International Journal of Management Science | 2001

Sales performance measurement in bank branches

Wade D. Cook; Moez Hababou

Studies of bank branch performance have, to date, concentrated on obtaining a single perspective of efficiency. As competition in the financial services industry has intensified, banks have increasingly engaged in a proactive, differentiated and customer-based strategy in retail banking in which the sales component of the bank branch activity is emphasized. With the emerging sales culture within banks, there is a need to evaluate both sales and service performance. Cook et al. [12] have proposed a model to evaluate simultaneously the sales, service, and aggregate efficiencies of a bank branch. This model accounted for the fact that inputs, in particular resources, are often shared among these functions. In this paper, we extend the data envelopment analysis additive model using goal programming concepts. We thereby derive optimal efficiency scores while taking into account non-volume related activities, that is, those involving resources that cannot be assigned to a specific input or output. Again, the proposed model derives an optimal split of the shared resources that maximizes the aggregate efficiency.

Collaboration


Top Co-Authors

Joe Zhu | Worcester Polytechnic Institute

Moshe Kress | Naval Postgraduate School

Liang Liang | University of Science and Technology of China

Yao Chen | University of Massachusetts Amherst

Yaakov Roll | Technion – Israel Institute of Technology