Network


Latest external collaborations at the country level.

Hotspot


Research topics in which Joe Zhu is active.

Publication


Featured research published by Joe Zhu.


Archive | 2011

Handbook on data envelopment analysis

William W. Cooper; Lawrence M. Seiford; Joe Zhu

Preface (W.W. Cooper, L.M. Seiford, J. Zhu)
1. Data Envelopment Analysis: History, Models and Interpretations (W.W. Cooper, L.M. Seiford, J. Zhu)
2. Returns to Scale in DEA (R.D. Banker, W.W. Cooper, L.M. Seiford, J. Zhu)
3. Sensitivity Analysis in DEA (W.W. Cooper, Shanling Li, L.M. Seiford, J. Zhu)
4. Incorporating Value Judgments in DEA (E. Thanassoulis, M.C. Portela, R. Allen)
5. Distance Functions with Applications to DEA (R. Fare, S. Grosskopf, G. Whittaker)
6. Qualitative Data in DEA (W.D. Cook)
7. Congestion: Its Identification and Management with DEA (W.W. Cooper, Honghui Deng, L.M. Seiford, J. Zhu)
8. Malmquist Productivity Index: Efficiency Change Over Time (K. Tone)
9. Chance Constrained DEA (W.W. Cooper, Zhimin Huang, S.X. Li)
10. Performance of the Bootstrap for DEA Estimators and Iterating the Principle (L. Simar, P.W. Wilson)
11. Statistical Tests Based on DEA Efficiency Scores (R.D. Banker, R. Natarajan)
12. Performance Evaluation in Education: Modeling Educational Production (J. Ruggiero)
13. Assessing Bank and Bank Branch Performance: Modeling Considerations and Approaches (J.C. Paradi, S. Vela, Zijiang Yang)
14. Engineering Applications of Data Envelopment Analysis: Issues and Opportunities (K.P. Triantis)
15. Benchmarking in Sports: Bonds or Ruth: Determining the Most Dominant Baseball Batter Using DEA (T.R. Anderson)
16. Assessing the Selling Function in Retailing: Insights from Banking, Sales Forces, Restaurants & Betting Shops (A.D. Athanassopoulos)
17. Health Care Applications: From Hospitals to Physicians, From Productive Efficiency to Quality Frontiers (J.A. Chilingerian, H.D. Sherman)
18. DEA Software Tools and Technology: A State-of-the-Art Survey (R. Barr)
Notes about Authors. Author Index. Subject Index.


European Journal of Operational Research | 2002

Modeling undesirable factors in efficiency evaluation

Lawrence M. Seiford; Joe Zhu

Data envelopment analysis (DEA) measures the relative efficiency of decision making units (DMUs) with multiple performance factors, which are grouped into outputs and inputs. Once the efficient frontier is determined, inefficient DMUs can improve their performance to reach the frontier by either increasing their current output levels or decreasing their current input levels. However, both desirable (good) and undesirable (bad) factors may be present. For example, if inefficiency exists in production processes where final products are manufactured along with wastes and pollutants, the wastes and pollutants are undesirable outputs and should be reduced to improve the performance. Using the classification invariance property, we show that the standard DEA model can be used to improve performance by increasing the desirable outputs and decreasing the undesirable outputs. The method can also be applied to situations in which some inputs need to be increased to improve performance. The linearity and convexity of DEA are preserved under our approach.
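
The abstract describes a data transformation rather than a new model. As a rough illustration (a minimal sketch with made-up data, assuming an output-oriented VRS envelopment model solved with scipy; not the paper's own code), each undesirable output can be flipped and translated so that a standard DEA run increases the desirable outputs while implicitly decreasing the undesirable ones:

```python
# Sketch: translate undesirable outputs y_b to y_b_bar = w - y_b (w keeps the data positive),
# then solve an ordinary output-oriented VRS (BCC) envelopment model. Data are illustrative only.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 3.0, 4.0, 5.0]])           # inputs, shape (m, n)
Y_good = np.array([[10.0, 12.0, 11.0, 15.0]])  # desirable outputs
Y_bad = np.array([[4.0, 2.0, 5.0, 3.0]])       # undesirable outputs

w = Y_bad.max() + 1.0                          # translation vector (a scalar here)
Y = np.vstack([Y_good, w - Y_bad])             # translated data: bad outputs flipped
m, n = X.shape
s = Y.shape[0]

def bcc_output_efficiency(o):
    """Output-oriented VRS score phi for DMU o (phi = 1 means on the frontier)."""
    # decision variables: [phi, lambda_1, ..., lambda_n]
    c = np.r_[-1.0, np.zeros(n)]                       # maximize phi
    A_in = np.hstack([np.zeros((m, 1)), X])            # sum_j lambda_j x_ij <= x_io
    A_out = np.hstack([Y[:, [o]], -Y])                 # phi*y_ro - sum_j lambda_j y_rj <= 0
    A_eq = np.r_[0.0, np.ones(n)].reshape(1, -1)       # VRS convexity: sum_j lambda_j = 1
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[X[:, o], np.zeros(s)],
                  A_eq=A_eq, b_eq=[1.0], method="highs")
    return res.x[0]

for o in range(n):
    print(f"DMU {o}: phi* = {bcc_output_efficiency(o):.3f}")
```

Since the translated bad outputs enter as ordinary outputs, a larger phi pushes the good outputs up and, after undoing the translation, the bad outputs down; roughly, the convexity constraint is what keeps the efficient/inefficient classification unchanged under the translation, which is the classification invariance the abstract refers to.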


European Journal of Operational Research | 2000

Multi-factor performance measure model with an application to Fortune 500 companies

Joe Zhu

The paper develops tools for reconciling diverse measures that characterize the financial performance of the Fortune 500 companies. Data envelopment analysis (DEA) is employed to build a multi-factor financial performance model that inherently recognizes tradeoffs among various financial measures. This study offers an alternative perspective on and characterization of the performance of the Fortune 500 companies. It is shown that the companies ranked highest by revenue do not necessarily have the highest performance when performance is viewed as multidimensional. Only about 3% of the companies were operating on the best-practice frontier. Substantial technical and scale inefficiencies are found. Decreasing returns to scale (DRS) are uncovered among the relatively large (revenue-top-ranked) companies. The study of congestion shows that a reduction in current levels of employees, assets and equity may actually increase revenue and profit levels. Factor-specific measures, within the multidimensional framework, are developed to further study the performance of companies and industries. The performance of best-practice frontier companies is analyzed by constructing reference-share measures, which indicate the role each frontier company plays in evaluating non-frontier companies. Finally, the reliability of the best-practice frontier is examined.
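
For reference, the standard notation behind phrases like "technical and scale inefficiencies" (this is the textbook CCR/BCC formulation, not anything specific to this paper):

```latex
% Input-oriented CRS (CCR) envelopment score of company o:
\[
\theta^{CRS}_{o}=\min\Bigl\{\theta \;:\; \sum_{j}\lambda_{j}x_{ij}\le\theta x_{io}\ \forall i,\;
\sum_{j}\lambda_{j}y_{rj}\ge y_{ro}\ \forall r,\; \lambda_{j}\ge 0\Bigr\}
\]
% The VRS (BCC) score adds the convexity constraint sum_j lambda_j = 1;
% scale efficiency is then the ratio of the two scores:
\[
\mathrm{SE}_{o}=\theta^{CRS}_{o}\big/\theta^{VRS}_{o}\le 1
\]
```

A company can thus be technically efficient under VRS yet scale-inefficient, which is the distinction the abstract draws on.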


Information Technology & Management | 2004

Measuring Information Technology's Indirect Impact on Firm Performance

Yao Chen; Joe Zhu

It has been recognized that the link between information technology (IT) investment and firm performance is indirect, due to the effect of mediating and moderating variables. For example, in the banking industry, the IT value-added activity helps to generate funds from customers in the form of deposits. Profits are then generated by using the deposits as a source of investment funds. Traditional efficiency models, such as data envelopment analysis (DEA), can only measure the efficiency of one specific stage when a two-stage production process is present. We develop an efficiency model that identifies the efficient frontier of a two-stage production process linked by intermediate measures. A set of firms in the banking industry is used to illustrate how the new model can be utilized to (i) characterize the indirect impact of IT on firm performance, (ii) identify the efficient frontier of the two principal value-added stages related to IT investment and profit generation, and (iii) highlight the firms that can be further analyzed for best-practice benchmarking.
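
Schematically (my paraphrase of the two-stage structure described above, with v, eta and u denoting multiplier weights on the inputs x, intermediate measures z and final outputs y):

```latex
% Stage 1 (IT value added):    inputs x (IT budget, labor, ...)  ->  intermediate measure z (deposits)
% Stage 2 (profit generation): intermediate measure z (deposits) ->  outputs y (profit)
\[
e^{1}_{o}=\frac{\sum_{d}\eta_{d}z_{do}}{\sum_{i}v_{i}x_{io}},\qquad
e^{2}_{o}=\frac{\sum_{r}u_{r}y_{ro}}{\sum_{d}\eta_{d}z_{do}}
\]
```

A conventional DEA model would score either stage on its own; the point of the paper is to locate the frontier of the linked process, so that the intermediate measure (deposits) is treated consistently as an output of stage 1 and an input to stage 2.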


European Journal of Operational Research | 2009

Additive efficiency decomposition in two-stage DEA

Yao Chen; Wade D. Cook; Ning Li; Joe Zhu

Kao and Hwang (2008) [Kao, C., Hwang, S.-N., 2008. Efficiency decomposition in two-stage data envelopment analysis: An application to non-life insurance companies in Taiwan. European Journal of Operational Research 185 (1), 418-429] develop a data envelopment analysis (DEA) approach for measuring the efficiency of decision processes which can be divided into two stages. The first stage uses inputs to generate outputs which become the inputs to the second stage. These first stage outputs are referred to as intermediate measures. The second stage then uses the intermediate measures to produce outputs. Kao and Hwang represent the efficiency of the overall process as the product of the efficiencies of the two stages. A major limitation of this model is its applicability to only constant returns to scale (CRS) situations. The current paper develops an additive efficiency decomposition approach wherein the overall efficiency is expressed as a (weighted) sum of the efficiencies of the individual stages. This approach can be applied under both CRS and variable returns to scale (VRS) assumptions. The case of Taiwanese non-life insurance companies is revisited using this newly developed approach.
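
In the same multiplier notation as above (a paraphrase of the CRS case; the paper gives the exact formulation and its VRS extension), the additive decomposition can be sketched as:

```latex
% Overall efficiency as a weighted sum of the stage efficiencies, with the
% weights reflecting each stage's share of the total "size" of the process:
\[
e_{o}=w_{1}e^{1}_{o}+w_{2}e^{2}_{o},\qquad
w_{1}=\frac{\sum_{i}v_{i}x_{io}}{\sum_{i}v_{i}x_{io}+\sum_{d}\eta_{d}z_{do}},\qquad
w_{2}=\frac{\sum_{d}\eta_{d}z_{do}}{\sum_{i}v_{i}x_{io}+\sum_{d}\eta_{d}z_{do}}
\]
% With these weights the objective collapses to a single ratio,
\[
e_{o}=\frac{\sum_{d}\eta_{d}z_{do}+\sum_{r}u_{r}y_{ro}}{\sum_{i}v_{i}x_{io}+\sum_{d}\eta_{d}z_{do}},
\]
% which a Charnes-Cooper change of variables turns into an ordinary linear program.
```

Unlike the multiplicative (product) decomposition of Kao and Hwang, this weighted sum remains tractable under VRS as well, which is the extension the paper develops.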


Archive | 2011

Data Envelopment Analysis: History, Models, and Interpretations

William W. Cooper; Lawrence M. Seiford; Joe Zhu

In about 30 years, Data Envelopment Analysis (DEA) has grown into a powerful quantitative, analytical tool for measuring and evaluating performance. DEA has been successfully applied to a host of different types of entities engaged in a wide variety of activities in many contexts worldwide. This chapter discusses the basic DEA models and some of their extensions.


Annals of Operations Research | 2006

DEA models for supply chain efficiency evaluation

Liang Liang; Feng Yang; Wade D. Cook; Joe Zhu

An appropriate performance measurement system is an important requirement for the effective management of a supply chain. Two hurdles are present in measuring the performance of a supply chain and its members. One is the existence of multiple measures that characterize the performance of chain members, and for which data must be acquired; the other is the existence of conflicts between the members of the chain with respect to specific measures. Conventional data envelopment analysis (DEA) cannot be employed directly to measure the performance of a supply chain and its members, because of the intermediate measures connecting the supply chain members. In this paper it is shown that a supply chain can be deemed efficient while its members are inefficient in DEA terms. The current study develops several DEA-based approaches for characterizing and measuring supply chain efficiency when intermediate measures are incorporated into the performance evaluation. The models are illustrated in a seller-buyer supply chain context, where the relationship between the seller and buyer is treated first as one of leader-follower, and second as cooperative. In the leader-follower structure, the leader is evaluated first, and then the follower is evaluated using information related to the leader's efficiency. In the cooperative structure, the joint efficiency, modelled as the average of the seller's and buyer's efficiency scores, is maximized, and both supply chain members are evaluated simultaneously. Non-linear programming problems are developed to solve these new supply chain efficiency models. It is shown that these DEA-based non-linear programs can be treated as parametric linear programming problems, and best solutions can be obtained via a heuristic technique. The approaches are demonstrated with a numerical example.
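
Read loosely (my paraphrase, not the paper's exact formulation), the cooperative model maximizes the average of the two members' DEA ratio efficiencies while pricing the intermediate measures with a common weight vector:

```latex
% Cooperative (joint) model, in generic multiplier notation:
\[
\max_{v,\eta,u}\ \tfrac{1}{2}\Bigl(
\underbrace{\frac{\sum_{d}\eta_{d}z_{do}}{\sum_{i}v_{i}x_{io}}}_{\text{seller efficiency}}
+\underbrace{\frac{\sum_{r}u_{r}y_{ro}}{\sum_{d}\eta_{d}z_{do}}}_{\text{buyer efficiency}}
\Bigr)
\]
% subject to the usual DEA ratio constraints for every DMU in both stages, with
% the intermediate measures z priced by a single common weight vector eta.
```

The coupling through the intermediate-measure term is what makes the program non-linear; as I read the abstract, fixing that term to a parameter value and searching over the parameter is the spirit of the parametric-LP heuristic mentioned there.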


Infor | 1999

Infeasibility Of Super-Efficiency Data Envelopment Analysis Models

Lawrence M. Seiford; Joe Zhu

The paper investigates the infeasibility of super-efficiency data envelopment analysis (DEA) models, in which the unit under evaluation is excluded from the reference set. Necessary and sufficient conditions are provided for infeasibility of the super-efficiency DEA measures. Using the returns to scale (RTS) classifications obtained from the standard DEA model, we can further locate the position of the unit under evaluation when infeasibility occurs. It is shown that a complete ranking of the set of efficient DMUs is impossible because of the infeasibility of super-efficiency DEA models. We are also able to identify the endpoint positions of the extreme efficient units. The results are useful for sensitivity analysis of efficiency classifications.
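
As a toy illustration of the infeasibility phenomenon (my own sketch with made-up data; the paper itself establishes the general necessary and sufficient conditions), the input-oriented VRS super-efficiency LP below is infeasible for the DMU whose output no convex combination of the remaining units can match:

```python
# Sketch: input-oriented VRS super-efficiency. DMU o is removed from the reference set,
# so the LP can be infeasible, which is the situation the paper studies. Illustrative data:
# DMU 2's output (5) exceeds anything the other units can reach, so its LP is infeasible.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 4.0, 8.0],     # inputs,  shape (m, n)
              [3.0, 1.0, 1.0]])
Y = np.array([[1.0, 2.0, 5.0]])    # outputs, shape (s, n)
m, n = X.shape
s = Y.shape[0]

def super_efficiency(o):
    keep = [j for j in range(n) if j != o]          # reference set without DMU o
    Xr, Yr = X[:, keep], Y[:, keep]
    k = len(keep)
    # decision variables: [theta, lambda_1, ..., lambda_k]
    c = np.r_[1.0, np.zeros(k)]                             # minimize theta
    A_in = np.hstack([-X[:, [o]], Xr])                      # sum lambda*x <= theta*x_o
    A_out = np.hstack([np.zeros((s, 1)), -Yr])              # sum lambda*y >= y_o
    A_eq = np.r_[0.0, np.ones(k)].reshape(1, -1)            # VRS convexity
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[np.zeros(m), -Y[:, o]],
                  A_eq=A_eq, b_eq=[1.0], method="highs")
    return res.x[0] if res.status == 0 else None            # None => infeasible

for o in range(n):
    score = super_efficiency(o)
    print(f"DMU {o}: {'infeasible' if score is None else f'theta* = {score:.3f}'}")
```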


European Journal of Operational Research | 2004

Returns to scale in different DEA models

Rajiv D. Banker; William W. Cooper; Lawrence M. Seiford; Robert M. Thrall; Joe Zhu

This paper discusses returns to scale (RTS) in data envelopment analysis (DEA) for each of the presently available types of models. The BCC and CCR models are treated in input-oriented forms, while the multiplicative model is treated in output-oriented form. (This distinction is not pertinent for the additive model, which simultaneously maximizes outputs and minimizes inputs in the sense of a vector optimization.) Quantitative estimates in the form of scale elasticities are treated in the context of multiplicative models, but the bulk of the discussion is confined to qualitative characterizations, such as whether RTS is identified as increasing, decreasing or constant. This is discussed for each type of model, and relations between the results for the different models are established. The opening section describes and delimits the approaches to be examined. The concluding section outlines further opportunities for research.
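
One widely used qualitative RTS test, stated here from memory as background rather than quoted from the paper: solve the input-oriented CCR envelopment model for an efficient DMU and inspect the sum of the optimal intensity weights.

```latex
% Qualitative RTS characterization from the CCR envelopment intensity weights lambda*:
\[
\sum_{j}\lambda^{\ast}_{j}=1\ \text{in some optimum}\;\Rightarrow\;\text{constant RTS},\qquad
\sum_{j}\lambda^{\ast}_{j}<1\ \text{in all optima}\;\Rightarrow\;\text{increasing RTS},\qquad
\sum_{j}\lambda^{\ast}_{j}>1\ \text{in all optima}\;\Rightarrow\;\text{decreasing RTS}
\]
```

The caveat about alternative optima is the kind of issue that has to be handled when relating the BCC, CCR, multiplicative and additive treatments of RTS.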


European Journal of Operational Research | 2003

Imprecise data envelopment analysis (IDEA): A review and improvement with an application

Joe Zhu

The standard data envelopment analysis (DEA) method requires that the values of all inputs and outputs be known exactly. When some outputs and inputs are unknown decision variables, such as bounded data, ordinal data, and ratio-bounded data, the DEA model becomes a non-linear programming problem and is called imprecise DEA (IDEA). There are two different approaches to dealing with imprecise outputs and inputs. One uses scale transformations and variable alterations to convert the non-linear IDEA model into a linear program. The other converts the imprecise data into exact data and then uses the standard linear DEA model. The current paper reviews and compares the two approaches through an efficiency analysis of a set of telephone offices. A simplified approach is developed to reduce the computational burden when the first approach is used. The treatment of weight restrictions in IDEA is discussed, and it is shown that weight restrictions on imprecise data are redundant. New developments and improvements to both approaches are provided.
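
As a sketch of the first approach (my paraphrase of the linearization idea; the paper gives the precise construction), start from the multiplier form of CCR with the imprecise data themselves appearing as bounded decision variables:

```latex
% Multiplier (ratio) form of CCR with imprecise data x_ij, y_rj as bounded unknowns:
\[
\max\ \sum_{r}u_{r}y_{ro}\quad\text{s.t.}\quad
\sum_{i}v_{i}x_{io}=1,\qquad
\sum_{r}u_{r}y_{rj}-\sum_{i}v_{i}x_{ij}\le 0\ \ \forall j,\qquad
\underline{y}_{rj}\le y_{rj}\le\overline{y}_{rj},\quad
\underline{x}_{ij}\le x_{ij}\le\overline{x}_{ij}
\]
% The products u_r y_rj and v_i x_ij of unknowns make this non-linear.
% Variable alteration sets Y_rj = u_r y_rj and X_ij = v_i x_ij, so the bound
% constraints become u_r * lower <= Y_rj <= u_r * upper (linear in u_r and Y_rj);
% combined with a scale transformation of the raw data this yields a linear program.
```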

Collaboration


Dive into Joe Zhu's collaborations.

Top Co-Authors

Yao Chen
University of Massachusetts Amherst

Liang Liang
University of Science and Technology of China

William W. Cooper
University of Texas at Austin