Niels Christian Petersen
University of Southern Denmark
Publications
Featured research published by Niels Christian Petersen.
Journal of Productivity Analysis | 2003
Ole Bent Olesen; Niels Christian Petersen
This paper provides an outline of possible uses of complete information on the facial structure of a polyhedral empirical production possibility set obtained by DEA. It is argued that an identification of all facets can be used for a characterization of basic properties of the empirical production frontier. Focus is on the use of this type of information for (i) the specification of constraints on the virtual multipliers in a cone-ratio model, (ii) a characterization of the data generation process for the underlying observed data set, and (iii) the estimation of isoquants and relevant elasticities of substitution reflecting the curvature of the frontier. The relationship between the so-called FDEF approach and the cone-ratio model is explored in some detail. It is demonstrated that a decomposition of the facet generation process, followed by the use of one of the available (exponential) convex hull algorithms, allows for an explicit identification of the facial structure of the possibility set in fairly large DEA data sets. A main point is that the difficulties encountered in identifying all facets of a DEA possibility set can be circumvented in a number of empirical data sets, and that this type of information can be used for a characterization of the structural properties of the frontier.
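The facet identification described above relies on standard convex hull algorithms. As an illustrative sketch only (not the authors' implementation, and restricted to two dimensions for readability), Andrew's monotone chain algorithm identifies the hull vertices, and hence the facets, of a point set:

```python
# Andrew's monotone chain convex hull in 2D: an illustrative stand-in for
# the higher-dimensional hull algorithms referenced in the abstract.
def convex_hull(points):
    """Return hull vertices in counter-clockwise order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        # z-component of (a - o) x (b - o); > 0 means a left turn
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    # Concatenate the two chains, dropping the duplicated endpoints
    return lower[:-1] + upper[:-1]

# Hypothetical data: the interior point (1, 1) is not a hull vertex
hull = convex_hull([(0, 0), (2, 0), (2, 2), (0, 2), (1, 1)])
```

Each consecutive pair of hull vertices spans one facet; in the DEA setting, the facets of the dominant part of the hull characterize the empirical frontier.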
Computers & Operations Research | 1996
Ole Bent Olesen; Niels Christian Petersen
The paper discusses the advantages and drawbacks of using standard vs. specialized optimizers for solving DEA-problems. The use of standard optimizers is recommended. It is argued that GAMS provides a highly appropriate framework for building and solving DEA-models. The successive development of DEA-models, along with the flexible tools for model building and the large number of standard solvers available with GAMS, are the basic arguments in favour of using GAMS rather than a specialized optimizer for DEA-problems. A representation of the CCR-model along with some of its extensions is included as a selection of GAMS statements; the GAMS statements provide a ready-to-use framework for DEA in GAMS.
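As a minimal illustration of the CCR logic (not the GAMS statements from the paper): under constant returns to scale with a single input and a single output, the input-oriented CCR linear program reduces to each DMU's productivity relative to the best observed productivity. The data below are hypothetical.

```python
# Input-oriented CCR efficiency under CRS, in the special single-input,
# single-output case where the LP collapses to a productivity ratio.
def ccr_efficiency(inputs, outputs):
    """Efficiency of each DMU: its productivity relative to the best DMU."""
    productivity = [y / x for x, y in zip(inputs, outputs)]
    best = max(productivity)
    return [p / best for p in productivity]

# Three hypothetical DMUs
scores = ccr_efficiency(inputs=[2.0, 4.0, 5.0], outputs=[1.0, 2.0, 2.0])
# The first two DMUs attain the best productivity and score 1.0;
# the third is relatively inefficient.
```

With multiple inputs and outputs the reduction no longer applies and the full LP (as expressed in GAMS in the paper) must be solved per DMU.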
European Journal of Operational Research | 2016
Ole Bent Olesen; Niels Christian Petersen
This paper provides a review of stochastic Data Envelopment Analysis (DEA). We discuss extensions of deterministic DEA in three directions: (i) deviations from the deterministic frontier are modeled as stochastic variables, (ii) random noise in terms of measurement errors, sample noise, and specification errors is made an integral part of the model, and (iii) the frontier is stochastic as is the underlying Production Possibility Set (PPS).
International Journal of Production Economics | 1995
Ole Bent Olesen; Niels Christian Petersen
The study is concerned with the incorporation of information on differences in quality into DEA. Three approaches for incorporation of quality in efficiency evaluations and corresponding criteria for efficiency are suggested. The three criteria are operationalized for use in empirical analyses in a number of LP-models in the DEA-tradition and the links between the models and the microeconomic production theory are established. A relationship between dominance criteria designed for an evaluation of quality and quantity performance and the notion of weak disposability is shown to exist within the context of estimation of production possibility sets. Preliminary results of an efficiency analysis of public primary schools in Denmark based upon the suggested framework for quality-controlled efficiency evaluations are reported.
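As an illustrative sketch of a dominance criterion of the kind discussed (a simplification with hypothetical data, not the paper's exact operationalization): DMU a dominates DMU b if a produces at least as much of every output, quality indicators included, from no more of any input, with at least one strict inequality.

```python
def dominates(a, b):
    """True if DMU a dominates DMU b. Each DMU is (inputs, outputs);
    outputs may mix quantity measures with quality indicators."""
    ins_a, outs_a = a
    ins_b, outs_b = b
    no_worse = all(x <= y for x, y in zip(ins_a, ins_b)) and \
               all(x >= y for x, y in zip(outs_a, outs_b))
    strictly = any(x < y for x, y in zip(ins_a, ins_b)) or \
               any(x > y for x, y in zip(outs_a, outs_b))
    return no_worse and strictly

# Hypothetical schools: ([budget], [graduates, quality index])
school_a = ([100], [50, 0.9])
school_b = ([100], [50, 0.8])
# school_a dominates school_b: same input, same quantity, higher quality
```

The criterion is a partial order, so two DMUs may be mutually undominated; the paper's LP-models locate each DMU relative to the undominated frontier.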
European Journal of Operational Research | 2015
Ole Bent Olesen; Niels Christian Petersen; Victor V. Podinovski
In applications of data envelopment analysis (DEA) data about some inputs and outputs is often available only in the form of ratios such as averages and percentages. In this paper we provide a positive answer to the long-standing debate as to whether such data could be used in DEA. The problem arises from the fact that ratio measures generally do not satisfy the standard production assumptions, e.g., that the technology is a convex set. Our approach is based on the formulation of new production assumptions that explicitly account for ratio measures. This leads to the estimation of production technologies under variable and constant returns-to-scale assumptions in which both volume and ratio measures are native types of data. The resulting DEA models allow the use of ratio measures “as is”, without any transformation or use of the underlying volume measures. This provides theoretical foundations for the use of DEA in applications where important data are reported in the form of ratios.
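A small numeric illustration (hypothetical data) of why ratio measures violate the standard convexity assumption: a convex combination of two DMUs' ratios generally differs from the ratio formed from the correspondingly combined volumes.

```python
# Two hypothetical DMUs reporting a success rate (ratio) built from volumes
successes = [10, 1]
trials = [20, 100]
ratios = [s / t for s, t in zip(successes, trials)]   # [0.5, 0.01]

# Convex combination (equal weights) applied directly to the ratios:
ratio_of_ratios = 0.5 * ratios[0] + 0.5 * ratios[1]

# Ratio formed from the correspondingly averaged volumes:
ratio_of_volumes = (0.5 * successes[0] + 0.5 * successes[1]) / \
                   (0.5 * trials[0] + 0.5 * trials[1])

# The two disagree, so convexity in the ratio data does not describe a
# feasible production plan -- hence the need for new production axioms.
```

Here the direct combination gives 0.255 while the volume-based ratio is 5.5/60 ≈ 0.092, which is why the paper replaces convexity with assumptions that treat ratio measures as a native data type.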
Journal of Productivity Analysis | 1999
Ole Bent Olesen; Niels Christian Petersen
The paper is concerned with the incorporation of polyhedral cone constraints on the virtual multipliers in DEA. The incorporation of probabilistic bounds on the virtual multipliers based upon a stochastic benchmark vector is demonstrated. The suggested approach involves a stochastic (chance-constrained) programming model with multipliers constrained to the cone spanned by confidence intervals for the components of the stochastic benchmark vector at varying probability levels. For a polyhedral assurance region based upon bounded pairwise ratios between multipliers, it is shown that it is in general impossible to identify a “center-vector”, defined as a vector in the interior of the cone with identical angles to all extreme rays spanning the cone. Smooth cones are suggested if an asymmetric variation in the set of feasible relative prices is to be avoided.
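An illustrative sketch of how such confidence intervals can be formed for one component of a stochastic benchmark vector, assuming a normally distributed component (the distributional assumption and the numbers are mine, for illustration only):

```python
from statistics import NormalDist

def multiplier_bounds(mean, std, level=0.95):
    """Two-sided confidence interval for one component of a stochastic
    benchmark vector, under an illustrative normality assumption."""
    z = NormalDist().inv_cdf(0.5 + level / 2)  # e.g. ~1.96 for 95%
    return (mean - z * std, mean + z * std)

lo, hi = multiplier_bounds(mean=1.0, std=0.2, level=0.95)
# roughly (0.61, 1.39); intervals at varying levels span the multiplier cone
```

Collecting such intervals across components, at varying probability levels, yields the cone to which the multipliers are constrained in the chance-constrained model.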
European Journal of Operational Research | 2017
Ole Bent Olesen; Niels Christian Petersen; Victor V. Podinovski
In a recent paper in this journal, the authors developed a methodology that allows the incorporation of ratio inputs and outputs in the variable and constant returns-to-scale DEA models. Practical evaluation of the efficiency of decision making units (DMUs) in such models generally goes beyond the application of standard linear programming techniques. In this paper we discuss how the DEA models with ratio measures can be solved. We also introduce a new type of potential ratio (PR) inefficiency. It characterizes DMUs that are strongly efficient in the model of technology with ratio measures but become inefficient if the volume data used to calculate the ratio measures become available. Potential ratio inefficiency can be tested by the programming approaches developed in this paper.
Scandinavian Journal of Trauma, Resuscitation and Emergency Medicine | 2012
Dawid Kozlowski; Christian Backer Mogensen; Niels Christian Petersen
Background The ongoing centralization of acute hospitals in Denmark poses a number of challenges. Staff levels must be adjusted not only to fulfil the new rules for specialists’ presence at the Emergency Department (ED), but also to meet the imposed time requirements regarding the first patient contact with a specialist, along with the requirement of making an action diagnosis within 4 hours after arrival. Patient arrivals and pathways are grouped in accordance with the so-called Standardized Time-based Patient Pathways (STPs) under development. The main goal of the project is the development of an analytical tool designed to facilitate qualified decision-making with respect to the dimensioning of the ED, e.g. staffing levels, in view of the imposed time requirements.
Journal of Productivity Analysis | 2001
Niels Christian Petersen
Bouhnik, Golany, Hackman, Passy & Vlatsa (BGHPV) propose in this issue an extension to the basic CCR- and BCC-models such that if an observed DMU is scaled for the construction of a reference unit or a composite DMU, then the scaled DMU must be at least as large as a pre-defined lower bound. The introduction of lower bounds on the scale at which members in a set of observed DMUs operate when defining a composite unit may well be reasonable in some applications, including those mentioned by BGHPV:
Computers & Operations Research | 2018
Troels Martin Range; Dawid Kozlowski; Niels Christian Petersen
The knapsack problem (KP) is concerned with the selection of a subset of items with known positive values and weights such that the total value of the selected items is maximized and their total weight does not exceed a given capacity. Item values, item weights, and capacity are known in the deterministic case. We consider the stochastic KP (SKP) with stochastic item weights. For this variant of the SKP we combine the chance-constrained KP (CCKP) and the SKP with simple recourse (SRKP). The chance constraint allows for a violation of capacity, but the probability of a violation beyond an imposed limit is constrained. The violation of the capacity constraint is also included in the objective function in terms of a penalty function, as in the SRKP. The penalty is an increasing function of the expected number of units of violation, with proportionality as a special case. We formulate the SKP as a network problem and demonstrate that it can be solved by a label-setting dynamic programming approach for the shortest path problem with resource constraints (SPPRC). We develop a dominance criterion for the elimination of states in the dynamic programming approach, using only the deterministic value of items along with the mean and variance of the stochastic weight of items corresponding to the associated paths in the underlying network. It is shown that a lower bound for the impact of potential extensions of paths is available as an additional means to limit the number of states, provided the penalty cost of expected overtime is convex. Our findings are documented in a computational study.
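A minimal sketch of the chance-constraint component for a candidate item selection, assuming independent, normally distributed item weights (a common modelling choice, stated here as my assumption for illustration; the data are hypothetical):

```python
from math import sqrt
from statistics import NormalDist

def capacity_chance_ok(mean_weights, var_weights, capacity, alpha=0.05):
    """Check P(total weight > capacity) <= alpha for a selected item set,
    assuming independent normal item weights (illustrative assumption).
    Deterministic equivalent: mu + z_{1-alpha} * sigma <= capacity."""
    mu = sum(mean_weights)
    sigma = sqrt(sum(var_weights))      # independence: variances add
    z = NormalDist().inv_cdf(1 - alpha)
    return mu + z * sigma <= capacity

# Three selected items with mean weights 3, 4, 2 and variance 0.25 each:
ok_large = capacity_chance_ok([3, 4, 2], [0.25] * 3, capacity=11)   # feasible
ok_tight = capacity_chance_ok([3, 4, 2], [0.25] * 3, capacity=10)   # infeasible
```

In the label-setting approach of the abstract, the mean and variance sums are exactly the resources carried along each path, which is what makes a mean/variance-based dominance criterion possible.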