Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Sebastián Lozano is active.

Publication


Featured research published by Sebastián Lozano.


Journal of Productivity Analysis | 2004

Centralized resource allocation using data envelopment analysis

Sebastián Lozano; Gabriel Villa

While conventional DEA models set targets separately for each DMU, in this paper we consider that there is a centralized decision maker (DM) who “owns” or supervises all the operating units. In such an intra-organizational scenario, the DM has an interest in maximizing the efficiency of the individual units while total input consumption is minimized or total output production is maximized. Two new DEA models are presented for such resource allocation. One type of model seeks radial reductions of the total consumption of every input while the other type seeks separate reductions for each input according to a preference structure. In both cases, total output production is guaranteed not to decrease. The two key features of the proposed models are their simplicity and the fact that both of them project all DMUs onto the efficient frontier. The dual formulation shows that optimizing total input consumption and output production is equivalent to finding weights that maximize the relative efficiency of a virtual DMU with average inputs and outputs. A graphical interpretation as well as numerical results of the proposed models are presented.
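The abstract's dual interpretation (a virtual DMU with average inputs and outputs) can be illustrated in the simplest DEA setting. In the single-input, single-output case under constant returns to scale, DEA efficiency reduces to each DMU's productivity ratio divided by the best observed ratio; the sketch below uses hypothetical data and is only the conventional baseline the paper's centralized models generalize, not the paper's own models.

```python
# Single-input, single-output DEA under constant returns to scale:
# efficiency of each DMU is its output/input ratio relative to the
# best ratio in the sample. Data is hypothetical.

def dea_efficiencies(inputs, outputs):
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

inputs = [2.0, 4.0, 5.0]    # resource consumed by DMUs A, B, C
outputs = [4.0, 10.0, 8.0]  # product generated by DMUs A, B, C

effs = dea_efficiencies(inputs, outputs)

# Efficiency of the "virtual average DMU" from the dual interpretation:
# a unit with average inputs and average outputs, assessed in the same set.
avg_in = sum(inputs) / len(inputs)
avg_out = sum(outputs) / len(outputs)
avg_eff = dea_efficiencies(inputs + [avg_in], outputs + [avg_out])[-1]
```

Note that the average DMU's ratio can never exceed the best individual ratio, so appending it does not shift the frontier.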


European Journal of Operational Research | 2013

Cooperative game theory approach to allocating benefits of horizontal cooperation

Sebastián Lozano; Plácido Moreno; Belarmino Adenso-Díaz; E. Algaba

Logistics costs in general, and transportation costs in particular, represent a large fraction of the operating costs of many companies. One way to try to reduce these costs is through horizontal cooperation among shippers. Thus, when the transportation needs of two or more companies are merged, their collective transportation requirements can be met at lower cost. The attainable cost savings are due to economies of scale, which translate into cheaper rates due to increased negotiation power, use of larger vehicles and bundling of shipments. In this paper, a linear model is presented and used to study the cost savings that different companies may achieve when they merge their transportation requirements. On the one hand, solving this optimization model for different collaboration scenarios allows testing and quantifying the synergies among different potential partners, thus identifying the most profitable collaboration opportunities. On the other hand, the problem of allocating the joint cost savings of the cooperation is tackled using cooperative game theory. The proposed approach is illustrated with an example in which different cooperative game solution concepts are compared. Extensive numerical experiments have also been carried out to gain insight into the properties of the corresponding cost savings game and the behavior of the different solution concepts.
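One of the classical solution concepts compared in such cost-savings games is the Shapley value, which averages each player's marginal contribution over all join orders. The sketch below computes it for a hypothetical three-shipper game; the characteristic function values are invented for illustration and are not from the paper.

```python
import math
from itertools import permutations

# v(S): joint cost savings a coalition S of shippers achieves by merging
# its transportation requirements (hypothetical figures).
v = {
    frozenset(): 0,
    frozenset('A'): 0, frozenset('B'): 0, frozenset('C'): 0,
    frozenset('AB'): 20, frozenset('AC'): 30, frozenset('BC'): 40,
    frozenset('ABC'): 60,
}

def shapley(players, v):
    """Average each player's marginal contribution over all arrival orders."""
    phi = {p: 0.0 for p in players}
    for order in permutations(players):
        coalition = frozenset()
        for p in order:
            phi[p] += v[coalition | {p}] - v[coalition]
            coalition = coalition | {p}
    n_fact = math.factorial(len(players))
    return {p: total / n_fact for p, total in phi.items()}

alloc = shapley(['A', 'B', 'C'], v)
```

By construction the allocation is efficient: the three shares sum to v(ABC), the grand-coalition savings.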


Computers & Operations Research | 2011

Slacks-based measure of efficiency of airports with airplanes delays as undesirable outputs

Sebastián Lozano; Ester Gutiérrez

This paper reports the slacks-based measure (SBM) of efficiency of 39 Spanish airports for the years 2006 and 2007. In addition to the conventional outputs (namely aircraft traffic movements, passenger movements and cargo handled), two undesirable outputs have been considered: percentage of delayed flights and average conditional delay of delayed flights. The inputs considered quantify the physical infrastructure of the airports and are considered non-discretionary. The proposed Data Envelopment Analysis (DEA) approach assumes variable returns to scale and joint weak disposability of the desirable and undesirable outputs. The SBM model used has been found to have more discriminatory power than the common directional distance function approach. Also, the inclusion in the analysis of the undesirable effects of airport operations leads to more valid results. The results show that in both years more than half of the airports are technically efficient, while the rest generally show large inefficiencies due to slacks in the different outputs, slacks that the proposed SBM approach is able to identify and quantify. Overall, the system has significant improvement potential in cargo and, to a lesser extent, in passengers and percentage of delayed flights.


Science of The Total Environment | 2009

The link between operational efficiency and environmental impacts: A joint application of Life Cycle Assessment and Data Envelopment Analysis

Sebastián Lozano; Diego Iribarren; Ma Teresa Moreira; Gumersindo Feijoo

Life Cycle Assessment (LCA) allows the estimation of the environmental impacts of a process or product. Those environmental impacts depend on the efficiency with which operations are carried out. In the case that LCA data are available for multiple similar installations, their respective operational performances can be benchmarked and links between operational efficiency and environmental impacts can be established. In this paper, this possibility is illustrated with a case study on LCA of mussel cultivation in rafts. For each site (raft) both its input consumption and mussel production are known. A separate LCA of each site has been performed and its corresponding environmental impacts have been estimated. Using Data Envelopment Analysis (DEA) on the input/output data allows computing the relative efficiency of each mussel raft and setting appropriate efficiency targets. The DEA targets represent virtual cultivation sites, which consume less input and/or produce more output. Performing an LCA study for each of these virtual cultivation sites and comparing their environmental impacts are used to estimate the environmental consequences of operational inefficiencies. This direct link can help to convince the managers and operators of the cultivation sites of the double dividend of reducing input consumption and achieving operational efficiency: lower costs and lower environmental impacts.


Journal of the Operational Research Society | 2002

Measuring the performance of nations at the Summer Olympics using data envelopment analysis

Sebastián Lozano; Gabriel Villa; Fernando Guerrero; Pablo Cortés

In this paper a well known tool for relative efficiency assessment, namely Data Envelopment Analysis (DEA), is used to measure the performance of the nations participating at the last five Summer Olympic games. The proposed approach considers two inputs (GNP and population) and three outputs (number of gold, silver and bronze medals won). To increase the consistency of the results, weight restrictions are included, guaranteeing a higher valuation for gold medals than for silver medals and higher for the latter than for bronze medals. Variable returns to scale are assumed. The results for the last five Summer Olympics are analysed. For each of them, a performance index as well as benchmarks are computed for each country. In addition, plotting the performance of a specific country for the different games can help identify trends as well as objective successes and disappointments.
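The weight restrictions described above (gold valued at least as much as silver, silver at least as much as bronze) can be illustrated with a crude grid approximation of the ratio-form DEA score. The sketch below enumerates a few admissible weight vectors instead of solving the LP, uses a single aggregate input, and invents all countries and figures; it is only an illustration of the restricted-weight idea, not the paper's model.

```python
# Crude grid approximation of weight-restricted ratio-form DEA:
# each country picks, from an admissible grid with w_gold >= w_silver
# >= w_bronze, the weights that make it look best relative to the top
# performer. Countries and figures are hypothetical.

medals = {              # (gold, silver, bronze)
    'X': (10, 5, 3),
    'Y': (4, 12, 9),
    'Z': (2, 3, 15),
}
resources = {'X': 8.0, 'Y': 6.0, 'Z': 5.0}  # single aggregate input

# Every vector satisfies w_gold >= w_silver >= w_bronze.
weight_grid = [(3, 2, 1), (4, 2, 1), (2, 2, 1), (1, 1, 1), (5, 3, 2)]

def scores():
    best = {c: 0.0 for c in medals}
    for w in weight_grid:
        raw = {c: sum(wi * mi for wi, mi in zip(w, m)) / resources[c]
               for c, m in medals.items()}
        top = max(raw.values())
        for c in medals:
            best[c] = max(best[c], raw[c] / top)  # normalize to the frontier
    return best

perf = scores()
```

By construction at least one country scores exactly 1 under every weight vector, so the maximum score over the sample is always 1.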


European Conference on Parallel Processing | 2002

Parallel Fuzzy c-Means Clustering for Large Data Sets

Terence Kwok; Kate A. Smith; Sebastián Lozano; David Taniar

The parallel fuzzy c-means (PFCM) algorithm for clustering large data sets is proposed in this paper. The proposed algorithm is designed to run on parallel computers of the Single Program Multiple Data (SPMD) model type with the Message Passing Interface (MPI). A comparison is made between PFCM and an existing parallel k-means (PKM) algorithm in terms of their parallelisation capability and scalability. In an implementation of PFCM to cluster a large data set from an insurance company, the proposed algorithm is demonstrated to have almost ideal speedups as well as an excellent scaleup with respect to the size of the data sets.
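The two alternating updates that PFCM parallelizes are the standard fuzzy c-means membership and center updates. Below is a serial pure-Python sketch on toy 1-D data (fuzzifier m = 2); in the SPMD setting each process would hold a slice of `data` and the per-center numerator/denominator sums would be combined across processes (e.g. with an MPI all-reduce). This is the textbook algorithm, not the paper's MPI implementation.

```python
# Serial fuzzy c-means with fuzzifier m = 2 on toy 1-D data.
# Membership update: u_ij = 1 / sum_k (d_ij / d_kj)^(2/(m-1))
# Center update:     c_i = sum_j u_ij^m x_j / sum_j u_ij^m

def fcm(data, centers, m=2.0, iters=20):
    for _ in range(iters):
        u = []
        for x in data:
            d = [abs(x - c) or 1e-12 for c in centers]  # avoid div by zero
            u.append([1.0 / sum((d[i] / d[k]) ** (2 / (m - 1))
                                for k in range(len(centers)))
                      for i in range(len(centers))])
        centers = [sum(u[j][i] ** m * data[j] for j in range(len(data))) /
                   sum(u[j][i] ** m for j in range(len(data)))
                   for i in range(len(centers))]
    return centers, u

data = [0.0, 1.0, 2.0, 10.0, 11.0, 12.0]   # two obvious groups
centers, u = fcm(data, [0.0, 10.0])
```

Only the two sums in the center update depend on the data, which is why distributing the points and reducing those sums yields the near-ideal speedups the paper reports.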


European Journal of Operational Research | 2009

Centralised reallocation of emission permits using DEA

Sebastián Lozano; Gabriel Villa; Runar Brännlund

In this paper a data envelopment analysis (DEA) approach to the problem of emission permits reallocation is presented. It can be used with conventional command and control as well as with an allowance market. It uses a centralized point of view, which represents the common good. In the model it is assumed that firms produce two types of outputs: desirable outputs (i.e. good outputs with positive value for consumers) and undesirable outputs (i.e. bad outputs with negative value for consumers, such as emissions of pollutants). The proposed approach has three phases, which correspond to three objectives that are pursued lexicographically. The three objectives are maximizing aggregated desirable production, minimizing undesirable total emissions and minimizing the consumption of input resources. The relative priority of these objectives is defined by the regulator. The whole approach is units-invariant and does not require information on input and output prices. The approach is applied on a dataset from the Swedish pulp and paper industry.


Soft Computing | 2012

Metaheuristic optimization frameworks: a survey and benchmarking

José Antonio Parejo; Antonio Ruiz-Cortés; Sebastián Lozano; Pablo Fernandez

This paper performs an unprecedented comparative study of metaheuristic optimization frameworks. As criteria for comparison, a set of 271 features grouped in 30 characteristics and 6 areas has been selected. These features include the different metaheuristic techniques covered, mechanisms for solution encoding, constraint handling, neighborhood specification, hybridization, parallel and distributed computation, software engineering best practices, documentation and user interface, etc. A metric has been defined for each feature so that the scores obtained by a framework are averaged within each group of features, leading to a final average score for each framework. Out of 33 frameworks, ten have been selected from the literature using well-defined filtering criteria, and the results of the comparison are analyzed with the aim of identifying improvement areas and gaps in specific frameworks and the whole set. Generally speaking, a significant lack of support has been found for hyper-heuristics, and for parallel and distributed computing capabilities. It is also desirable to have a wider implementation of some software engineering best practices. Finally, wider support for some metaheuristics and hybridization capabilities is needed.


European Journal of Operational Research | 2008

Data envelopment analysis of mutual funds based on second-order stochastic dominance

Sebastián Lozano; Ester Gutiérrez

Although data envelopment analysis (DEA) has been extensively used to assess the performance of mutual funds (MF), most of the approaches overestimate the risk associated with the endogenous benchmark portfolio. This is because in the conventional DEA technology the risk of the target portfolio is computed as a linear combination of the risk of the assessed MF. This neglects the important effects of portfolio diversification. Other approaches based on mean-variance or mean-variance-skewness are non-linear. We propose to combine DEA with stochastic dominance criteria. Thus, in this paper, six distinct DEA-like linear programming (LP) models are proposed for computing relative efficiency scores consistent (in the sense of necessity) with second-order stochastic dominance (SSD). The aim is that, being SSD efficient, the obtained target portfolio should be an optimal benchmark for any rational risk-averse investor. The proposed models are compared with several related approaches from the literature.
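The SSD criterion itself has a simple discrete form: for two prospects with the same number of equally likely outcomes, X dominates Y by SSD iff every partial sum of X's sorted outcomes is at least the corresponding partial sum of Y's. The sketch below implements that pairwise test on invented return series; it illustrates the dominance concept only, not the paper's six LP models.

```python
# Pairwise SSD test for equal-length, equiprobable discrete outcomes:
# X >=_SSD Y  iff  sum of the k smallest outcomes of X is >= that of Y
# for every k (so every risk-averse investor weakly prefers X).

def ssd_dominates(x, y):
    xs, ys = sorted(x), sorted(y)
    cx = cy = 0.0
    for a, b in zip(xs, ys):
        cx += a
        cy += b
        if cx < cy:
            return False
    return True

fund_a = [0.02, 0.05, 0.08]    # hypothetical period returns
fund_b = [-0.01, 0.05, 0.08]   # same upside, worse downside

a_over_b = ssd_dominates(fund_a, fund_b)
b_over_a = ssd_dominates(fund_b, fund_a)
```

Here `fund_a` dominates `fund_b` because it improves the worst outcome without giving anything up, which is exactly the kind of comparison the SSD-consistent efficiency scores encode.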


Computers & Industrial Engineering | 2001

Machine cell formation in generalized group technology

Belarmino Adenso-Díaz; Sebastián Lozano; Jesús Racero; Fernando Guerrero

In this paper, we study the configuration of machine cells in the presence of alternative routings for part types. The objective function is the minimization of transportation costs. Limits on cell sizes as well as separation constraints (i.e. machines that are not allowed to be placed in the same cell) and co-location constraints (i.e. machines that must be placed in the same cell) may be imposed. An efficient Tabu Search (TS) algorithm is proposed to solve this problem. Extensive computational experiments with large-size problems show that this method outperforms some existing Simulated Annealing (SA) approaches.
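The basic tabu search mechanics (neighborhood of single-machine reassignments, a tabu list of reversed moves, an aspiration criterion) can be sketched on a tiny invented instance. The code below minimizes inter-cell part moves under a cell-size limit; the instance, move neighborhood and parameters are illustrative assumptions, far simpler than the paper's algorithm with alternative routings.

```python
# Toy cell-formation instance: assign 4 machines to 2 cells (max size 3)
# so that parts visiting several machines cross as few cells as possible.

parts = [{0, 1}, {0, 1}, {2, 3}, {1, 2}]   # machines visited by each part
n_machines, n_cells, max_size = 4, 2, 3

def cost(assign):
    # each part pays one inter-cell move per extra cell its route spans
    return sum(len({assign[m] for m in p}) - 1 for p in parts)

def tabu_search(start, iters=30, tenure=3):
    assign = start[:]
    best, best_cost = assign[:], cost(assign)
    tabu = {}                  # (machine, cell) -> iteration when move is freed
    for it in range(iters):
        candidates = []
        for m in range(n_machines):
            for c in range(n_cells):
                if c == assign[m] or sum(a == c for a in assign) >= max_size:
                    continue   # skip no-ops and moves violating the size limit
                cand = assign[:]
                cand[m] = c
                candidates.append((cost(cand), m, c, cand))
        candidates.sort(key=lambda t: t[0])
        for cst, m, c, cand in candidates:
            # take the best non-tabu move; aspiration overrides the tabu
            # status when the move improves on the best solution found
            if tabu.get((m, c), -1) <= it or cst < best_cost:
                tabu[(m, assign[m])] = it + tenure  # forbid moving m back
                assign = cand
                if cst < best_cost:
                    best, best_cost = cand[:], cst
                break
    return best, best_cost

best, best_cost = tabu_search([0, 1, 0, 1])  # deliberately poor start, cost 4
```

Because tabu search always moves to the best admissible neighbor even when it worsens the current solution, it can leave the local optima that a pure descent method would get stuck in.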

Collaboration


Dive into Sebastián Lozano's collaborations.

Top Co-Authors
