Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Julio López is active.

Publication


Featured research published by Julio López.


Expert Systems With Applications | 2013

Support vector machine under uncertainty: An application for hydroacoustic classification of fish-schools in Chile

Paul Bosch; Julio López; Hector Ramirez; Hugo Robotham

In this work we apply multi-class support vector machines (SVMs) and a multi-class stochastic SVM formulation to the classification of fish schools of three species, anchovy, common sardine, and jack mackerel, and we compare their performance. The data come from acoustic measurements in southern-central Chile. The classifications were carried out using a diverse set of descriptors including morphology, bathymetry, energy, and spatial position. In both types of formulations, deterministic and stochastic, the multi-class SVM strategy employs the one-species-against-the-rest criterion. We also provide an empirical way to adjust the parameters involved in the stochastic classifiers with the aim of improving their performance. When this procedure is applied to the classification of fish schools, we obtain a classifier that outperforms the deterministic one.
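
A minimal sketch of the one-species-against-the-rest strategy with a standard (deterministic) multi-class SVM in scikit-learn. The synthetic features, class labels, and hyperparameters below are placeholders; the paper's stochastic SVM formulation and its acoustic descriptors are not reproduced here.

```python
# One-vs-rest multi-class SVM on synthetic stand-in data; the six features
# stand in for morphological/bathymetric/energetic descriptors.
import numpy as np
from sklearn.svm import SVC
from sklearn.multiclass import OneVsRestClassifier
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 6))        # hypothetical descriptors per fish school
y = rng.integers(0, 3, size=300)     # 3 classes: anchovy, common sardine, jack mackerel

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# One binary SVM per species against the rest, behind a feature scaler.
model = make_pipeline(StandardScaler(),
                      OneVsRestClassifier(SVC(kernel="rbf", C=1.0, gamma="scale")))
model.fit(X_tr, y_tr)
print("held-out accuracy:", model.score(X_te, y_te))
```

In a real application the synthetic matrix would be replaced by the measured descriptors, and C and gamma would be tuned by cross-validation.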


Expert Systems With Applications | 2016

A second-order cone programming formulation for nonparallel hyperplane support vector machine

Miguel Carrasco; Julio López; Sebastián Maldonado

Highlights: a novel robust SVM approach based on second-order cone programming; an extension of the Nonparallel Hyperplane SVM method; a geometrically grounded method based on the concept of ellipsoids; superior classification performance in experiments on benchmark datasets.

Expert systems often rely heavily on the performance of binary classification methods. The need for accurate predictions in artificial intelligence has led to a plethora of novel approaches that aim at correctly predicting new instances with nonlinear classifiers. In this context, Support Vector Machine (SVM) formulations based on two nonparallel hyperplanes have received increasing attention due to their superior performance. In this work, we propose a novel formulation of the Nonparallel Hyperplane SVM method. Its main contribution is the use of robust optimization techniques to construct nonlinear models with superior performance and appealing geometrical properties. Experiments on benchmark datasets demonstrate its virtues in terms of predictive performance compared with various other SVM formulations. Managerial insights and the relevance for intelligent systems are discussed based on the experimental outcomes.
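
For context, second-order cone programming typically enters SVM classification by summarizing each class with its mean and a covariance factor and enforcing a worst-case margin. The sketch below is a generic binary SOCP-SVM of that kind, written with cvxpy on synthetic data; it is not the nonparallel-hyperplane formulation proposed in the paper, and kappa is an illustrative choice.

```python
# Generic binary SOCP-SVM sketch: each class is summarized by its mean and
# a Cholesky factor of its covariance, and the margin must hold under a
# second-order cone constraint.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(1)
X_pos = rng.normal(loc=+1.0, size=(80, 4))
X_neg = rng.normal(loc=-1.0, size=(80, 4))

def moments(X):
    mu = X.mean(axis=0)
    S = np.linalg.cholesky(np.cov(X, rowvar=False) + 1e-6 * np.eye(X.shape[1]))
    return mu, S

mu_p, S_p = moments(X_pos)
mu_n, S_n = moments(X_neg)
kappa = 1.0                      # controls the worst-case confidence level

w = cp.Variable(4)
b = cp.Variable()
constraints = [
    mu_p @ w + b >= 1 + kappa * cp.norm(S_p.T @ w, 2),
    -(mu_n @ w + b) >= 1 + kappa * cp.norm(S_n.T @ w, 2),
]
prob = cp.Problem(cp.Minimize(cp.norm(w, 2)), constraints)
prob.solve()
print("w =", w.value, "b =", b.value)
```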


Pattern Recognition | 2015

A multi-class SVM approach based on the l1-norm minimization of the distances between the reduced convex hulls

Miguel Carrasco; Julio López; Sebastián Maldonado

Multi-class classification is an important pattern recognition task that can be addressed accurately and efficiently by Support Vector Machines (SVMs). In this work we present a novel SVM-based multi-class classification approach based on the center of the configuration, a point which is equidistant from all classes. The center of the configuration is obtained from the dual formulation by minimizing the distances between the reduced convex hulls using the l1-norm, and the decision functions are subsequently constructed from this point. This work also extends the ideas of Zhou et al. (2002) [37] to multi-class classification. The use of the l1-norm yields a single linear programming formulation, which reduces complexity and confers scalability compared with other multi-class SVM methods based on quadratic programming formulations. Experiments on benchmark datasets demonstrate the virtues of our approach in terms of classification performance and running times compared with various other multi-class SVM methods.

Highlights: a novel linear programming approach for multi-class SVM; a geometrically grounded method based on the concept of reduced convex hulls; good classification performance achieved with short running times.
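
The l1-norm distance between reduced convex hulls can be cast as a linear program. The cvxpy sketch below illustrates only the binary building block on synthetic data; the paper's multi-class construction around the center of the configuration is not reproduced, and the reduction factor mu is an arbitrary choice.

```python
# Minimize the l1-distance between the *reduced* convex hulls of two classes;
# cvxpy reformulates the l1 objective into a linear program.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(2)
A = rng.normal(loc=+1.0, size=(60, 3))   # class 1 points (rows)
B = rng.normal(loc=-1.0, size=(60, 3))   # class 2 points (rows)
mu = 0.2                                 # reduction factor: caps each coefficient

u = cp.Variable(A.shape[0], nonneg=True)
v = cp.Variable(B.shape[0], nonneg=True)
constraints = [cp.sum(u) == 1, cp.sum(v) == 1, u <= mu, v <= mu]

# Difference between one point in each reduced hull, measured in the l1-norm.
objective = cp.Minimize(cp.norm(A.T @ u - B.T @ v, 1))
prob = cp.Problem(objective, constraints)
prob.solve()
print("l1-distance between reduced hulls:", prob.value)
```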


Information Sciences | 2016

Multi-class second-order cone programming support vector machines

Julio López; Sebastián Maldonado

Highlights: a novel multiclass approach that simultaneously constructs all required hyperplanes; extensions of OvO and OvA multiclass SVM to second-order cone programming SVM; the best classification performance in experiments on benchmark datasets.

This paper presents novel second-order cone programming (SOCP) formulations that determine a linear multi-class predictor using support vector machines (SVMs). We first extend the ideas of OvO (One-versus-One) and OvA (One-versus-All) SVM formulations to SOCP-SVM, providing two interesting alternatives to the standard SVM formulations. Additionally, we propose a novel approach (MC-SOCP) that simultaneously constructs all required hyperplanes for multi-class classification, based on the multi-class SVM formulation (MC-SVM). The use of conic constraints for each pair of training patterns in a single optimization problem provides an adequate framework for balanced and effective prediction.


Applied Intelligence | 2016

A second-order cone programming formulation for twin support vector machines

Sebastián Maldonado; Julio López; Miguel Carrasco

Second-order cone programming (SOCP) formulations have received increasing attention as robust optimization schemes for Support Vector Machine (SVM) classification. These formulations study the worst-case setting for class-conditional densities, leading to potentially more effective classifiers in terms of performance compared to the standard SVM formulation. In this work we propose an SOCP extension for Twin SVM, a recently developed classification approach that constructs two nonparallel classifiers. The linear and kernel-based SOCP formulations for Twin SVM are derived, while the duality analysis provides interesting geometrical properties of the proposed method. Experiments on benchmark datasets demonstrate the virtues of our approach in terms of classification performance compared to alternative SVM methods.
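
As background for the SOCP extension, the sketch below solves the standard linear Twin SVM, i.e. two small convex programs that each fit one nonparallel hyperplane, using cvxpy on synthetic data. It does not implement the SOCP formulation of the paper; the data, penalty parameters, and test point are placeholders.

```python
# Standard linear Twin SVM sketch: each hyperplane is close to its own class
# and at least unit distance from the other class (up to slacks).
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(3)
A = rng.normal(loc=+1.0, size=(50, 2))   # class +1 points
B = rng.normal(loc=-1.0, size=(50, 2))   # class -1 points
c1 = c2 = 1.0

def twin_plane(own, other, c):
    """Hyperplane close to `own` and pushed away from `other`."""
    w = cp.Variable(own.shape[1])
    b = cp.Variable()
    xi = cp.Variable(other.shape[0], nonneg=True)
    obj = 0.5 * cp.sum_squares(own @ w + b) + c * cp.sum(xi)
    cons = [-(other @ w + b) + xi >= 1]
    cp.Problem(cp.Minimize(obj), cons).solve()
    return w.value, b.value

w1, b1 = twin_plane(A, B, c1)   # plane for class +1
w2, b2 = twin_plane(B, A, c2)   # plane for class -1

# A new point is assigned to the class whose hyperplane is closer.
x = np.array([0.8, 1.1])
d1 = abs(x @ w1 + b1) / np.linalg.norm(w1)
d2 = abs(x @ w2 + b2) / np.linalg.norm(w2)
print("predicted class:", +1 if d1 < d2 else -1)
```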


Applied Intelligence | 2016

A novel multi-class SVM model using second-order cone constraints

Julio López; Sebastián Maldonado; Miguel Carrasco

In this work we present a novel maximum-margin approach for multi-class Support Vector Machines based on second-order cone programming. The proposed method consists of a single optimization model to construct all classification functions, in which the number of second-order cone constraints corresponds to the number of classes. This is a key difference from traditional SVM, where the number of constraints is usually related to the number of training instances. This formulation is extended further to kernel-based classification, while the duality theory provides an interesting geometric interpretation: the method finds an equidistant point between a set of ellipsoids. Experiments on benchmark datasets demonstrate the virtues of our method in terms of predictive performance compared with various other multicategory SVM approaches.


Intelligent Data Analysis | 2015

Robust feature selection for multiclass Support Vector Machines using second-order cone programming

Julio López; Sebastián Maldonado

This work addresses the issue of high dimensionality for linear multiclass Support Vector Machines (SVMs) using second-order cone programming (SOCP) formulations. These formulations provide a robust and efficient framework for classification, while an adequate feature selection process may improve predictive performance. We extend the ideas of SOCP-SVM from binary to multiclass classification, and we propose a sequential backward elimination algorithm for variable selection, defining a contribution measure to determine feature relevance. Experimental results with multiclass microarray datasets demonstrate the effectiveness of a low-dimensional data representation in terms of performance.
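
A sequential backward elimination loop can be sketched with an ordinary linear multiclass SVM, using the aggregate weight magnitude across the per-class hyperplanes as a stand-in for the paper's contribution measure. Everything below (data, target dimension, relevance score) is assumed for illustration; the SOCP-SVM itself is not reproduced.

```python
# Backward elimination sketch: repeatedly drop the least relevant feature
# until a target dimensionality is reached, then cross-validate the result.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
X = rng.normal(size=(200, 20))
y = rng.integers(0, 3, size=200)         # 3-class toy problem

active = list(range(X.shape[1]))
target_dim = 5

while len(active) > target_dim:
    clf = LinearSVC(dual=False, max_iter=5000).fit(X[:, active], y)
    # Aggregate |w_j| over the per-class hyperplanes as a relevance score.
    relevance = np.abs(clf.coef_).sum(axis=0)
    active.pop(int(np.argmin(relevance)))  # drop the least relevant feature

score = cross_val_score(LinearSVC(dual=False, max_iter=5000),
                        X[:, active], y, cv=5).mean()
print("kept features:", active, "cv accuracy:", round(score, 3))
```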


Knowledge-Based Systems | 2017

Synchronized feature selection for Support Vector Machines with twin hyperplanes

Sebastián Maldonado; Julio López

In this work, a novel feature selection method for the twin Support Vector Machine (SVM) is presented. The main idea is to combine two regularizers, namely the Euclidean and infinity norms, to perform twin classification and variable selection simultaneously. The latter task is performed in a coordinated fashion, ensuring that the same attributes are selected by both twin classifiers. A single optimization problem is used to solve both subproblems, leading to a sparse final classification rule. Experiments on low- and high-dimensional datasets indicate that our approaches achieve the best average performance compared with well-known feature selection strategies, while also achieving synchronized feature elimination in the two twin classifiers. Our approaches are also able to improve the performance of the twin classifier, demonstrating the importance of feature selection in high-dimensional tasks.
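
One way to read the coordinated-selection idea is a per-feature penalty that couples the two twin weight vectors, so a feature is only cheap to keep if both hyperplanes use it. The cvxpy sketch below combines Euclidean regularization with such a per-feature infinity-norm penalty on synthetic data; it is an illustrative reading, not claimed to be the paper's exact formulation.

```python
# Twin-style classifiers with a shared-sparsity penalty: the per-feature term
# max(|w1_j|, |w2_j|) is only small if feature j is (near) zero in both planes.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(6)
A = rng.normal(loc=+1.0, size=(40, 8))      # class +1 points
B = rng.normal(loc=-1.0, size=(40, 8))      # class -1 points
C, lam = 1.0, 0.5                           # illustrative penalty weights

W = cp.Variable((2, 8))                     # row 0: plane 1, row 1: plane 2
b = cp.Variable(2)
xi1 = cp.Variable(B.shape[0], nonneg=True)  # slacks of plane 1 w.r.t. class -1
xi2 = cp.Variable(A.shape[0], nonneg=True)  # slacks of plane 2 w.r.t. class +1

shared = cp.sum(cp.max(cp.abs(W), axis=0))  # per-feature infinity-norm coupling

obj = (0.5 * (cp.sum_squares(A @ W[0] + b[0]) + cp.sum_squares(B @ W[1] + b[1]))
       + C * (cp.sum(xi1) + cp.sum(xi2))
       + lam * shared)
cons = [-(B @ W[0] + b[0]) + xi1 >= 1,      # plane 1 pushed away from class -1
        (A @ W[1] + b[1]) + xi2 >= 1]       # plane 2 pushed away from class +1
cp.Problem(cp.Minimize(obj), cons).solve()

kept = np.where(np.abs(W.value).max(axis=0) > 1e-4)[0]
print("features retained by both hyperplanes:", kept)
```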


Applied Intelligence | 2017

Robust kernel-based multiclass support vector machines via second-order cone programming

Sebastián Maldonado; Julio López

Kernel methods are very important in pattern analysis due to their ability to capture nonlinear relationships in datasets. The best known kernel-based technique is Support Vector Machine (SVM), which can be used for several pattern recognition tasks, including multiclass classification. In this paper, we focus on maximum margin classifiers for nonlinear multiclass learning, based on second-order cone programming (SOCP), proposing three novel formulations that extend the most common strategies for this task: One-vs.-The-Rest, One-vs.-One, and All-Together optimization. The proposed SOCP formulations achieved superior performance compared to their traditional SVM counterparts on benchmark datasets, demonstrating the virtues of robust optimization.


Decision Support Systems | 2017

Integrated framework for profit-based feature selection and SVM classification in credit scoring

Sebastián Maldonado; Cristián Bravo; Julio López; Juan Pérez

In this paper, we propose a profit-driven approach for classifier construction and simultaneous variable selection based on linear Support Vector Machines. The main goal is to incorporate business-related information, such as variable acquisition costs, Type I and Type II error costs, and the profit generated by correctly classified instances, into the modeling process. Our proposal incorporates a group penalty function in the SVM formulation in order to simultaneously penalize variables that belong to the same group, since companies often acquire groups of related variables for a given cost rather than acquiring them individually. The proposed framework was studied on a credit scoring problem for a Chilean bank and led to superior performance with respect to business-related goals.
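
For illustration only, the sketch below scores a plain linear SVM by a profit measure that combines Type I and Type II error costs with per-group variable acquisition costs. All monetary figures, group names, and labels are made-up placeholders, and the group-penalized SVM formulation of the paper is not reproduced.

```python
# Profit-style evaluation of a classifier under asymmetric error costs and
# per-group data acquisition costs (all numbers are hypothetical).
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)
X = rng.normal(size=(500, 6))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500) > 0).astype(int)

# Hypothetical business parameters (y = 1 means "good client, accept").
profit_tp = 100.0        # profit from a correctly accepted good client
cost_fp = 500.0          # Type I cost: accepting a bad client
cost_fn = 100.0          # Type II cost: rejecting a good client
group_costs = {"bureau": 50.0, "behavioural": 80.0}   # per-applicant data cost
acquired_groups = ["bureau"]                          # variable groups actually bought

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
pred = LinearSVC(dual=False).fit(X_tr, y_tr).predict(X_te)

tp = np.sum((pred == 1) & (y_te == 1))
fp = np.sum((pred == 1) & (y_te == 0))
fn = np.sum((pred == 0) & (y_te == 1))
acquisition = len(y_te) * sum(group_costs[g] for g in acquired_groups)

profit = profit_tp * tp - cost_fp * fp - cost_fn * fn - acquisition
print("total profit on the test set:", profit)
```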

Collaboration


Dive into Julio López's collaborations.

Top Co-Authors

Paul Bosch
Diego Portales University

Hugo Robotham
Diego Portales University

Erik Alex Papa Quiroz
National University of San Marcos