Network


Latest external collaborations at the country level. Dive into the details by clicking on the dots.

Hotspot


Dive into the research topics where Diego Hernán Peluffo-Ordóñez is active.

Publication


Featured research published by Diego Hernán Peluffo-Ordóñez.


Computer Methods and Programs in Biomedicine | 2012

Unsupervised feature relevance analysis applied to improve ECG heartbeat clustering

José Luis Rodríguez-Sotelo; Diego Hernán Peluffo-Ordóñez; David Cuesta-Frau; Germán Castellanos-Domínguez

The computer-assisted analysis of biomedical records has become an essential tool in clinical settings. However, current devices provide a growing amount of data that often exceeds the processing capacity of ordinary computers, and as this amount of information rises, new demands for more efficient data extraction methods appear. This paper addresses the task of data mining in physiological records using a feature selection scheme. An unsupervised method based on relevance analysis is described. This scheme uses a least-squares optimization of the input feature matrix in a single iteration, and its output is a feature weighting vector. The performance of the method was assessed using a heartbeat clustering test on real ECG records. The quantitative cluster validity measures yielded a correctly classified heartbeat rate of 98.69% (specificity), 85.88% (sensitivity) and 95.04% (overall clustering performance), which is higher than the performance achieved by other similar ECG clustering studies. The number of features was reduced on average from 100 to 18, and the computational cost was 43% lower than in previous ECG clustering schemes.
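The abstract does not spell out the least-squares relevance formulation, so as a rough, hypothetical illustration of what an unsupervised feature weighting vector looks like, the sketch below ranks features by a simple variance-based relevance proxy (not the paper's exact scheme; the data and dominant feature are synthetic):

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical stand-in for a heartbeat feature matrix (rows: beats, cols: features)
X = rng.normal(size=(200, 10))
X[:, 0] *= 5.0   # make one feature clearly dominant

# Variance-based relevance: a simple unsupervised proxy, NOT the paper's method
Xc = X - X.mean(axis=0)
weights = Xc.var(axis=0)
weights /= weights.sum()          # normalized feature weighting vector

top = np.argsort(weights)[::-1]   # indices sorted by decreasing relevance
print(top[0])                     # the inflated feature ranks first -> 0
```

A dimensionality reduction step would then keep only the highest-weighted features, mirroring the 100-to-18 reduction reported above.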


Computational Intelligence and Data Mining | 2014

Generalized kernel framework for unsupervised spectral methods of dimensionality reduction

Diego Hernán Peluffo-Ordóñez; John Aldo Lee; Michel Verleysen

This work introduces a generalized kernel perspective for spectral dimensionality reduction approaches. Firstly, an elegant matrix view of kernel principal component analysis (PCA) is described, and we show the relationship between kernel PCA and conventional PCA using a parametric distance. Secondly, we introduce a weighted kernel PCA framework derived from least-squares support vector machines (LS-SVM). This approach starts with a latent variable that allows writing a relaxed LS-SVM problem, which is then addressed by a primal-dual formulation. As a result, we provide kernel alternatives to spectral methods for dimensionality reduction such as multidimensional scaling, locally linear embedding, and Laplacian eigenmaps, as well as a versatile framework to explain weighted PCA approaches. Experimentally, we show that incorporating an SVM model improves the performance of kernel PCA.
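For readers unfamiliar with the baseline, a minimal sketch of plain kernel PCA (the starting point of the framework above) is shown below; the RBF kernel, its bandwidth, and the synthetic data are illustrative choices, not the paper's setup:

```python
import numpy as np

def rbf_kernel(X, gamma=0.5):
    # Pairwise squared distances, then a Gaussian (RBF) kernel
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-gamma * d2)

def kernel_pca(K, n_components=2):
    n = K.shape[0]
    # Double-center the kernel matrix (kernel analogue of mean removal in PCA)
    H = np.eye(n) - np.ones((n, n)) / n
    Kc = H @ K @ H
    vals, vecs = np.linalg.eigh(Kc)            # eigenvalues in ascending order
    idx = np.argsort(vals)[::-1][:n_components]
    # Embedded coordinates: eigenvectors scaled by sqrt of their eigenvalues
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0))

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 3))
Y = kernel_pca(rbf_kernel(X))
print(Y.shape)  # (50, 2)
```

The weighted variant discussed in the paper would modify this eigenproblem with sample weights via the LS-SVM primal-dual formulation.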


Advances in Self-Organizing Maps and Learning Vector Quantization | 2014

Short review of dimensionality reduction methods based on stochastic neighbour embedding

Diego Hernán Peluffo-Ordóñez; John Aldo Lee; Michel Verleysen

Dimensionality reduction methods aimed at preserving the data topology have proven suitable for reaching high-quality embedded data, in particular those based on divergences, such as stochastic neighbour embedding (SNE). The big advantage of SNE and its variants is that neighbour preservation is achieved by matching similarities computed in both the high- and low-dimensional spaces. This work presents a brief review of SNE-based methods, together with a comparative analysis covering important aspects such as algorithm implementation, relationships between the methods, and performance. The aim of this paper is to investigate recent alternatives to SNE and to provide substantial results and discussion to compare them.
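The similarity-matching idea behind SNE-type methods can be sketched as follows: build pairwise Gaussian similarities in both spaces and minimize the divergence between them. This is a simplified illustration (a single global bandwidth instead of per-point perplexity calibration, and no gradient descent):

```python
import numpy as np

def gaussian_similarities(X, sigma=1.0):
    # Gaussian similarities normalized over all pairs (SNE-style P matrix)
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    P = np.exp(-d2 / (2 * sigma**2))
    np.fill_diagonal(P, 0.0)      # a point is not its own neighbour
    return P / P.sum()

def kl_divergence(P, Q, eps=1e-12):
    # The cost SNE-family methods minimize between high- and low-dim similarities
    return np.sum(P * np.log((P + eps) / (Q + eps)))

rng = np.random.default_rng(2)
X = rng.normal(size=(30, 5))      # high-dimensional data
Y = X[:, :2]                      # a crude 2-D "embedding", for illustration only
P, Q = gaussian_similarities(X), gaussian_similarities(Y)
print(kl_divergence(P, P) < kl_divergence(P, Q))  # a perfect match has zero cost
```

An actual SNE optimizer would iteratively move the low-dimensional points to reduce this divergence.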


Ibero-American Conference on Artificial Intelligence | 2012

Image Segmentation Based on Multi-Kernel Learning and Feature Relevance Analysis

Santiago Molina-Giraldo; Andrés Marino Álvarez-Meza; Diego Hernán Peluffo-Ordóñez; Germán Castellanos-Domínguez

In this paper, an automatic image segmentation methodology based on multiple kernel learning (MKL) is proposed. We compute several image features for each input pixel and combine them by means of an MKL framework, fixing the MKL weights automatically through a relevance analysis over the original input feature space. Moreover, an unsupervised image segmentation measure is used to set the kernel's free parameter. A kernel k-means algorithm is used as the spectral clustering method to segment a given image. Experiments are carried out to test the efficiency of incorporating weighted feature information into the clustering procedure and to compare the performance against state-of-the-art algorithms using a supervised image segmentation measure. The attained results show that our approach computes meaningful segmentations, demonstrating its capability to support further computer vision applications.


Distributed Computing and Artificial Intelligence | 2015

Bridging the gap between human knowledge and machine learning

Juan C. Alvarado-Pérez; Diego Hernán Peluffo-Ordóñez; Roberto Therón

Nowadays, a great amount of data is being created by many sources across academic, scientific, business, and industrial activities. Such data intrinsically contains meaningful information, and exploring it has scientific value. In this connection, the aim of artificial intelligence (AI) is to extract new knowledge in order to make proper decisions. AI has taken an important place in scientific and technological development communities and now powers computer-based processing in modern machines. Under the premise that the feedback provided by human reasoning (which is holistic, flexible and parallel) may enhance data analysis, the need to integrate natural and artificial intelligence has emerged. Such an integration makes the knowledge-discovery process more effective, providing the ability to easily find hidden trends and patterns belonging to the database's predictive model, as well as allowing new observations and considerations from previously known data by combining data analysis methods with the knowledge and skills of human reasoning. In this work, we review the basics and recent work on the integration of artificial and natural intelligence in order to introduce users and researchers to this emergent field, and we provide key aspects for conceptually comparing the two.


International Symposium on Neural Networks | 2013

Kernel spectral clustering for dynamic data using multiple kernel learning

Diego Hernán Peluffo-Ordóñez; Sergio García-Vega; Rocco Langone; Johan A. K. Suykens; Germán Castellanos-Domínguez

In this paper, we propose a kernel spectral clustering-based technique to capture the different regimes experienced by a time-varying system. Our method is based on a multiple kernel learning approach in which the kernel is a linear combination of base kernels. The linear combination coefficients are calculated by determining a ranking vector that quantifies the overall dynamical behavior of the analyzed data sequence over time. This vector can be calculated from the eigenvectors provided by the solution of the kernel spectral clustering problem. We apply the proposed technique to a trial from the Graphics Lab Motion Capture Database from Carnegie Mellon University, as well as to a synthetic example, namely three moving Gaussian clouds. For comparison purposes, some conventional spectral clustering techniques are also considered, namely kernel k-means and min-cuts, as well as standard k-means. The normalized mutual information and adjusted Rand index metrics are used to quantify the clustering performance. Results show the usefulness of the proposed technique for tracking dynamic data; it is even able to detect hidden objects.
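The core ingredient, a combined kernel built as a linear combination of base kernels followed by a spectral step, can be sketched on two static Gaussian clouds. The coefficients here are fixed by hand for illustration; the paper derives them from a ranking vector, and the data are a toy stand-in for the moving-clouds example:

```python
import numpy as np

def rbf(X, gamma):
    sq = np.sum(X**2, axis=1)
    return np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))

rng = np.random.default_rng(3)
# Two well-separated Gaussian clouds of 20 points each
X = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(4, 0.3, (20, 2))])

# Multiple kernel learning ingredient: a convex combination of base kernels.
coeffs = np.array([0.7, 0.3])            # hand-picked here, learned in the paper
K = coeffs[0] * rbf(X, 0.5) + coeffs[1] * rbf(X, 2.0)

# Spectral step: the two leading eigenvectors of the (near block-diagonal)
# combined kernel localize on one cluster each; assign by dominant magnitude.
vals, vecs = np.linalg.eigh(K)
labels = (np.abs(vecs[:, -1]) < np.abs(vecs[:, -2])).astype(int)
print(len(set(labels[:20])) == 1 and len(set(labels[20:])) == 1
      and labels[0] != labels[20])       # each cloud gets its own label -> True
```

In the dynamic setting of the paper, the coefficients vary with the analyzed data sequence, letting the combined kernel track regime changes over time.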


2015 20th Symposium on Signal Processing, Images and Computer Vision (STSIVA) | 2015

Interactive interface for efficient data visualization via a geometric approach

Jose Alejandro Salazar-Castro; Y. C. Rosas-Narváez; A. D. Pantoja; Juan C. Alvarado-Pérez; Diego Hernán Peluffo-Ordóñez

Dimensionality reduction (DR) methods represent a suitable alternative for visualizing data. Nonetheless, most of them still lack interactivity and controllability. In this work, we propose an interactive data visualization interface based on a geometric mathematical model that combines DR methods through a weighted sum. Interactivity is provided in the sense that the weighting factors are given by the user via the selection of points inside a geometric surface, so that (even non-expert) users can intuitively either select a concrete DR method or carry out a mixture of methods. Experimental results are obtained using artificial and real datasets, demonstrating the usability and applicability of our interface in DR-based data visualization.
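The weighted-sum mixture at the heart of this interface can be sketched in a few lines: each DR method produces a 2-D embedding, and the user's chosen weights blend them. In this illustration the two "methods" are PCA and a random linear projection, and the weights are arbitrary stand-ins for a user's surface selection:

```python
import numpy as np

rng = np.random.default_rng(4)
X = rng.normal(size=(100, 5))
Xc = X - X.mean(axis=0)

# Two plain DR outputs to mix: PCA (top-2 principal directions) and a random
# linear projection standing in for a second DR method.
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
Y_pca = Xc @ Vt[:2].T
Y_rand = Xc @ rng.normal(size=(5, 2))

# The interface idea: the point the user picks on the surface yields weighting
# factors, and the displayed view is the weighted sum of the embeddings.
w = np.array([0.8, 0.2])        # hypothetical weights picked by the user
Y = w[0] * Y_pca + w[1] * Y_rand
print(Y.shape)  # (100, 2)
```

Sliding the weights continuously between (1, 0) and (0, 1) morphs the view from one method's embedding to the other's, which is what makes the interaction intuitive for non-experts.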


Intelligent Data Engineering and Automated Learning | 2017

Interactive Data Visualization Using Dimensionality Reduction and Dissimilarity-Based Representations

Diego F. Peña-Unigarro; Paul Rosero-Montalvo; Edgardo Javier Revelo-Fuelagán; J. A. Castro-Silva; Juan C. Alvarado-Pérez; Roberto Therón; C. M. Ortega-Bustamante; Diego Hernán Peluffo-Ordóñez

This work describes a new model for interactive data visualization based on a dimensionality reduction (DR) approach. In particular, the resulting spaces of several DR methods are mixed by a weighted sum. For the sake of user interaction, the corresponding weighting factors are given via an intuitive color-based interface. Also, to depict the DR outcomes while showing information about the input high-dimensional data space, the low-dimensional representations reached by the mixture are conveyed using scatter plots enhanced with an interactive data-driven visualization. In this connection, a constrained dissimilarity approach defines the graph to be drawn on the scatter plot.


2016 XXI Symposium on Signal Processing, Images and Artificial Vision (STSIVA) | 2016

Interactive visualization methodology of high-dimensional data with a color-based model for dimensionality reduction

Diego F. Peña-Unigarro; Jose Alejandro Salazar-Castro; Diego Hernán Peluffo-Ordóñez; Paul Rosero-Montalvo; Omar R. Ona-Rocha; Andres A. Isaza; Juan C. Alvarado-Pérez; Roberto Therón

Nowadays, a consequence of data overload is that the world's technological capacity to collect, communicate, and store large volumes of data is increasing faster than human analysis skills. This issue has motivated the development of graphical ways to visually represent and analyze high-dimensional data. In particular, in this work we propose a graphical interface that allows the combination of dimensionality reduction (DR) methods using a chromatic model to make data visualization more intelligible for humans. The interface is designed for easy, interactive use, so that input parameters are given by the user via the selection of RGB values inside a given surface. The proposed interface enables (even non-expert) users to intuitively either select a concrete DR method or carry out a mixture of methods. Experimental results prove the usability of our interface, making the selection or configuration of a DR-based visualization an intuitive and interactive task for the user.


Iberoamerican Congress on Pattern Recognition | 2015

Multiple Kernel Learning for Spectral Dimensionality Reduction

Diego Hernán Peluffo-Ordóñez; Andrés Eduardo Castro-Ospina; Juan C. Alvarado-Pérez; Edgardo Javier Revelo-Fuelagán

This work introduces a multiple kernel learning (MKL) approach for selecting and combining different spectral methods of dimensionality reduction (DR). From a predefined set of kernels representing conventional spectral DR methods, a generalized kernel is calculated by means of a linear combination of kernel matrices. The coefficients are estimated via a variable ranking aimed at quantifying how much each variable contributes to optimizing a variance preservation criterion. All considered kernels are tested within a kernel PCA framework. The experiments are carried out over well-known real and artificial data sets. The performance of the compared DR approaches is quantified by a scaled version of the average agreement rate between K-ary neighborhoods. The proposed MKL approach exploits the representation ability of each single method to reach a better embedding, both for more intelligible visualization and for preserving the structure of the data.

Collaboration


Dive into Diego Hernán Peluffo-Ordóñez's collaborations.

Top Co-Authors


David Cuesta-Frau

Polytechnic University of Valencia
