Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Federico Lecumberry is active.

Publication


Featured research published by Federico Lecumberry.


IEEE Transactions on Image Processing | 2010

Simultaneous Object Classification and Segmentation With High-Order Multiple Shape Models

Federico Lecumberry; Alvaro Pardo; Guillermo Sapiro

Shape models (SMs), capturing the common features of a set of training shapes, represent a new incoming object based on its projection onto the corresponding model. Given a set of learned SMs representing different object classes, and an image with a new shape, this work introduces a joint classification-segmentation framework with a twofold goal. First, to automatically select the SM that best represents the object, and second, to accurately segment the image taking into account both the image information and the features and variations learned from the online selected model. A new energy functional is introduced that simultaneously accomplishes both goals. Model selection is performed based on a shape similarity measure, online determining which model to use at each iteration of the steepest descent minimization, allowing for model switching and adaptation to the data. High-order SMs are used in order to deal with very similar object classes and natural variability within them. Position and transformation invariance is included as part of the modeling as well. The presentation of the framework is complemented with examples for the difficult task of simultaneously classifying and segmenting closely related shapes, such as stages of human activities, in images with severe occlusions.
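
The model-selection step can be caricatured with a linear shape model per class: project the incoming shape onto each learned subspace and keep the model with the smallest projection error. This is only a toy sketch (the paper uses high-order models inside an energy minimization); the function names and synthetic data below are hypothetical.

```python
import numpy as np

def fit_shape_model(shapes, n_components=2):
    """Fit a linear (PCA) shape model: mean shape plus principal modes.
    `shapes` is an (n_samples, n_points) array of aligned training shapes."""
    mean = shapes.mean(axis=0)
    centered = shapes - mean
    # Principal modes of variation via SVD of the centered training set.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean, vt[:n_components]

def projection_error(shape, model):
    """Distance between a shape and its projection onto the model subspace."""
    mean, modes = model
    coeffs = modes @ (shape - mean)
    reconstruction = mean + modes.T @ coeffs
    return float(np.linalg.norm(shape - reconstruction))

def select_model(shape, models):
    """Pick the shape model that best represents the incoming shape."""
    errors = [projection_error(shape, m) for m in models]
    return int(np.argmin(errors))

# Toy demo: two synthetic shape classes (an upward and a downward ramp).
rng = np.random.default_rng(0)
ramp_up, ramp_down = np.linspace(0, 1, 8), np.linspace(1, 0, 8)
models = [fit_shape_model(ramp_up + rng.normal(0, 0.05, (20, 8))),
          fit_shape_model(ramp_down + rng.normal(0, 0.05, (20, 8)))]
best = select_model(ramp_up, models)
```

In the paper this selection happens online at each descent iteration, so the segmentation can switch models as the evolving shape changes.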


IEEE International Workshop on Computational Advances in Multi-Sensor Adaptive Processing | 2009

Universal priors for sparse modeling

Ignacio Ramirez; Federico Lecumberry; Guillermo Sapiro

Sparse data models, where data is assumed to be well represented as a linear combination of a few elements from a dictionary, have gained considerable attention in recent years, and their use has led to state-of-the-art results in many signal and image processing tasks. It is now well understood that the choice of the sparsity regularization term is critical in the success of such models. In this work, we use tools from information theory to propose a sparsity regularization term which has several theoretical and practical advantages over the more standard ℓ0 or ℓ1 ones, and which leads to improved coding performance and accuracy in reconstruction tasks. We also briefly report on further improvements obtained by imposing low mutual coherence and Gram matrix norm on the learned dictionaries.
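
To see why the choice of regularizer matters, the sketch below contrasts the ℓ0 and ℓ1 penalties with a log-sum penalty of the same general flavor as the universal priors discussed (an illustrative stand-in, not the paper's exact prior), plus the soft-thresholding operator used in ℓ1 sparse coding:

```python
import numpy as np

def l0_penalty(x, tol=1e-12):
    """Count of nonzero coefficients (the l0 "norm")."""
    return int(np.sum(np.abs(x) > tol))

def l1_penalty(x):
    """Sum of absolute values; the standard convex sparsity surrogate."""
    return float(np.sum(np.abs(x)))

def log_sum_penalty(x, theta=0.1):
    """log(1 + |x|/theta): a nonconvex penalty that, in the spirit of
    universal priors, penalizes large coefficients less than l1 does."""
    return float(np.sum(np.log1p(np.abs(x) / theta)))

def soft_threshold(x, lam):
    """Proximal operator of lam * ||.||_1, the workhorse of ISTA-style
    sparse coding under the l1 regularizer."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)
```

Note how `log_sum_penalty` grows sublinearly: doubling a large coefficient adds much less penalty than under ℓ1, so strong coefficients are shrunk less.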


Pattern Recognition Letters | 2013

Novel classifier scheme for imbalanced problems

Matías Di Martino; Alicia Fernández; Pablo A. Iturralde; Federico Lecumberry

There is increasing interest in the design of classifiers for imbalanced problems due to their relevance in many fields, such as fraud detection and medical diagnosis. In this work we present a new classifier developed specifically for imbalanced problems, where maximum F-measure, instead of maximum accuracy, guides the classifier design. Theoretical basis, algorithm description and real experiments are presented. The proposed algorithm shows suitability and very good performance in imbalanced scenarios with high overlap between classes.
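
Why F-measure rather than accuracy? On imbalanced data a trivial classifier can reach high accuracy while detecting nothing; the F-measure, which ignores true negatives, exposes this. A minimal illustration (the confusion counts are made up):

```python
def f_measure(tp, fp, fn, beta=1.0):
    """F-measure from confusion counts: the (weighted) harmonic mean of
    precision and recall, insensitive to the number of true negatives."""
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    if precision + recall == 0.0:
        return 0.0
    b2 = beta * beta
    return (1 + b2) * precision * recall / (b2 * precision + recall)

# 1000 samples, 10 of them positive: predicting "all negative" is 99%
# accurate yet useless; the F-measure exposes this.
accuracy_trivial = 990 / 1000
f_trivial = f_measure(tp=0, fp=0, fn=10)
f_detector = f_measure(tp=8, fp=20, fn=2)  # imperfect but useful detector
```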


British Machine Vision Conference | 2006

Constrained Anisotropic Diffusion and some Applications.

Gabriele Facciolo; Federico Lecumberry; Andrés Almansa; Alvaro Pardo; Vicent Caselles; Bernard Rougé

Minimal surface regularization has been used in several applications ranging from stereo to image segmentation, sometimes hidden as a graph-cut discrete formulation, or as a strictly convex approximation to TV minimization. In this paper we consider a modified version of minimal surface regularization coupled with a robust data fitting term for interpolation purposes, where the corresponding evolution equation is constrained to diffuse only along the isophotes of a given image u, and we design a convergent numerical scheme to accomplish this. To illustrate the usefulness of our approach, we apply this framework to digital elevation model interpolation and to constrained vector probability diffusion.
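
The constrained-diffusion idea can be sketched as one explicit step of heat flow restricted to the level lines of a guide image u: the update uses only the second derivative of the evolving field along the isophote tangent. This toy step ignores the robust data term and the convergence analysis of the paper.

```python
import numpy as np

def isophote_diffusion_step(v, u, dt=0.1, eps=1e-8):
    """One explicit step that diffuses v only along the isophotes (level
    lines) of a guide image u. A toy sketch of constrained diffusion."""
    uy, ux = np.gradient(u)                 # gradient of the guide image
    norm = np.sqrt(ux**2 + uy**2) + eps
    tx, ty = -uy / norm, ux / norm          # unit tangent to the isophotes
    vy, vx = np.gradient(v)
    vxy, vxx = np.gradient(vx)
    vyy, _ = np.gradient(vy)
    # Second directional derivative of v along the tangent: heat flow
    # restricted to the level lines of u.
    v_tt = tx**2 * vxx + 2.0 * tx * ty * vxy + ty**2 * vyy
    return v + dt * v_tt

# Demo: a guide with horizontal isophotes; a field already constant along
# them is a steady state of the flow.
u = np.tile(np.arange(8.0)[:, None], (1, 8))
v_next = isophote_diffusion_step(u.copy(), u)
```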


BMC Bioinformatics | 2015

Beef quality parameters estimation using ultrasound and color images

José Pedro L Nunes; Martín Piquerez; Leonardo Pujadas; Eileen Armstrong; Alicia Fernández; Federico Lecumberry

Background: Beef quality measurement is a complex task with high economic impact. There is high interest in obtaining automatic quality parameter estimation in live cattle or post mortem. In this paper we set out to obtain beef quality estimates from the analysis of ultrasound (in vivo) and color images (post mortem), measuring various parameters related to tenderness and amount of meat: rib eye area, percentage of intramuscular fat, and backfat (subcutaneous fat) thickness.

Proposal: An algorithm based on curve evolution is implemented to calculate the rib eye area. The backfat thickness is estimated from the profile of distances between two previously detected curves that delimit the steak and the rib eye. A model based on Support Vector Regression (SVR) is trained to estimate the intramuscular fat percentage, using a series of features extracted from a region of interest previously detected in both ultrasound and color images. In all cases, a complete evaluation was performed on different databases including: color and ultrasound images acquired by a beef industry expert, intramuscular fat estimates obtained by an expert using commercial software, and chemical analysis.

Conclusions: The proposed algorithms give good results for calculating the rib eye area and the backfat thickness measure and profile. They are also promising for predicting the percentage of intramuscular fat.
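
Two of the geometric measurements are easy to sketch once the segmentation is given: the rib eye area is a pixel count scaled by the pixel size, and the backfat profile is a point-wise distance between two detected curves. The helper names, pixel size and toy data below are hypothetical, not the paper's values.

```python
import numpy as np

def region_area_cm2(mask, pixel_size_mm=1.0):
    """Area of a binary segmentation mask (e.g. the rib eye detected by
    curve evolution), converted from pixels to cm^2."""
    return float(mask.sum()) * pixel_size_mm**2 / 100.0

def backfat_profile(outer_curve, inner_curve):
    """Point-wise distance profile between two detected curves (e.g. the
    steak boundary and the rib-eye boundary); backfat thickness statistics
    are read off this profile."""
    return np.linalg.norm(outer_curve - inner_curve, axis=1)

# Toy 10x10 mask with a 4x4 rib-eye region, 1 mm pixels.
mask = np.zeros((10, 10))
mask[2:6, 2:6] = 1
area = region_area_cm2(mask, pixel_size_mm=1.0)
profile = backfat_profile(np.array([[0.0, 0.0], [0.0, 1.0]]),
                          np.array([[3.0, 0.0], [4.0, 1.0]]))
```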


Molecular and Cellular Probes | 2014

A confocal microscopy image analysis method to measure adhesion and internalization of Pseudomonas aeruginosa multicellular structures into epithelial cells

Paola Lepanto; Federico Lecumberry; Jéssica Rossello; Arlinet Kierbel

Formation of multicellular structures such as biofilms is an important feature in the physiopathology of many disease-causing bacteria. We recently reported that Pseudomonas aeruginosa adheres to epithelial cells, rapidly forming early biofilm-like aggregates that can then be internalized into cells. Conventional methods to measure adhesion/internalization, such as dilution plating for total cell-associated or antibiotic-protected bacteria, do not distinguish between single and aggregated bacteria. We report a procedure that, by combining double bacterial labeling, confocal microscopy and image analysis, allows identification and quantification of adhered and internalized bacteria, distinguishing between single and aggregated bacterial cells. A plugin for Fiji has been generated to perform these procedures automatically.
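
After segmentation, the single-versus-aggregate distinction reduces to connected-component analysis: label the components of the binary image and split them by size. A dependency-light sketch (the size cutoff of 4 pixels is an arbitrary placeholder, not the paper's criterion):

```python
import numpy as np
from collections import deque

def component_sizes(binary):
    """Label 4-connected components in a binary image and return their
    sizes; small components ~ single bacteria, large ones ~ aggregates."""
    h, w = binary.shape
    seen = np.zeros((h, w), dtype=bool)
    sizes = []
    for i in range(h):
        for j in range(w):
            if binary[i, j] and not seen[i, j]:
                queue, size = deque([(i, j)]), 0
                seen[i, j] = True
                while queue:                       # BFS flood fill
                    y, x = queue.popleft()
                    size += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w \
                                and binary[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            queue.append((ny, nx))
                sizes.append(size)
    return sizes

def classify(sizes, aggregate_min=4):
    """Split component sizes into single-cell vs aggregate counts."""
    singles = sum(1 for s in sizes if s < aggregate_min)
    aggregates = sum(1 for s in sizes if s >= aggregate_min)
    return singles, aggregates

img = np.zeros((6, 8), dtype=bool)
img[1, 1] = True            # a single bacterium
img[3:5, 3:6] = True        # a 2x3 aggregate
sizes = component_sizes(img)
singles, aggregates = classify(sizes)
```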


IEEE Workshop on Automatic Identification Advanced Technologies | 2007

Aguará: An Improved Face Recognition Algorithm through Gabor Filter Adaptation

Cecilia Aguerrebere; Germán Capdehourat; Mauricio Delbracio; Matías Mateu; Alicia Fernández; Federico Lecumberry

We developed an EBGM-based algorithm that successfully implements face recognition under constrained conditions. A suitable adaptation of the Gabor filters was found through power spectral density (PSD) analysis of the face images. We outperformed the best-known implementations of the EBGM algorithm on the FERET database. The results are comparable with those of the state of the art.
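
A Gabor filter is a Gaussian-windowed sinusoid; adapting its wavelength and bandwidth to the imagery, as the spectral analysis above suggests, changes which facial structures it responds to. A minimal real-valued kernel (the parameters are illustrative, not the paper's tuned values):

```python
import numpy as np

def gabor_kernel(size, wavelength, theta, sigma):
    """Real part of a 2-D Gabor filter: a Gaussian envelope times a
    cosine carrier oriented at angle `theta`."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    x_theta = x * np.cos(theta) + y * np.sin(theta)     # rotated axis
    envelope = np.exp(-(x**2 + y**2) / (2.0 * sigma**2))
    carrier = np.cos(2.0 * np.pi * x_theta / wavelength)
    return envelope * carrier

k = gabor_kernel(size=9, wavelength=4.0, theta=0.0, sigma=2.0)
```

In EBGM, a bank of such kernels at several scales and orientations is convolved with the image at facial landmarks to form the "jets" that are matched between faces.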


Iberoamerican Congress on Pattern Recognition | 2015

Optimal and Linear F-Measure Classifiers Applied to Non-technical Losses Detection

Fernanda Rodríguez; Matías Di Martino; Juan Pablo Kosut; Fernando Santomauro; Federico Lecumberry; Alicia Fernández

Non-technical loss detection represents a very high cost for power supply companies. Finding classifiers that can deal with this problem is not easy, as they must face a highly imbalanced scenario with noisy data. In this paper we propose the Optimal F-measure Classifier (OFC) and the Linear F-measure Classifier (LFC), two novel algorithms designed to work on problems with unbalanced classes. We compare the performance of both algorithms with methods previously used to solve the automatic fraud detection problem.


International Conference on Pattern Recognition Applications and Methods | 2014

An a-contrario Approach for Face Matching

Luis D. Di Martino; Javier Preciozzi; Federico Lecumberry; Alicia Fernández

In this work we focus on the matching stage of a face recognition system. Such systems are used to identify an unknown person or to validate a claimed identity. In the face recognition field it is very common to innovate on the features extracted from a face and to use a simple threshold on the distance between samples to validate a claimed identity. Here we present a novel strategy, based on the a-contrario framework, to improve the matching stage. This approach yields a validation threshold that automatically adapts to the data and allows the performance of the system to be predicted in advance. We perform several experiments on different databases to validate this strategy and show its advantages over a simple distance threshold.
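
The a-contrario idea in brief: bound the Number of False Alarms (NFA), the expected count of matches at least as good under a background model, and accept a match only when NFA ≤ ε. The distance threshold then adapts automatically to the number of comparisons. A sketch with a deliberately simple uniform null model (the paper's background model is learned from data):

```python
def nfa(distance, n_tests, null_cdf):
    """Number of False Alarms: the expected number of matches at least
    this good under the background (null) model, over all tests made."""
    return n_tests * null_cdf(distance)

def acontrario_threshold(n_tests, null_cdf_inv, eps=1.0):
    """Distance threshold adapted to the number of comparisons: accepting
    matches below it yields at most `eps` expected false alarms."""
    return null_cdf_inv(eps / n_tests)

# Toy null model: matching distances uniform on [0, 1] under H0.
uniform_cdf = lambda d: d
uniform_cdf_inv = lambda p: p
threshold = acontrario_threshold(n_tests=1000, null_cdf_inv=uniform_cdf_inv)
```

With 1000 comparisons and ε = 1, the acceptance threshold tightens to 0.001; running ten times more comparisons would tighten it tenfold, which is exactly the adaptivity the abstract describes.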


Iberoamerican Congress on Pattern Recognition | 2014

Semisupervised Approach to Non Technical Losses Detection

Juan Tacón; Damián Melgarejo; Fernanda Rodríguez; Federico Lecumberry; Alicia Fernández

Non-technical electrical loss detection is a complex task with high economic impact. Given the diversity and large number of consumption records, it is very important to find an efficient automatic method that detects the largest number of frauds with the least amount of experts' hours spent on preprocessing and inspections. This article analyzes the performance of a semisupervised strategy that, starting from a set of labeled data, extends the labels to unlabeled data and then detects new frauds in consumption records. Results show that the proposed framework improves performance, in terms of the F-measure, over both manual methods performed by experts and previous supervised methods, saving hours of expert labeling and inspection.
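
The label-extension step can be caricatured as nearest-neighbor self-labeling: each unlabeled consumption record inherits the label of its closest labeled record. This one-pass stand-in omits the iterative refinement an actual semisupervised method would use; the toy features below are hypothetical.

```python
import numpy as np

def extend_labels(X_labeled, y_labeled, X_unlabeled):
    """Give every unlabeled sample the label of its nearest labeled
    neighbor -- a one-pass stand-in for the label-extension step."""
    out = []
    for x in X_unlabeled:
        d = np.linalg.norm(X_labeled - x, axis=1)   # distances to labeled set
        out.append(y_labeled[int(np.argmin(d))])
    return np.array(out)

# Toy consumption features: two labeled prototypes, two unlabeled records.
X_lab = np.array([[0.0, 0.0], [10.0, 10.0]])
y_lab = np.array([0, 1])            # 0 = normal, 1 = fraud
y_new = extend_labels(X_lab, y_lab, np.array([[1.0, 0.5], [9.0, 9.5]]))
```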

Collaboration


Dive into Federico Lecumberry's collaborations.

Top Co-Authors

Alicia Fernández
University of the Republic

Javier Preciozzi
University of the Republic

Julian Oreggioni
University of the Republic

Leonardo Steinfeld
Universidade Federal do Rio Grande do Sul