Publication


Featured research published by Albert Hung-Ren Ko.


Pattern Recognition | 2008

From dynamic classifier selection to dynamic ensemble selection

Albert Hung-Ren Ko; Robert Sabourin; Alceu de Souza Britto

In handwritten pattern recognition, multiple classifier systems have been shown to be useful for improving recognition rates. One of the most important tasks in optimizing a multiple classifier system is to select a group of adequate classifiers, known as an Ensemble of Classifiers (EoC), from a pool of classifiers. Static selection schemes select one EoC for all test patterns, whereas dynamic selection schemes select different classifiers for different test patterns. Nevertheless, it has been shown that traditional dynamic selection performs no better than static selection. We propose four new dynamic selection schemes which explore the properties of the oracle concept. Our results suggest that the proposed schemes, using the majority voting rule for combining classifiers, perform better than the static selection method.
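
A minimal sketch of one oracle-based dynamic selection scheme in this spirit (a simplified KNORA-Eliminate-style selection) is given below, assuming scikit-learn-style classifiers with a predict method and a held-out validation set; the function name, the neighbourhood size and the fallback rule are illustrative, not a reproduction of the paper's exact algorithms.

import numpy as np
from sklearn.neighbors import NearestNeighbors

def knora_eliminate_predict(classifiers, X_val, y_val, X_test, k=7):
    # correctness[i, j] is True when classifier i labels validation sample j correctly.
    correctness = np.array([clf.predict(X_val) == y_val for clf in classifiers])
    neighbours = (NearestNeighbors(n_neighbors=k).fit(X_val)
                  .kneighbors(X_test, return_distance=False))
    test_preds = np.array([clf.predict(X_test) for clf in classifiers])

    y_pred = []
    for t, idx in enumerate(neighbours):
        selected = np.arange(len(classifiers))      # fallback: use the whole pool
        for kk in range(k, 0, -1):                  # shrink the neighbourhood if needed
            oracle = correctness[:, idx[:kk]].all(axis=1)
            if oracle.any():
                selected = np.flatnonzero(oracle)
                break
        votes = test_preds[selected, t]
        labels, counts = np.unique(votes, return_counts=True)
        y_pred.append(labels[np.argmax(counts)])    # majority voting rule
    return np.array(y_pred)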


Pattern Recognition | 2007

Pairwise fusion matrix for combining classifiers

Albert Hung-Ren Ko; Robert Sabourin; Alceu de Souza Britto; Luiz S. Oliveira

Various fusion functions for classifier combination have been designed to optimize the results of ensembles of classifiers (EoC). We propose a pairwise fusion matrix (PFM) transformation, which produces reliable probabilities for use in classifier combination and can be combined with most existing fusion functions for combining classifiers. The PFM requires only crisp class label outputs from classifiers, and is suitable for problems with many classes or with few training samples. Experimental results suggest that a PFM can outperform the simple majority voting rule (MAJ), and that a PFM can work on problems where a behavior-knowledge space (BKS) might not be applicable.
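
The abstract does not spell out how the PFM itself is built, so the sketch below only illustrates the underlying idea of turning crisp labels into class probabilities: each classifier's validation confusion matrix is normalized to give P(true class | predicted label), and the per-classifier probabilities are averaged before the final decision. This is a simplified confusion-matrix calibration, not the paper's PFM; all names are illustrative and integer class labels 0..n_classes-1 are assumed.

import numpy as np
from sklearn.metrics import confusion_matrix

def confusion_calibrated_combine(classifiers, X_val, y_val, X_test, n_classes):
    fused = np.zeros((len(X_test), n_classes))
    for clf in classifiers:
        cm = confusion_matrix(y_val, clf.predict(X_val), labels=range(n_classes))
        # Column j counts the true classes observed when this classifier predicts j;
        # normalising each column gives P(true class | predicted label = j).
        posterior = cm / np.maximum(cm.sum(axis=0, keepdims=True), 1)
        fused += posterior[:, clf.predict(X_test)].T   # accumulate (n_test, n_classes)
    return fused.argmax(axis=1)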


International Joint Conference on Neural Networks | 2006

Combining Diversity and Classification Accuracy for Ensemble Selection in Random Subspaces

Albert Hung-Ren Ko; Robert Sabourin; A. de Souza Britto

An ensemble of classifiers has been shown to be effective in improving classifier performance. Two elements are believed to be important in constructing an ensemble: (a) the classification accuracy of each individual classifier; and (b) diversity among the classifiers. Nevertheless, most works based on diversity suggest that there is only a weak correlation between diversity and ensemble accuracy. We propose compound diversity functions, which combine the diversities with the classification accuracy of each individual classifier, and show that with the Random Subspaces ensemble creation method there is a strong correlation between the proposed functions and ensemble accuracy. The statistical results indicate that compound diversity functions perform better than traditional diversity measures.
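
The abstract does not give the exact functional forms, so the sketch below shows only the general idea of a compound diversity function: a pairwise diversity measure (the disagreement measure here) combined, by a simple product, with the mean accuracy of the individual classifiers. The combination rule and names are illustrative.

import numpy as np
from itertools import combinations

def compound_diversity(classifiers, X_val, y_val):
    # Crisp predictions of every pool member on a validation set.
    preds = np.array([clf.predict(X_val) for clf in classifiers])
    mean_accuracy = (preds == y_val).mean()
    # Disagreement measure: fraction of samples on which a pair of classifiers differ.
    disagreement = np.mean([(preds[i] != preds[j]).mean()
                            for i, j in combinations(range(len(classifiers)), 2)])
    return disagreement * mean_accuracy   # one simple compound function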


International Journal of Pattern Recognition and Artificial Intelligence | 2009

COMPOUND DIVERSITY FUNCTIONS FOR ENSEMBLE SELECTION

Albert Hung-Ren Ko; Robert Sabourin; Alceu De Souza Britto

An effective way to improve a classification method's performance is to create ensembles of classifiers. Two elements are believed to be important in constructing an ensemble: (a) the performance of each individual classifier; and (b) diversity among the classifiers. Nevertheless, most works based on diversity suggest that there is only a weak correlation between diversity and ensemble accuracy. We propose compound diversity functions, which combine the diversities with the performance of each individual classifier, and show that there is a strong correlation between the proposed functions and ensemble accuracy. Calculation of the correlations with different ensemble creation methods, different problems and different classification algorithms on 0.624 million ensembles suggests that most compound diversity functions are better than traditional diversity measures. A population-based Genetic Algorithm was then used to search for the best ensembles on a handwritten numeral recognition problem and to evaluate 42.24 million ensembles. The statistical results indicate that compound diversity functions perform better than traditional diversity measures, and are helpful in selecting the best ensembles.
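
A very small (mutation-only) genetic algorithm over classifier subsets, with a compound diversity function as fitness, is sketched below; the population size, mutation rate and fitness form are illustrative choices, not the paper's settings.

import numpy as np

def ga_select(preds, y_val, pop_size=30, generations=50, seed=0):
    # preds: (n_classifiers, n_val) crisp predictions of the pool on a validation set.
    rng = np.random.default_rng(seed)
    n_clf = preds.shape[0]

    def fitness(mask):                      # compound diversity of the selected subset
        idx = np.flatnonzero(mask)
        if len(idx) < 2:
            return 0.0
        accuracy = (preds[idx] == y_val).mean()
        disagreement = np.mean([(preds[i] != preds[j]).mean()
                                for a, i in enumerate(idx) for j in idx[a + 1:]])
        return disagreement * accuracy

    pop = rng.integers(0, 2, size=(pop_size, n_clf)).astype(bool)
    for _ in range(generations):
        scores = np.array([fitness(ind) for ind in pop])
        parents = pop[np.argsort(scores)[-(pop_size // 2):]]        # keep the best half
        children = parents[rng.integers(0, len(parents), pop_size - len(parents))]
        children = children ^ (rng.random(children.shape) < 1.0 / n_clf)  # bit-flip mutation
        pop = np.vstack([parents, children])
    best = pop[np.argmax([fitness(ind) for ind in pop])]
    return np.flatnonzero(best)             # indices of the selected classifiers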


Pattern Analysis and Applications | 2009

Ensemble of HMM classifiers based on the clustering validity index for a handwritten numeral recognizer

Albert Hung-Ren Ko; Robert Sabourin; Alceu de Souza Britto

A new scheme for the optimization of codebook sizes for Hidden Markov Models (HMMs) and the generation of HMM ensembles is proposed in this paper. In a discrete HMM, the vector quantization procedure and the generated codebook are associated with performance degradation. Using a selected clustering validity index, we show that an optimized HMM codebook size can be selected without training HMM classifiers. Moreover, the proposed scheme yields multiple optimized HMM classifiers, each based on a different codebook size. By using these to construct an ensemble of HMM classifiers, the scheme can compensate for the degradation of a discrete HMM.
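
The abstract does not name the clustering validity index used, so the sketch below stands in with the Davies-Bouldin index (lower is better): candidate codebook sizes are scored by clustering the pooled observation vectors, without training any HMM, and several near-optimal sizes could then seed the HMM ensemble. All names and sizes are illustrative.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import davies_bouldin_score

def score_codebook_sizes(features, candidate_sizes=(16, 32, 64, 128, 256)):
    # features: (n_frames, n_dims) observation vectors pooled over the training set.
    scores = {}
    for size in candidate_sizes:
        labels = KMeans(n_clusters=size, n_init=10, random_state=0).fit_predict(features)
        scores[size] = davies_bouldin_score(features, labels)
    best = min(scores, key=scores.get)      # best single codebook size
    return best, scores                     # keep all scores to pick ensemble members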


Genetic and Evolutionary Computation Conference | 2006

Evolving ensemble of classifiers in random subspace

Albert Hung-Ren Ko; Robert Sabourin; Alceu de Souza Britto

Various methods for ensemble selection and classifier combination have been designed to optimize the results of ensembles of classifiers (EoC). A genetic algorithm (GA) that uses diversity for ensemble selection can be very time consuming. We propose compound diversity functions as objective functions for a faster and more effective GA search. Classifiers selected by the GA are combined by a proposed pairwise confusion matrix transformation, which offers a strong performance boost for EoCs.


International Conference on Multiple Classifier Systems | 2007

A new HMM-based ensemble generation method for numeral recognition

Albert Hung-Ren Ko; Robert Sabourin; Alceu de Souza Britto

A new scheme for the optimization of codebook sizes for HMMs and the generation of HMM ensembles is proposed in this paper. In a discrete HMM, the vector quantization procedure and the generated codebook are associated with performance degradation. Using a selected clustering validity index, we show that an optimized HMM codebook size can be selected without training HMM classifiers. Moreover, the proposed scheme yields multiple optimized HMM classifiers, each based on a different codebook size. By using these to construct an ensemble of HMM classifiers, the scheme can compensate for the degradation of a discrete HMM.


International Conference on Pattern Recognition | 2008

The implication of data diversity for a classifier-free ensemble selection in random subspaces

Albert Hung-Ren Ko; Robert Sabourin; L.E.S. de Oliveira; A. de Souza Britto

An Ensemble of Classifiers (EoC) has been shown to be effective in improving the performance of single classifiers by combining their outputs. By using diverse data subsets to train classifiers, ensemble creation methods can create diverse classifiers for the EoC. In this work, we propose a scheme to measure data diversity directly from random subspaces, and we explore the possibility of using this data diversity directly to select the best data subsets for the construction of the EoC. The applicability is tested on NIST SD19 handwritten numerals.
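
The abstract does not define the data-diversity measure, so the sketch below uses feature-set overlap as a stand-in: random subspaces are scored by their average pairwise (1 - Jaccard) overlap before any classifier is trained, and the most diverse pool of subspaces is kept. Dimensions and pool sizes are illustrative.

import numpy as np

def random_subspaces(n_features, n_subspaces, subspace_size, rng):
    return [frozenset(rng.choice(n_features, subspace_size, replace=False).tolist())
            for _ in range(n_subspaces)]

def mean_data_diversity(subspaces):
    # Average pairwise (1 - Jaccard overlap) between feature-index sets.
    divs = [1.0 - len(a & b) / len(a | b)
            for i, a in enumerate(subspaces) for b in subspaces[i + 1:]]
    return float(np.mean(divs))

# Example: among several candidate pools of subspaces, keep the most diverse one
# without training a single classifier.
pools = [random_subspaces(n_features=100, n_subspaces=10, subspace_size=32,
                          rng=np.random.default_rng(seed)) for seed in range(20)]
best_pool = max(pools, key=mean_data_diversity)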


Expert Systems With Applications | 2013

Performance of distributed multi-agent multi-state reinforcement spectrum management using different exploration schemes

Albert Hung-Ren Ko; Robert Sabourin; François Gagnon

This paper introduces a novel multi-agent multi-state reinforcement learning exploration scheme for dynamic spectrum access and dynamic spectrum sharing in wireless communications. With multi-agent multi-state reinforcement learning, cognitive radios can decide the best channels to use in order to maximize spectral efficiency in a distributed way. However, we argue that the performance of spectrum management, including both dynamic spectrum access and dynamic spectrum sharing, depends largely on the reinforcement learning exploration scheme used, and we believe that traditional exploration schemes may not be adequate in the context of spectrum management. We therefore propose a novel exploration scheme and show that it improves the performance of multi-agent multi-state reinforcement learning based spectrum management. We also investigate various real-world scenarios and confirm the validity of the proposed method.
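
The paper's own exploration scheme is not described in this abstract, so the sketch below only sets up the baseline: each cognitive radio runs tabular Q-learning over channels with standard epsilon-greedy exploration, and is rewarded when it transmits on a channel no other agent picked. The parameters and the collision-based reward model are illustrative assumptions.

import numpy as np

def simulate(n_agents=4, n_channels=6, steps=5000, alpha=0.1, gamma=0.9,
             epsilon=0.1, rng=np.random.default_rng(0)):
    # State = channel used on the previous step, action = channel to use now.
    Q = np.zeros((n_agents, n_channels, n_channels))
    state = rng.integers(n_channels, size=n_agents)
    throughput = 0.0
    for _ in range(steps):
        # Epsilon-greedy action selection, independently per agent.
        greedy = Q[np.arange(n_agents), state].argmax(axis=1)
        explore = rng.random(n_agents) < epsilon
        action = np.where(explore, rng.integers(n_channels, size=n_agents), greedy)
        # Reward 1 if the agent is alone on its channel, 0 on collision.
        counts = np.bincount(action, minlength=n_channels)
        reward = (counts[action] == 1).astype(float)
        throughput += reward.sum()
        # Tabular Q-learning update.
        best_next = Q[np.arange(n_agents), action].max(axis=1)
        td = reward + gamma * best_next - Q[np.arange(n_agents), state, action]
        Q[np.arange(n_agents), state, action] += alpha * td
        state = action
    return throughput / (steps * n_agents)   # fraction of collision-free slots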


International Conference on Document Analysis and Recognition | 2007

K-Nearest Oracle for Dynamic Ensemble Selection

Albert Hung-Ren Ko; Robert Sabourin; A. de Souza Britto

For handwritten pattern recognition, a multiple classifier system has been shown to be useful in improving recognition rates. One of the most important issues in optimizing a multiple classifier system is to select a group of adequate classifiers, known as an ensemble of classifiers (EoC), from a pool of classifiers. Static selection schemes select one EoC for all test patterns, whereas dynamic selection schemes select different classifiers for different test patterns. Nevertheless, it has been shown that traditional dynamic selection does not give better performance than static selection. We propose four new dynamic selection schemes which explore the properties of the oracle concept. The results suggest that the proposed schemes, using the majority voting rule for combining classifiers, perform better than static selection.

Collaboration


Dive into Albert Hung-Ren Ko's collaborations.

Top Co-Authors

Robert Sabourin
École de technologie supérieure

Alceu de Souza Britto
Pontifícia Universidade Católica do Paraná

François Gagnon
École de technologie supérieure

Luiz S. Oliveira
Federal University of Paraná

Patrick Maupin
École Normale Supérieure

Alessandro L. Koerich
Pontifícia Universidade Católica do Paraná