Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where César Lincoln C. Mattos is active.

Publication


Featured research published by César Lincoln C. Mattos.


Neural Computing and Applications | 2013

ARTIE and MUSCLE models: building ensemble classifiers from fuzzy ART and SOM networks

César Lincoln C. Mattos; Guilherme A. Barreto

Ensemble Learning has proven to be an efficient method to improve the performance of single classifiers. In this context, the present article introduces ARTIE (ART networks in Ensembles) and MUSCLE (Multiple SOM Classifiers in Ensembles), two novel ensemble models that use Fuzzy ART and SOM networks as base classifiers, respectively. In addition, a hybrid metaheuristic solution based on Particle Swarm Optimization and Simulated Annealing is used for parameter tuning of the base classifiers. A comprehensive performance comparison using 10 benchmarking data sets indicates that the ARTIE and MUSCLE architectures consistently outperform ensembles built from standard supervised neural networks, such as the Fuzzy ARTMAP, Learning Vector Quantization, and the Extreme Learning Machine.
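For illustration, the sketch below shows the general ensemble-by-majority-voting principle that ARTIE and MUSCLE build on, using generic decision-tree base learners from scikit-learn as stand-ins; the actual models use Fuzzy ART and SOM base classifiers with PSO/SA parameter tuning, which are not reproduced here. All data and parameter values are illustrative.

```python
# Minimal sketch of ensemble classification by majority voting (the principle
# behind ARTIE/MUSCLE); base learners, data and parameters are illustrative only.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Train several base classifiers on bootstrap resamples of the training data.
rng = np.random.default_rng(0)
ensemble = []
for _ in range(15):
    idx = rng.integers(0, len(X_tr), size=len(X_tr))   # bootstrap sample
    clf = DecisionTreeClassifier(max_depth=3).fit(X_tr[idx], y_tr[idx])
    ensemble.append(clf)

# Majority vote over the base classifiers' predictions.
votes = np.stack([clf.predict(X_te) for clf in ensemble])        # (n_models, n_test)
y_pred = np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)
print("ensemble accuracy:", (y_pred == y_te).mean())
```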


Electronic Commerce Research | 2014

An improved hybrid particle swarm optimization algorithm applied to economic modeling of radio resource allocation

César Lincoln C. Mattos; Guilherme A. Barreto; Francisco Rodrigo P. Cavalcanti

An operational economic model for radio resource allocation in the downlink of a multi-cell WCDMA (wideband code division multiple access) system is developed in this paper, and a particle swarm optimization (PSO) based approach is proposed for its solution. Firstly, we develop an economic model for resource allocation that considers the utility of the provided service, the acceptance probability of the service by the users and the revenue generated for the network operator. Then, we introduce a constrained hybrid PSO algorithm, called improved hybrid particle swarm optimization (I-HPSO), in order to find feasible solutions to the problem. We compare the performance of the I-HPSO algorithm with those achieved by the original HPSO algorithm and by standard metaheuristic optimization techniques, such as hill climbing, simulated annealing, standard PSO and genetic algorithms. The obtained results indicate that the proposed approach achieves superior performance compared to the conventional techniques.
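As a point of reference, the following is a minimal sketch of standard PSO on a toy objective; the paper's I-HPSO adds constraint handling and hybrid local search on top of this basic scheme, none of which is shown here, and all parameter values are illustrative.

```python
# Minimal sketch of standard particle swarm optimization (PSO) minimizing a
# toy objective; not the constrained I-HPSO algorithm proposed in the paper.
import numpy as np

def sphere(x):                      # toy objective to minimize
    return np.sum(x ** 2, axis=-1)

rng = np.random.default_rng(0)
n_particles, dim, iters = 30, 5, 200
w, c1, c2 = 0.7, 1.5, 1.5           # inertia and acceleration coefficients

pos = rng.uniform(-5, 5, size=(n_particles, dim))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = sphere(pbest)
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    val = sphere(pos)
    improved = val < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], val[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("best value found:", sphere(gbest))
```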


intelligent data engineering and automated learning | 2015

An Empirical Evaluation of Robust Gaussian Process Models for System Identification

César Lincoln C. Mattos; Jose Santos; Guilherme A. Barreto

System identification comprises a number of linear and nonlinear tools for black-box modeling of dynamical systems, with applications in several areas of engineering, control, biology and economics. However, the usual Gaussian noise assumption is not always satisfied, especially if the data are corrupted by impulsive noise or outliers. Bearing this in mind, the present paper aims at evaluating how Gaussian Process (GP) models perform in system identification tasks in the presence of outliers. More specifically, we compare the performances of two existing robust GP-based regression models in experiments involving five benchmarking datasets with controlled outlier inclusion. The results indicate that, although still sensitive to some degree to the presence of outliers, the robust models are indeed able to achieve lower prediction errors in corrupted scenarios when compared to a conventional GP-based approach.
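For context, the sketch below implements conventional GP regression (Gaussian likelihood) for one-step-ahead prediction on a toy autoregressive dataset; the robust GP variants evaluated in the paper modify the likelihood and are not reproduced here. The system, kernel and hyperparameter values are illustrative assumptions.

```python
# Minimal sketch of GP regression for one-step-ahead prediction of a toy
# nonlinear dynamical system (conventional Gaussian-likelihood GP only).
import numpy as np

# Toy first-order system y_t = f(y_{t-1}, u_{t-1}) + noise (illustrative only).
rng = np.random.default_rng(0)
u = rng.uniform(-1, 1, 200)
y = np.zeros(200)
for t in range(1, 200):
    y[t] = 0.8 * np.tanh(y[t - 1]) + 0.5 * u[t - 1] + 0.05 * rng.standard_normal()

X = np.column_stack([y[:-1], u[:-1]])     # regressors (autoregressive + input)
Y = y[1:]                                 # targets

def rbf(A, B, ell=1.0, sf2=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return sf2 * np.exp(-0.5 * d2 / ell ** 2)

sn2 = 0.01                                 # noise variance (fixed, not optimized)
K = rbf(X, X) + sn2 * np.eye(len(X))
alpha = np.linalg.solve(K, Y)

X_new = X[-5:]                             # predict on the last few regressors
mu = rbf(X_new, X) @ alpha                 # GP posterior mean
print("one-step-ahead predictions:", np.round(mu, 3))
```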


international work-conference on artificial and natural neural networks | 2015

Performance Evaluation of Least Squares SVR in Robust Dynamical System Identification

Jose Santos; César Lincoln C. Mattos; Guilherme A. Barreto

Least Squares Support Vector Regression (LS-SVR) is a powerful kernel-based learning tool for regression problems. Nonlinear system identification is one such problem, where we aim at capturing the behavior in time of a dynamical system by building a black-box model from the measured input-output time series. Besides the difficulties involved in the specification of a suitable model itself, most real-world systems are subject to the presence of outliers in the observations. Hence, robust methods that can handle outliers suitably are desirable. In this regard, despite the existence of a few previous works on robustifying the LS-SVR for regression applications with outliers, its use for dynamical system identification has not been fully evaluated yet. Bearing this in mind, in this paper we assess the performances of two existing robust LS-SVR variants, namely WLS-SVR and RLS-SVR, in nonlinear system identification tasks containing outliers. These robust approaches are compared with standard LS-SVR in experiments with three artificial datasets, whose outputs are contaminated with different amounts of outliers, and a real-world benchmarking dataset. The obtained results for infinite-step-ahead prediction confirm that the robust LS-SVR variants consistently outperform the standard LS-SVR algorithm.
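For reference, the sketch below shows standard LS-SVR training, which reduces to solving a single linear system in the dual variables; the WLS-SVR and RLS-SVR variants studied in the paper reweight this system according to the residuals and are not shown. Data and hyperparameters are illustrative.

```python
# Minimal sketch of standard LS-SVR: training solves one linear system in the
# dual variables (alpha, b). Robust variants reweight this system (not shown).
import numpy as np

rng = np.random.default_rng(0)
X = np.sort(rng.uniform(-3, 3, (60, 1)), axis=0)
y = np.sinc(X).ravel() + 0.05 * rng.standard_normal(60)   # toy regression data

def rbf(A, B, sigma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

gamma = 100.0                     # regularization parameter
K = rbf(X, X)
N = len(X)

# LS-SVR dual system:  [0  1^T          ] [b    ]   [0]
#                      [1  K + I/gamma  ] [alpha] = [y]
A = np.zeros((N + 1, N + 1))
A[0, 1:] = 1.0
A[1:, 0] = 1.0
A[1:, 1:] = K + np.eye(N) / gamma
sol = np.linalg.solve(A, np.concatenate([[0.0], y]))
b, alpha = sol[0], sol[1:]

X_test = np.linspace(-3, 3, 5).reshape(-1, 1)
y_pred = rbf(X_test, X) @ alpha + b        # kernel expansion plus bias
print(np.round(y_pred, 3))
```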


international conference on artificial neural networks | 2014

Improved Adaline Networks for Robust Pattern Classification

César Lincoln C. Mattos; Jose Santos; Guilherme A. Barreto

The Adaline network [1] is a classic neural architecture whose learning rule is the famous least mean squares (LMS) algorithm (a.k.a. delta rule or Widrow-Hoff rule). It has been demonstrated that the LMS algorithm is optimal in the H∞ sense, since it tolerates small (in energy) disturbances, such as measurement noise, parameter drifting and modelling errors [2,3]. Such optimality of the LMS algorithm, however, has been demonstrated for regression-like problems only, not for pattern classification. Bearing this in mind, we firstly show that the performances of the LMS algorithm and variants of it (including the recent Kernel LMS algorithm) in pattern classification tasks deteriorate considerably in the presence of labelling errors, and then introduce robust extensions of the Adaline network that can deal efficiently with such errors. Comprehensive computer simulations show that the proposed extensions consistently outperform the original version.
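To illustrate the underlying update rule, the sketch below implements the Adaline/LMS delta rule together with a simple Huber-style clipping of the error, which conveys the general idea of an M-estimate-based robust extension; it is an illustrative formulation, not the exact one proposed in the paper, and all names and values are assumptions.

```python
# Minimal sketch of the Adaline/LMS delta rule and a Huber-clipped variant
# that limits the influence of large errors (e.g. from mislabelled samples).
import numpy as np

def lms_train(X, y, eta=0.01, epochs=20, huber_delta=None):
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for x_i, y_i in zip(X, y):
            e = y_i - w @ x_i                    # prediction error
            if huber_delta is not None and abs(e) > huber_delta:
                e = huber_delta * np.sign(e)     # clip influence of large errors
            w += eta * e * x_i                   # delta rule update
    return w

rng = np.random.default_rng(0)
X = np.column_stack([np.ones(200), rng.standard_normal((200, 2))])  # bias + inputs
y = np.sign(X @ np.array([0.2, 1.0, -1.0]))                          # +/-1 labels
flip = rng.random(200) < 0.15                                        # 15% label noise
y_noisy = np.where(flip, -y, y)

w_lms = lms_train(X, y_noisy)
w_rob = lms_train(X, y_noisy, huber_delta=1.0)
for name, w in [("LMS", w_lms), ("robust LMS", w_rob)]:
    acc = (np.sign(X @ w) == y).mean()           # accuracy against clean labels
    print(name, "accuracy:", round(acc, 3))
```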


workshop on self organizing maps | 2017

Metaheuristic optimization for automatic clustering of customer-oriented supply chain data

César Lincoln C. Mattos; Guilherme A. Barreto; Dennis Horstkemper; Bernd Hellingrath

In this paper we evaluate metaheuristic optimization methods on a partitional clustering task of a real-world supply chain dataset, aiming at customer segmentation. For this purpose, we rely on the automatic clustering framework proposed by Das et al. [1], named henceforth the DAK framework, by testing its performance with seven different metaheuristic optimization algorithms, namely: simulated annealing (SA), genetic algorithms (GA), particle swarm optimization (PSO), differential evolution (DE), artificial bee colony (ABC), cuckoo search (CS) and fireworks algorithm (FA). An in-depth analysis of the obtained results is carried out in order to compare the performances of the metaheuristic optimization algorithms under the DAK framework with that of a standard (i.e. non-automatic) clustering methodology.
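As a simplified illustration, the sketch below uses plain differential evolution to optimize cluster centers for a fixed number of clusters by minimizing the within-cluster sum of squares; the DAK framework additionally encodes cluster activation thresholds so that the number of clusters is determined automatically, which is not reproduced here. Data and parameters are illustrative.

```python
# Minimal sketch of metaheuristic (differential evolution) clustering with a
# fixed number of clusters; the automatic (variable-k) encoding of the DAK
# framework is not shown.
import numpy as np

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(m, 0.3, (50, 2)) for m in ([0, 0], [2, 2], [0, 3])])
k, dim = 3, X.shape[1]

def sse(flat_centers):
    c = flat_centers.reshape(k, dim)
    d = ((X[:, None, :] - c[None, :, :]) ** 2).sum(-1)   # squared distances
    return d.min(axis=1).sum()                           # within-cluster SSE

# Plain DE/rand/1/bin over the flattened center coordinates.
pop_size, F, CR, iters = 30, 0.8, 0.9, 200
pop = rng.uniform(X.min(), X.max(), (pop_size, k * dim))
fit = np.array([sse(p) for p in pop])
for _ in range(iters):
    for i in range(pop_size):
        others = [j for j in range(pop_size) if j != i]
        a, b, c = pop[rng.choice(others, 3, replace=False)]
        trial = np.where(rng.random(k * dim) < CR, a + F * (b - c), pop[i])
        f_trial = sse(trial)
        if f_trial < fit[i]:
            pop[i], fit[i] = trial, f_trial

print("best SSE found:", round(fit.min(), 3))
```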


international work-conference on artificial and natural neural networks | 2017

Randomized Neural Networks for Recursive System Identification in the Presence of Outliers: A Performance Comparison.

César Lincoln C. Mattos; Guilherme A. Barreto; Gonzalo Acuña

In this paper, randomized single-hidden layer feedforward networks (SLFNs) are extended to handle outliers sequentially in online system identification tasks involving large-scale datasets. Starting from the description of the original batch learning algorithms of the evaluated randomized SLFNs, we discuss how these neural architectures can be easily adapted to cope with sequential data by means of the famed least mean squares (LMS) rule. In addition, a robust variant of this rule, known as the least mean M-estimate (LMM) rule, is used to cope with outliers. Comprehensive performance comparisons on benchmarking datasets are carried out in order to assess the validity of the proposed methodology.
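For illustration, the sketch below trains only the output weights of a fixed, randomly initialized single-hidden-layer network with the plain LMS rule on a toy identification task; the robust LMM variant discussed in the paper would additionally limit the influence of outlier-driven errors. All names and values are illustrative assumptions.

```python
# Minimal sketch of a randomized SLFN (ELM-style) with online LMS updates of
# the output weights; the hidden layer is random and fixed.
import numpy as np

rng = np.random.default_rng(0)
n_hidden, dim = 30, 2

# Random, fixed hidden layer (never trained).
W_in = rng.standard_normal((dim, n_hidden))
b_in = rng.standard_normal(n_hidden)
hidden = lambda x: np.tanh(x @ W_in + b_in)

# Toy dynamical system: predict y_t from (y_{t-1}, u_{t-1}).
u = rng.uniform(-1, 1, 1000)
y = np.zeros(1000)
for t in range(1, 1000):
    y[t] = 0.7 * y[t - 1] + 0.4 * np.sin(np.pi * u[t - 1]) + 0.02 * rng.standard_normal()

w_out = np.zeros(n_hidden)
eta = 0.05
errors = []
for t in range(1, 1000):
    h = hidden(np.array([y[t - 1], u[t - 1]]))
    e = y[t] - h @ w_out            # one-step-ahead prediction error
    w_out += eta * e * h            # LMS update of the output weights only
    errors.append(e ** 2)

print("mean squared error, last 100 steps:", round(np.mean(errors[-100:]), 5))
```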


intelligent data engineering and automated learning | 2014

A Novel Recursive Kernel-Based Algorithm for Robust Pattern Classification

Jose Santos; César Lincoln C. Mattos; Guilherme A. Barreto

Kernel methods comprise a class of machine learning algorithms that utilize Mercer kernels for producing nonlinear versions of conventional linear learning algorithms. This kernelizing approach has been applied, for example, to the famed least mean squares (LMS) [1] algorithm to give rise to the kernel least mean squares (KLMS) algorithm [2]. However, a major drawback of the LMS algorithm (and also of its kernelized version) is the performance degradation in scenarios with outliers. Bearing this in mind, we introduce instead a kernel classifier based on the least mean M-estimate (LMM) algorithm [3] which is a robust variant of the LMS algorithm based on M-estimation techniques. The proposed Kernel LMM (KLMM) algorithm is evaluated in pattern classification tasks with outliers using both synthetic and real-world datasets. The obtained results indicate the superiority of the proposed approach over the standard KLMS algorithm.
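To illustrate the mechanism, the sketch below implements a generic KLMS-style update with a simple Huber-like clipping of the error, which conveys the M-estimation idea behind the proposed KLMM; it is an illustrative formulation under assumed kernel, data and parameter choices, not the paper's exact algorithm.

```python
# Minimal sketch of a kernel LMS (KLMS) classifier with an M-estimate-style
# clipping of the instantaneous error; each training sample adds one kernel unit.
import numpy as np

def gauss_kernel(a, b, sigma=1.0):
    return np.exp(-np.sum((a - b) ** 2) / (2 * sigma ** 2))

def klms_fit(X, y, eta=0.3, huber_delta=None):
    centers, coefs = [], []
    for x_t, y_t in zip(X, y):
        # Current prediction: weighted sum of kernels on stored centers.
        f_t = sum(c * gauss_kernel(x_t, z) for c, z in zip(coefs, centers))
        e = y_t - f_t
        if huber_delta is not None and abs(e) > huber_delta:
            e = huber_delta * np.sign(e)       # limit influence of outliers
        centers.append(x_t)
        coefs.append(eta * e)                  # new kernel unit with coefficient eta*e
    return centers, coefs

def klms_predict(x, centers, coefs):
    return np.sign(sum(c * gauss_kernel(x, z) for c, z in zip(coefs, centers)))

rng = np.random.default_rng(0)
X = rng.standard_normal((300, 2))
y = np.sign(X[:, 0] * X[:, 1])                      # nonlinearly separable labels
y_noisy = np.where(rng.random(300) < 0.1, -y, y)    # 10% label flips (outliers)

centers, coefs = klms_fit(X[:200], y_noisy[:200], huber_delta=1.0)
acc = np.mean([klms_predict(x, centers, coefs) == t for x, t in zip(X[200:], y[200:])])
print("test accuracy:", round(acc, 3))
```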


international conference on learning representations | 2016

Recurrent Gaussian Processes

César Lincoln C. Mattos; Zhenwen Dai; Andreas C. Damianou; Jeremy Forth; Guilherme A. Barreto; Neil D. Lawrence


IFAC-PapersOnLine | 2016

Latent Autoregressive Gaussian Processes Models for Robust System Identification

César Lincoln C. Mattos; Andreas C. Damianou; Guilherme A. Barreto; Neil D. Lawrence

Collaboration


Dive into César Lincoln C. Mattos's collaborations.

Top Co-Authors


Guilherme A. Barreto

Federal University of Ceará
