Adenilton J. da Silva
Federal University of Pernambuco
Publications
Featured research published by Adenilton J. da Silva.
Neurocomputing | 2012
Adenilton J. da Silva; Wilson Rosa de Oliveira; Teresa Bernarda Ludermir
A supervised learning algorithm for quantum neural networks (QNN), based on a novel quantum neuron node implemented as a very simple quantum circuit, is proposed and investigated. In contrast to the QNNs published in the literature, the proposed model can both perform quantum learning and simulate the classical models. This is partly due to the neural model used elsewhere, which has weights and non-linear activation functions. Here a quantum weightless neural network model is proposed as a quantisation of the classical weightless neural networks (WNN). The theoretical and practical results on WNN can be inherited by these quantum weightless neural networks (qWNN). In the quantum learning algorithm proposed here, the patterns of the training set are presented concurrently in superposition. This superposition-based learning algorithm (SLA) has computational cost polynomial in the number of patterns in the training set.
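The core setup of the SLA, presenting the whole training set concurrently in superposition, can be illustrated with a small state-vector sketch. This is a toy illustration only, not the paper's circuit; the pattern set and helper function are assumptions made for the example.

```python
# Toy state-vector illustration: encoding a training set as a uniform
# superposition so that every pattern is "presented" at once.
import numpy as np

def uniform_superposition(patterns, n_qubits):
    """Amplitude vector with equal weight on each training pattern."""
    state = np.zeros(2 ** n_qubits)
    for p in patterns:
        index = int("".join(map(str, p)), 2)  # basis state for bit pattern p
        state[index] = 1.0
    return state / np.linalg.norm(state)      # normalise the amplitudes

# Three 3-bit training patterns held concurrently in one quantum state.
training_set = [(0, 0, 1), (0, 1, 0), (1, 1, 1)]
psi = uniform_superposition(training_set, n_qubits=3)
print(np.round(psi, 4))
```

Each stored pattern gets amplitude 1/√3; every other basis state has amplitude zero, so a measurement only ever yields one of the training patterns.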
brazilian symposium on neural networks | 2008
Wilson Rosa de Oliveira; Adenilton J. da Silva; Teresa Bernarda Ludermir; Amanda Leonel; Wilson R. Galindo; Jefferson C. C. Pereira
Quantum analogues of the (classical) logical neural network (LNN) models are proposed (q-LNN for short). We further develop and investigate the q-LNNs composed of the quantum analogues of the probabilistic logic node (PLN) and the multiple-valued PLN (MPLN) variations, dubbed q-PLN and q-MPLN respectively. Besides a clearer mathematical description, we present a computationally efficient and simply described quantum learning algorithm, in contrast to what has been proposed for the quantum weighted version.
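For context, the classical PLN that the q-PLN quantises is a RAM-like node whose memory positions store 0, 1 or an "unknown" value u, with u resolved to a random bit on lookup. The node contents below are an invented example, not taken from the paper.

```python
# Classical PLN node sketch (the q-PLN is its quantum analogue): each
# addressable position stores 0, 1 or the unknown value u, and looking up
# u emits 0 or 1 uniformly at random.
import random

U = "u"                                  # undefined/unknown stored value

class PLNNode:
    def __init__(self, contents):
        self.contents = contents         # one entry per input address

    def fire(self, bits):
        addr = int("".join(map(str, bits)), 2)   # input bits form an address
        value = self.contents[addr]
        return random.randint(0, 1) if value == U else value

node = PLNNode([0, U, 1, U])             # 2-input node, 4 memory positions
print(node.fire((1, 0)))                 # address 2 stores a definite 1
```

Training a PLN amounts to replacing u values with definite bits; the quantum versions replace the stored symbols with qubit states.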
Neural Networks | 2016
Adenilton J. da Silva; Teresa Bernarda Ludermir; Wilson Rosa de Oliveira
In this work, we propose a quantum neural network named the quantum perceptron over a field (QPF). Quantum computers are not yet a reality, and the models and algorithms proposed in this work cannot be simulated on actual (or classical) computers. QPF is a direct generalization of the classical perceptron and resolves some drawbacks found in previous models of quantum perceptrons. We also present a learning algorithm named the Superposition-based Architecture Learning algorithm (SAL) that optimizes the neural network weights and architecture. SAL searches for the best architecture in a finite set of neural network architectures in time linear in the number of patterns in the training set. SAL is the first learning algorithm to determine neural network architectures in polynomial time. This speedup is obtained by the use of quantum parallelism and a non-linear quantum operator.
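The shape of the search problem SAL solves can be sketched classically: enumerate a finite set of candidate architectures and keep the best-scoring one. SAL's contribution is evaluating the candidates in superposition rather than one by one; the candidate encoding and stand-in cost below are assumptions for illustration only.

```python
# Classical sketch of architecture search over a finite candidate set.
# Each candidate is a tuple of hidden-layer sizes; a real run would train
# and validate each network instead of using this stand-in cost.
import itertools

def evaluate(architecture):
    # Stand-in cost: prefer fewer total hidden units.
    return sum(architecture)

# One- and two-layer architectures with 2, 4 or 8 units per layer.
candidates = [arch for r in (1, 2)
              for arch in itertools.product((2, 4, 8), repeat=r)]
best = min(candidates, key=evaluate)
print(best)
```

The classical loop costs time proportional to the number of candidates times the evaluation cost; the quantum algorithm's speedup comes from collapsing the outer loop via quantum parallelism.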
ibero-american conference on artificial intelligence | 2012
Tiago P. F. de Lima; Adenilton J. da Silva; Teresa Bernarda Ludermir
This paper explores the automatic construction of multi-classifier systems based on a combination of selection and fusion. The proposed method is composed of two phases: one for designing the individual classifiers, and one for clustering the patterns of the training set and searching for a set of classifiers for each cluster found. In our experiments, we adopted artificial neural networks in the classification phase and self-organizing maps in the clustering phase. Differential evolution with global and local neighborhoods is used to optimize the parameters and performance of the techniques in the classification and clustering phases. The experimental results show that the proposed method performs better than manual methods and significantly outperforms most of the methods commonly used to combine multiple classifiers on a set of 4 benchmark problems.
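The routing structure of such a selection scheme, clustering the training set and dispatching each query to its cluster's classifier, can be sketched in a few lines. The paper uses self-organizing maps and evolved neural networks; here fixed centroids and a 1-nearest-neighbour rule stand in, and all data is synthetic.

```python
# Per-cluster classifier selection, minimally: partition the training set
# by nearest centroid, then answer each query with the classifier (here,
# 1-NN) trained on the query's cluster only.
import numpy as np

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(3, 0.3, (20, 2))])
y = np.array([0] * 20 + [1] * 20)

# "Clustering phase": two fixed centroids partition the training set.
centroids = np.array([[0.0, 0.0], [3.0, 3.0]])
assign = np.argmin(np.linalg.norm(X[:, None] - centroids[None], axis=2), axis=1)

def predict(x):
    c = np.argmin(np.linalg.norm(centroids - x, axis=1))   # select cluster
    members = X[assign == c]                               # its training data
    nearest = np.argmin(np.linalg.norm(members - x, axis=1))
    return y[assign == c][nearest]                         # cluster's classifier

print(predict(np.array([0.1, -0.2])), predict(np.array([2.9, 3.1])))
```

Swapping the centroid step for a trained SOM and the 1-NN for per-cluster neural networks recovers the architecture the abstract describes.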
ibero-american conference on artificial intelligence | 2010
Adenilton J. da Silva; Nicole L. Mineu; Teresa Bernarda Ludermir
One of the main problems in the training of artificial neural networks is defining their initial weights and architecture. Evolutionary algorithms (EAs) are widely used to optimize artificial neural networks because they can deal with large, non-differentiable, complex and multimodal spaces, and because they are good at finding the optimal region. In this paper we propose the use of Adaptive Differential Evolution (JADE), an evolutionary algorithm based on differential evolution (DE), to tackle the problem of training neural networks. Experiments were performed to evaluate the proposed method on machine learning benchmarks for classification problems.
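The underlying DE loop that JADE builds on can be sketched fitting the weights of a single neuron. JADE adds adaptive F/CR parameters and an external archive on top of this scheme; the problem, population size and constants below are assumptions for the sketch.

```python
# Classic differential evolution (DE/rand/1/bin) fitting a one-neuron
# network's weights on a synthetic, linearly separable problem.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(30, 2))
y = (X @ np.array([1.5, -2.0]) > 0).astype(float)       # target labels

def loss(w):
    pred = 1 / (1 + np.exp(-(X @ w)))                   # sigmoid neuron
    return np.mean((pred - y) ** 2)

F, CR, pop = 0.8, 0.9, rng.normal(size=(20, 2))
for _ in range(100):
    for i in range(len(pop)):
        a, b, c = pop[rng.choice(len(pop), 3, replace=False)]
        mutant = a + F * (b - c)                        # differential mutation
        cross = rng.random(2) < CR                      # binomial crossover
        trial = np.where(cross, mutant, pop[i])
        if loss(trial) <= loss(pop[i]):                 # greedy selection
            pop[i] = trial

best = min(pop, key=loss)
print(round(loss(best), 3))
```

Note that the loop never needs gradients of the loss, which is why DE variants can handle the non-differentiable, multimodal search spaces the abstract mentions.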
brazilian symposium on neural networks | 2010
Adenilton J. da Silva; Wilson Rosa de Oliveira; Teresa Bernarda Ludermir
The success of quantum computation is most commonly associated with the speed-up of classical algorithms, as in Shor's factoring algorithm and Grover's search algorithm. But it should also be associated with exponential storage capacity, as in superdense coding. In this work we use a probabilistic quantum memory proposed by Trugenberger, in which one can store 2^n patterns with only n quantum bits (qubits). We define a new model of a quantum weightless neural node with this memory, in a fashion similar to how the classical Random Access Memory (RAM) node is used in classical weightless neural networks. Advantages of the proposed model are that the memory of the node does not grow exponentially with the number of inputs and that the node can generalise.
brazilian symposium on neural networks | 2010
Adenilton J. da Silva; Teresa Bernarda Ludermir; Wilson Rosa de Oliveira
Neurocomputing | 2016
Fernando Moraes Neto; Wilson Rosa de Oliveira; Adenilton J. da Silva; Teresa Bernarda Ludermir
Neurocomputing | 2016
Adenilton J. da Silva; Wilson Rosa de Oliveira; Teresa Bernarda Ludermir
international symposium on neural networks | 2012
Tiago P. F. de Lima; Adenilton J. da Silva; Teresa Bernarda Ludermir
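The capacity claim, 2^n patterns held in n qubits, can be seen in a toy state-vector view: n qubits span a 2^n-dimensional amplitude vector, so one superposition can hold any subset of the 2^n binary patterns at once. This ignores Trugenberger's actual storage circuit and probabilistic retrieval rule; the stored patterns below are an arbitrary example.

```python
# Toy state-vector view of the storage claim: n qubits give a 2**n-dim
# amplitude vector, so one superposition can hold many binary patterns.
import numpy as np

n = 4                                   # qubits in the memory register
stored = [0b0011, 0b0101, 0b1110]       # patterns kept in superposition
memory = np.zeros(2 ** n)
memory[stored] = 1 / np.sqrt(len(stored))

# Measuring the register yields each stored pattern with equal probability
# and never yields an unstored one.
probs = memory ** 2
print(probs[0b0011], probs[0b0000])
```

The classical RAM node would need 2^n memory cells for the same address space, which is the exponential growth the proposed node avoids.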