Stefano Fanelli
University of Rome Tor Vergata
Publications
Featured research published by Stefano Fanelli.
IEEE Transactions on Neural Networks | 2003
A. Bortoletti; C. Di Fiore; Stefano Fanelli; Paolo Zellini
In this paper, we present a new class of quasi-Newton methods for effective learning in large multilayer perceptron (MLP) networks. The algorithms introduced in this work, named LQN, use an iterative scheme of a generalized BFGS-type method involving a suitable family of matrix algebras L. The main advantages of these methods are an O(n log n) complexity per step and O(n) memory allocations. Numerical experiments, performed on a set of standard MLP-network benchmarks, show the competitiveness of the LQN methods, especially for large values of n.
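For context only, the classical BFGS Hessian update that the LQN scheme generalizes can be stated as follows; the projection onto the matrix algebra L that yields the O(n log n) cost is specific to the paper and is not reproduced here.

% Classical BFGS update of the Hessian approximation B_k,
% with step s_k = x_{k+1} - x_k and gradient difference y_k = \nabla f(x_{k+1}) - \nabla f(x_k):
B_{k+1} = B_k - \frac{B_k s_k s_k^{\top} B_k}{s_k^{\top} B_k s_k} + \frac{y_k y_k^{\top}}{y_k^{\top} s_k}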
Numerische Mathematik | 2003
Carmine Di Fiore; Stefano Fanelli; Filomena Lepore; Paolo Zellini
Summary. In this paper a new class of quasi-Newton methods, named ℒQN, is introduced to solve unconstrained minimization problems. The novel approach, which generalizes classical BFGS methods, is based on a Hessian updating formula involving an algebra ℒ of matrices simultaneously diagonalized by a fast unitary transform. The complexity per step of ℒQN methods is O(n log n), considerably improving on the computational efficiency of BFGS. Moreover, since the ℒQN iterative scheme uses single-indexed arrays, only O(n) memory allocations are required. Global convergence properties are investigated; in particular, a global convergence result is obtained under suitable assumptions on f. Numerical experiments [7] confirm that ℒQN methods are particularly well suited to large-scale problems.
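As an illustrative aside only (not the authors' code; the circulant algebra and function names below are stand-ins for ℒ), the following Python sketch shows why matrices simultaneously diagonalized by a fast unitary transform admit O(n log n) matrix-vector products with O(n) storage:

import numpy as np

def circulant_matvec(c, x):
    # Circulant matrices are diagonalized by the DFT, so C x equals the circular
    # convolution of c and x, computable with three FFTs in O(n log n) time.
    return np.real(np.fft.ifft(np.fft.fft(c) * np.fft.fft(x)))

n = 8
c = np.random.randn(n)            # O(n) parameters define the whole n x n matrix
x = np.random.randn(n)
y_fast = circulant_matvec(c, x)   # fast product without forming the matrix

# Check against the explicit O(n^2) product with the dense circulant matrix.
C = np.column_stack([np.roll(c, k) for k in range(n)])
assert np.allclose(y_fast, C @ x)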
Numerical Linear Algebra With Applications | 2005
Carmine Di Fiore; Stefano Fanelli; Paolo Zellini
Summary. Structured matrix algebras L and a generalized BFGS-type iterative scheme have recently been investigated to introduce low-complexity quasi-Newton methods, named LQN, for solving general (non-structured) minimization problems. In this paper we introduce the LkQN methods, which exploit
international symposium on neural networks | 1997
P. Frasconi; Marco Gori; Stefano Fanelli; Marco Protasi
We introduce the notion of suspect families of loading problems in an attempt to formalize situations in which classical learning algorithms based on local optimization are likely to fail (because of local minima or numerical precision problems). We show that any loading problem belonging to a nonsuspect family can be solved with optimal complexity by a canonical form of gradient descent with forced dynamics (i.e., for this class of problems no algorithm exhibits a better computational complexity than a slightly modified form of backpropagation). The analyses in this paper suggest intriguing links between the shape of the error surface attached to parametric learning systems (such as neural networks) and the computational complexity of the corresponding optimization problem.
international symposium on neural networks | 1998
Monica Bianchini; Stefano Fanelli; Marco Gori; Marco Protasi
The effectiveness of connectionist models in emulating intelligent behaviour is strictly related to the capability of the learning algorithms to find optimal or near-optimal solutions. In this paper, a canonical reduction of gradient descent dynamics is proposed, allowing the formulation of neural network learning as a finite continuous optimisation problem, under some nonsuspiciousness conditions. In the linear case, the nonsuspect nature of the problem guarantees the implementation of an iterative method with O(n²) computational complexity. Finally, since nonsuspiciousness is a generalisation of the concept of convexity, it is possible to apply this theory to the resolution of nonlinear problems.
international symposium on neural networks | 1993
Stefano Fanelli; M. Di Martino; Marco Protasi
An alternative to the backpropagation algorithm for the effective training of multilayer perceptron (MLP) networks with binary output is described. The algorithm determines the optimal set of weights by an iterative scheme based on the singular value decomposition (SVD) method and the Fletcher-Reeves version of the conjugate gradient method.
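As a hedged illustration of the SVD ingredient only (the single least-squares step below is hypothetical and is not the full iterative algorithm described in the paper), output-layer weights of an MLP can be fitted to given hidden activations via the SVD:

import numpy as np

def svd_least_squares(H, T):
    # Minimize ||H W - T||_F using the pseudo-inverse W = V S^+ U^T T.
    U, s, Vt = np.linalg.svd(H, full_matrices=False)
    s_inv = np.where(s > 1e-12, 1.0 / s, 0.0)   # invert only nonzero singular values
    return Vt.T @ (s_inv[:, None] * (U.T @ T))

m, h, o = 100, 20, 3
H = np.tanh(np.random.randn(m, h))   # hypothetical hidden-layer activations
T = np.sign(np.random.randn(m, o))   # binary targets in {-1, +1}
W = svd_least_squares(H, T)
print(np.linalg.norm(H @ W - T))     # residual of the least-squares fit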
international conference on mathematics of neural networks models algorithms and applications | 1997
Monica Bianchini; Stefano Fanelli; Marco Gori; Marco Protasi
This paper deals with optimal learning and provides a unified viewpoint of the most significant results in the field. The focus is on the problem of local minima in the cost function, which is likely to affect virtually any learning algorithm. We give some intriguing links between optimal learning and the computational complexity of loading problems. We exhibit a computational model in which the solution of all loading problems giving rise to unimodal error functions requires the same time, thus suggesting that they belong to the same computational class.
Archive | 1994
M. Di Martino; Stefano Fanelli; Marco Protasi
In this paper the authors consider some recent "direct methods" for the training of a three-layered feedforward neural network with thresholds: Algorithms II and III (Barmann et al., 1992), here named FBFBK; the Least Squares Backpropagation (LSB) algorithm (Barmann et al., 1993); and their innovative method, the Iterative Conjugate Gradient Singular Value Decomposition (ICGSVD) algorithm (Di Martino et al., 1993).
International Scholarly Research Notices | 2011
Stefano Fanelli
This paper presents a general and comprehensive description of Optimization Methods and Algorithms from a novel viewpoint. It is shown, in particular, that Direct Methods, Iterative Methods, and Computer Science Algorithms belong to a well-defined general class of both Finite and Infinite Procedures, characterized by suitable descent directions.
international conference on neural information processing | 1999
C. Di Fiore; Stefano Fanelli; Paolo Zellini