Marco Protasi
University of Rome Tor Vergata
Publications
Featured research published by Marco Protasi.
international symposium on neural networks | 1997
P. Frasconi; Marco Gori; Stefano Fanelli; Marco Protasi
We introduce the notion of suspect families of loading problems in an attempt to formalize situations in which classical learning algorithms based on local optimization are likely to fail (because of local minima or numerical precision problems). We show that any loading problem belonging to a nonsuspect family can be solved with optimal complexity by a canonical form of gradient descent with forced dynamics (i.e., for this class of problems no algorithm exhibits a better computational complexity than a slightly modified form of backpropagation). The analyses in this paper suggest intriguing links between the shape of the error surface attached to parametrical learning systems (such as neural networks) and the computational complexity of the corresponding optimization problem.
international symposium on neural networks | 1998
Monica Bianchini; Stefano Fanelli; Marco Gori; Marco Protasi
The effectiveness of connectionist models in emulating intelligent behaviour is strictly related to the capability of the learning algorithms to find optimal or near-optimal solutions. In this paper, a canonical reduction of gradient descent dynamics is proposed, allowing the formulation of neural network learning as a finite continuous optimisation problem, under some nonsuspiciousness conditions. In the linear case, the nonsuspect nature of the problem guarantees the implementation of an iterative method with O(n^2) computational complexity. Finally, since nonsuspiciousness is a generalisation of the concept of convexity, it is possible to apply this theory to the resolution of nonlinear problems.
international symposium on neural networks | 1993
Stefano Fanelli; M. Di Martino; Marco Protasi
An alternative to the backpropagation algorithm for the effective training of multi-layer perceptron (MLP) networks with binary output is described. The algorithm determines the optimal set of weights by an iterative scheme based on the singular value decomposition (SVD) method and the Fletcher and Reeves version of the conjugate gradient method.
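The abstract above names two ingredients: a singular-value-decomposition least-squares solve and the Fletcher and Reeves conjugate gradient update. A minimal generic sketch of each, assuming a linear layer with hidden activations H and target vector t, follows; the function names, step size, and fixed-step iteration are illustrative stand-ins, not the authors' actual scheme.

```python
import numpy as np

def svd_least_squares(H, t):
    """Solve min_w ||H w - t||_2 via the pseudoinverse built from an SVD of H."""
    U, s, Vt = np.linalg.svd(H, full_matrices=False)
    s_inv = np.where(s > 1e-12, 1.0 / s, 0.0)  # guard against tiny singular values
    return Vt.T @ (s_inv * (U.T @ t))

def fletcher_reeves(grad, w, lr=1e-2, iters=5):
    """Conjugate gradient with the Fletcher-Reeves beta; a fixed step size
    stands in for the line search a full implementation would use."""
    g = grad(w)
    d = -g                                   # initial search direction: steepest descent
    for _ in range(iters):
        w = w + lr * d
        g_new = grad(w)
        beta = (g_new @ g_new) / (g @ g)     # Fletcher-Reeves ratio
        d = -g_new + beta * d                # new conjugate direction
        g = g_new
    return w
```

In practice the SVD step solves the linear output weights exactly, while the conjugate gradient step refines the remaining nonlinear parameters; the pairing of the two is what the abstract's iterative scheme alludes to.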
international conference on mathematics of neural networks: models algorithms and applications | 1997
Monica Bianchini; Stefano Fanelli; Marco Gori; Marco Protasi
This paper deals with optimal learning and provides a unified viewpoint of most significant results in the field. The focus is on the problem of local minima in the cost function that is likely to affect more or less any learning algorithm. We give some intriguing links between optimal learning and the computational complexity of loading problems. We exhibit a computational model such that the solution of all loading problems giving rise to unimodal error functions require the same time, thus suggesting that they belong to the same computational class.
Archive | 1994
M. Di Martino; Stefano Fanelli; Marco Protasi
In this paper the authors consider some recent "direct methods" for the training of a three-layered feedforward neural network with thresholds: Algorithms II and III (Barmann et al., 1992), here named FBFBK; the Least Squares Backpropagation (LSB) algorithm (Barmann et al., 1993); and their innovative method, the Iterative Conjugate Gradient Singular Value Decomposition (ICGSVD) algorithm (Di Martino et al., 1993).
IEEE Transactions on Neural Networks | 1996
M. Di Martino; Stefano Fanelli; Marco Protasi
international colloquium on automata languages and programming | 1988
Ludek Kucera; Alberto Marchetti-Spaccamela; Marco Protasi
international symposium on neural networks | 1993
M. Di Martino; Stefano Fanelli; Marco Protasi
Proceedings of SPIE | 1993
Alessandra Di Medio; Stefano Fanelli; Marco Protasi
Archive | 1999
Giorgio Ausiello; Pierluigi Crescenzi; Giorgio Gambosi; Viggo Kann; Alberto Marchetti-Spaccamela; Marco Protasi