Michael P. Perrone
Brown University
Publication
Featured research published by Michael P. Perrone.
international symposium on neural networks | 1992
Michael P. Perrone
An algorithm for generating tree structured neural networks using a soft-competitive recursive partitioning rule is described. It is demonstrated that this algorithm grows robust, honest estimators. Preliminary results on a 10-class, 240-dimensional optical character recognition classification task show that the tree outperforms backpropagation. Arguments are made that suggest why this should be the case. The connection of the soft-competitive splitting rule to the twoing rule is described.
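The twoing rule referenced in this abstract is the standard CART splitting criterion, which scores a candidate binary split by how cleanly it separates the classes into two super-groups. The paper's soft-competitive rule is related to it; as context, here is a minimal sketch of the classical (hard) twoing criterion itself, not the paper's algorithm:

```python
import numpy as np

def twoing_criterion(y_left, y_right, classes):
    """CART twoing criterion for a candidate binary split.

    Score = (p_L * p_R / 4) * (sum_j |p(j|L) - p(j|R)|)^2.
    Larger values indicate a split that divides the classes
    into two more distinct super-groups.
    """
    n_left, n_right = len(y_left), len(y_right)
    n = n_left + n_right
    p_l, p_r = n_left / n, n_right / n
    # Per-class proportions on each side of the split.
    prop_l = np.array([np.mean(y_left == c) for c in classes])
    prop_r = np.array([np.mean(y_right == c) for c in classes])
    return (p_l * p_r / 4.0) * np.sum(np.abs(prop_l - prop_r)) ** 2
```

For a balanced split that perfectly separates two classes, the score reaches its two-class maximum of 0.25: (0.5 · 0.5 / 4) · (1 + 1)².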
international symposium on neural networks | 1991
Michael P. Perrone
Summary form only given, as follows. A data-driven algorithm for partitioning many-class classification problems has been developed. The algorithm generates tree-structured hybrid networks with controller nets at tree branches and local expert nets at the leaves. The controller nets recursively partition the feature space according to a novel misclassification minimization rule designed to create groupings of the classes which simplify the classification task. Each local expert is trained only on the subset of the training data corresponding to one of the partitions. The advantage of this approach is that the classification task each local expert performs is greatly simplified. This simplification helps to avoid the curse of dimensionality and scaling problems by allowing the local expert nets to focus their search for structure on a small portion of the input space.
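The controller/local-expert decomposition described in this abstract can be sketched generically: a controller routes each sample to a branch, and one expert is fit per branch on its subset alone. This is a minimal illustrative sketch, assuming a single two-way controller; the function names and the mean-predictor expert used in the example are hypothetical, not from the paper:

```python
import numpy as np

def train_hybrid(X, y, fit_expert, controller):
    """Route each training point through a pre-trained controller,
    then fit one local expert per branch on its subset only."""
    route = controller(X)  # integer branch label per sample
    experts = {}
    for branch in np.unique(route):
        mask = route == branch
        experts[branch] = fit_expert(X[mask], y[mask])
    return experts

def predict_hybrid(X, experts, controller):
    """Dispatch each query point to the expert owning its branch."""
    route = controller(X)
    out = np.empty(len(X))
    for branch, expert in experts.items():
        mask = route == branch
        out[mask] = expert(X[mask])
    return out
```

Because each expert only ever sees its own partition, it searches for structure in a small region of the input space, which is the scaling advantage the abstract describes.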
conference on learning theory | 1995
Michael P. Perrone; Brian S. Blais
The Noise Sensitivity Signature (NSS), originally introduced by Grossman and Lapedes (1993), was proposed as an alternative to cross validation for selecting network complexity. In this paper, we extend NSS to the general problem of regression estimation. We also present results from regularized linear regression simulations which indicate that for problems with few data points, NSS regression estimates perform better than Generalized Cross Validation (GCV) regression estimates [7].
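For context on the baseline the paper compares against, the GCV score for ridge regression has a standard closed form: GCV(λ) = n‖(I − A)y‖² / tr(I − A)², where A = X(XᵀX + λI)⁻¹Xᵀ is the hat matrix. A minimal sketch of that standard formulation (not code from the paper):

```python
import numpy as np

def gcv_score(X, y, lam):
    """Generalized Cross Validation score for ridge regression
    with penalty lam: n * ||(I - A) y||^2 / tr(I - A)^2,
    where A = X (X'X + lam*I)^{-1} X' is the hat matrix."""
    n, d = X.shape
    A = X @ np.linalg.solve(X.T @ X + lam * np.eye(d), X.T)
    resid = y - A @ y
    return n * (resid @ resid) / (n - np.trace(A)) ** 2

def select_lambda(X, y, grid):
    """Pick the penalty that minimizes the GCV score over a grid."""
    return min(grid, key=lambda lam: gcv_score(X, y, lam))
```

GCV approximates leave-one-out cross validation without refitting the model n times, which is why it serves as the natural comparison point for NSS-based model selection.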
Archive | 1992
Michael P. Perrone; Leon N. Cooper
Archive | 1993
Michael P. Perrone
Archive | 1993
Michael P. Perrone; Leon N. Cooper
neural information processing systems | 1993
Michael P. Perrone
Archive | 2008
Michael P. Perrone; Leon N. Cooper
Muscle & Nerve | 1995
James M. Gilchrist; Michael P. Perrone; John Ross
neural information processing systems | 1994
Michael P. Perrone; Leon N. Cooper