
Publication


Featured research published by Michael P. Perrone.


international symposium on neural networks | 1992

A soft-competitive splitting rule for adaptive tree-structured neural networks

Michael P. Perrone

An algorithm for generating tree structured neural networks using a soft-competitive recursive partitioning rule is described. It is demonstrated that this algorithm grows robust, honest estimators. Preliminary results on a 10-class, 240-dimensional optical character recognition classification task show that the tree outperforms backpropagation. Arguments are made that suggest why this should be the case. The connection of the soft-competitive splitting rule to the twoing rule is described.
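The abstract above describes soft-competitive recursive partitioning only at a high level. A minimal sketch of the general idea (soft assignment of points to two competing child prototypes, not the paper's exact splitting rule) might look like this; `beta` and the prototype initialization are assumptions for illustration:

```python
import numpy as np

def soft_competitive_split(X, beta=1.0, n_iter=50):
    """Softly split data X into two groups via competing prototypes.

    Generic illustration only (not the paper's exact rule): each point
    gets softmax responsibilities over two prototypes based on negative
    distance; prototypes move to the responsibility-weighted mean.
    """
    # Initialize the two prototypes at the first and last data points
    # (any two distinct points work for this sketch).
    c = np.stack([X[0], X[-1]])
    for _ in range(n_iter):
        # Distances of every point to both prototypes, shape (n, 2).
        d = np.linalg.norm(X[:, None, :] - c[None, :, :], axis=2)
        w = np.exp(-beta * d)
        w /= w.sum(axis=1, keepdims=True)      # soft responsibilities
        c = (w.T @ X) / w.sum(axis=0)[:, None]  # weighted child means
    return w  # column k = soft membership of each point in child k
```

Each node of the tree would apply such a split recursively, growing children until a stopping criterion is met.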


international symposium on neural networks | 1991

A novel recursive partitioning criterion

Michael P. Perrone

Summary form only given, as follows. A data-driven algorithm for partitioning many-class classification problems has been developed. The algorithm generates tree-structured hybrid networks with controller nets at tree branches and local expert nets at the leaves. The controller nets recursively partition the feature space according to a novel misclassification minimization rule designed to create groupings of the classes which simplify the classification task. Each local expert is trained only on a subset of the training data corresponding to one of the partitions. The advantage of this approach is that the classification task that each local expert performs is greatly simplified. This simplification helps to avoid the curse of dimensionality and scaling problems by allowing the local expert nets to focus their search for structure in a small portion of the input space.
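The controller/local-expert decomposition described above can be sketched in a few lines. This is a generic two-leaf illustration under simplifying assumptions (a fixed median-threshold gate and least-squares linear experts); it does not reproduce the paper's misclassification-minimization splitting rule:

```python
import numpy as np

class ControllerExpertTree:
    """Two-leaf sketch: a controller routes inputs to local experts.

    The controller here is a simple median threshold on one feature;
    each local expert is a least-squares linear model fit only on the
    training data routed to it, so each expert's sub-problem is
    simpler than the full problem.
    """
    def __init__(self, split_dim=0):
        self.split_dim = split_dim

    def fit(self, X, y):
        self.thresh = np.median(X[:, self.split_dim])
        left = X[:, self.split_dim] <= self.thresh
        self.experts = []
        for mask in (left, ~left):
            # Fit an affine model on this partition's data only.
            A = np.hstack([X[mask], np.ones((mask.sum(), 1))])
            w, *_ = np.linalg.lstsq(A, y[mask], rcond=None)
            self.experts.append(w)
        return self

    def predict(self, X):
        A = np.hstack([X, np.ones((len(X), 1))])
        route = (X[:, self.split_dim] > self.thresh).astype(int)
        preds = np.stack([A @ w for w in self.experts], axis=1)
        return preds[np.arange(len(X)), route]
```

On a piecewise-linear target such as y = |x|, a single linear model fails but the two routed experts fit each piece exactly, which is the simplification the abstract refers to.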


conference on learning theory | 1995

Regression NSS: an alternative to cross validation

Michael P. Perrone; Brian S. Blais

The Noise Sensitivity Signature (NSS), originally introduced by Grossman and Lapedes (1993), was proposed as an alternative to cross validation for selecting network complexity. In this paper, we extend NSS to the general problem of regression estimation. We also present results from regularized linear regression simulations which indicate that for problems with few data points, NSS regression estimates perform better than Generalized Cross Validation (GCV) regression estimates [7].
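For context, the GCV baseline mentioned above has a standard closed form for regularized (ridge) linear regression: GCV(λ) = n‖y − Hy‖² / (n − tr H)², where H = X(XᵀX + λI)⁻¹Xᵀ is the hat matrix. A minimal sketch of GCV-based penalty selection follows; the NSS procedure itself (perturbing targets with noise and examining the fit's sensitivity) is not reproduced here:

```python
import numpy as np

def gcv_ridge(X, y, lambdas):
    """Select a ridge penalty by Generalized Cross Validation.

    For each candidate lam, form the ridge hat matrix
    H = X (X'X + lam I)^{-1} X' and score
    GCV(lam) = n * ||y - H y||^2 / (n - trace(H))^2.
    Returns the lambda with the smallest GCV score and all scores.
    """
    n, p = X.shape
    scores = []
    for lam in lambdas:
        H = X @ np.linalg.solve(X.T @ X + lam * np.eye(p), X.T)
        resid = y - H @ y
        scores.append(n * (resid @ resid) / (n - np.trace(H)) ** 2)
    return lambdas[int(np.argmin(scores))], scores
```

The paper's claim is that with few data points, NSS-based selection outperforms this GCV criterion for choosing the regularization level.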


Archive | 1992

When Networks Disagree: Ensemble Methods for Hybrid Neural Networks

Michael P. Perrone; Leon N. Cooper


Archive | 1993

Improving regression estimation: Averaging methods for variance reduction with extensions to general convex measure optimization

Michael P. Perrone


Archive | 1993

When networks disagree: Ensemble method for neural networks

Michael P. Perrone; Leon N. Cooper


neural information processing systems | 1993

Putting It All Together: Methods for Combining Neural Networks

Michael P. Perrone


Archive | 2008

Learning from What's Been Learned: Supervised Learning in Multi-Neural Network Systems

Michael P. Perrone; Leon N. Cooper


Muscle & Nerve | 1995

Dynamical analysis of neuromuscular transmission jitter

James M. Gilchrist; Michael P. Perrone; John Ross


neural information processing systems | 1994

The Ni1000: High Speed Parallel VLSI for Implementing Multilayer Perceptrons

Michael P. Perrone; Leon N. Cooper
