Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where John E. Moody is active.

Publication


Featured research published by John E. Moody.


Neural Computation | 1989

Fast learning in networks of locally-tuned processing units

John E. Moody; Christian J. Darken

We propose a network architecture which uses a single internal layer of locally-tuned processing units to learn both classification tasks and real-valued function approximations (Moody and Darken 1988). We consider training such networks in a completely supervised manner, but abandon this approach in favor of a more computationally efficient hybrid learning method which combines self-organized and supervised learning. Our networks learn faster than backpropagation for two reasons: the local representations ensure that only a few units respond to any given input, thus reducing computational overhead, and the hybrid learning rules are linear rather than nonlinear, thus leading to faster convergence. Unlike many existing methods for data analysis, our network architecture and learning rules are truly adaptive and are thus appropriate for real-time use.
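
A minimal sketch of the hybrid scheme the abstract describes, on a 1-D regression task with Gaussian units assumed: centers are placed by self-organized k-means, and the linear output weights are then solved by least squares. Names such as `fit_rbf` and `width_scale` are illustrative, not from the paper.

```python
import numpy as np

def fit_rbf(X, y, n_units=10, width_scale=1.0, n_iters=50, seed=0):
    """Phase 1 (self-organized): place unit centers with batch k-means.
    Phase 2 (supervised): solve the linear output weights by least squares."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), n_units, replace=False)].copy()
    for _ in range(n_iters):
        dists = np.linalg.norm(X[:, None] - centers[None], axis=-1)
        assign = dists.argmin(axis=1)
        for j in range(n_units):
            if np.any(assign == j):
                centers[j] = X[assign == j].mean(axis=0)
    # Each unit's width comes from the distance to its nearest neighboring center.
    cc = np.linalg.norm(centers[:, None] - centers[None], axis=-1)
    widths = width_scale * np.sort(cc, axis=1)[:, 1]
    # Locally tuned responses: only units near a given input respond appreciably.
    Phi = np.exp(-0.5 * (np.linalg.norm(X[:, None] - centers[None], axis=-1) / widths) ** 2)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)   # linear rule, hence fast convergence
    return centers, widths, w

def predict_rbf(Xnew, centers, widths, w):
    Phi = np.exp(-0.5 * (np.linalg.norm(Xnew[:, None] - centers[None], axis=-1) / widths) ** 2)
    return Phi @ w

# Usage: approximate a noisy sine wave.
X = np.linspace(0, 2 * np.pi, 200)[:, None]
y = np.sin(X[:, 0]) + 0.05 * np.random.default_rng(1).normal(size=len(X))
centers, widths, w = fit_rbf(X, y, n_units=12)
y_hat = predict_rbf(X, centers, widths, w)
```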


international symposium on neural networks | 1990

Fast adaptive k-means clustering: some empirical results

Christian J. Darken; John E. Moody

The authors present learning rate schedules for fast adaptive k-means clustering which surpass the standard MacQueen learning rate schedule (J. MacQueen, 1967) in speed and quality of solution by several orders of magnitude for large k. The methods accomplish this by largely overcoming the problems of metastable local minima and nonstationarity of cluster region boundaries which plague the MacQueen approach. The authors use simulation results to compare the clustering performance of four learning rate schedules applied to independently sampled data from a uniform distribution in one and two dimensions.
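
A rough sketch of online k-means with pluggable per-cluster learning rate schedules, contrasting MacQueen's 1/n rate with a slower-decaying schedule of "search-then-converge" form; the exact schedules and constants in the paper may differ, so `eta0` and `tau` here are illustrative assumptions.

```python
import numpy as np

def online_kmeans(X, k, schedule, seed=0):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)].astype(float)
    counts = np.zeros(k)                                    # per-cluster update counts
    for x in X:
        j = np.argmin(np.linalg.norm(centers - x, axis=1))  # winning center
        counts[j] += 1
        centers[j] += schedule(counts[j]) * (x - centers[j])
    return centers

macqueen = lambda n: 1.0 / n                                # standard MacQueen rate
stc = lambda n, eta0=0.5, tau=50.0: eta0 / (1.0 + n / tau)  # slower early decay

X = np.random.default_rng(2).uniform(size=(5000, 2))        # uniform data, as in the paper
c_macqueen = online_kmeans(X, k=100, schedule=macqueen)
c_stc = online_kmeans(X, k=100, schedule=stc)
```

The slower early decay lets a center keep moving long enough to escape metastable assignments before its rate shrinks, which is the failure mode of the 1/n rate for large k.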


Neural Networks for Signal Processing II Proceedings of the 1992 IEEE Workshop | 1992

Learning rate schedules for faster stochastic gradient search

Christian J. Darken; Joseph T. Chang; John E. Moody

The authors propose a new methodology for creating the first automatically adapting learning rates that achieve the optimal rate of convergence for stochastic gradient descent. Empirical tests agree with theoretical expectations that drift can be used to determine whether the crucial parameter c is large enough. Using this statistic, it will be possible to produce the first adaptive learning rates which converge at optimal speed.
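
A hedged sketch of the idea: run stochastic gradient descent with a rate that decays asymptotically like c/t, and monitor a drift statistic over recent updates, since persistent motion in one direction suggests c is too small. The statistic below is a simple stand-in, not the paper's exact formula.

```python
import numpy as np

def sgd_with_drift(grad, theta0, c=1.0, t0=10.0, n_steps=5000, window=1000, seed=0):
    rng = np.random.default_rng(seed)
    theta = np.array(theta0, dtype=float)
    deltas = []
    for t in range(1, n_steps + 1):
        eta = c / (t0 + t)                 # asymptotically eta_t ~ c/t
        delta = -eta * grad(theta, rng)
        theta += delta
        deltas.append(delta.copy())
    d = np.array(deltas[-window:])
    # Drift: mean update relative to its fluctuation. Near zero once the
    # iterate hovers around the optimum; large if it is still sliding.
    drift = np.abs(d.mean(axis=0)) / (d.std(axis=0) + 1e-12)
    return theta, drift

# Usage on a noisy 1-D quadratic (gradient of theta^2 plus unit noise).
g = lambda th, rng: 2.0 * th + rng.normal(size=th.shape)
theta, drift = sgd_with_drift(g, theta0=[5.0], c=1.0)
```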


international conference on artificial intelligence and applications | 1991

Selecting neural network architectures via the prediction risk: application to corporate bond rating prediction

Joachim Utans; John E. Moody

The notion of generalization can be defined precisely as the prediction risk, the expected performance of an estimator on new observations. The authors propose the prediction risk as a measure of the generalization ability of multi-layer perceptron networks and use it to select the optimal network architecture. The prediction risk must be estimated from the available data. The authors approximate the prediction risk by v-fold cross-validation and by the asymptotic estimates of generalized cross-validation and H. Akaike's (1970) final prediction error. They apply the technique to the problem of predicting corporate bond ratings. This problem is very attractive as a case study, since it is characterized by the limited availability of data and by the lack of complete a priori information that could be used to impose a structure on the network architecture.
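
A minimal sketch of architecture selection via the cross-validated prediction risk. For brevity, polynomial regressions of increasing degree stand in for the candidate multi-layer perceptron architectures; `make_fit` and `degree` are illustrative stand-ins, not the paper's models.

```python
import numpy as np

def cv_prediction_risk(X, y, fit, predict, v=5, seed=0):
    """Average held-out squared error over v folds as a prediction-risk estimate."""
    idx = np.random.default_rng(seed).permutation(len(X))
    risks = []
    for fold in np.array_split(idx, v):
        train = np.setdiff1d(idx, fold)
        model = fit(X[train], y[train])
        risks.append(np.mean((predict(model, X[fold]) - y[fold]) ** 2))
    return float(np.mean(risks))

# Candidate "architectures": polynomial degrees standing in for hidden-unit counts.
def make_fit(degree):
    return lambda X, y: np.linalg.lstsq(np.vander(X[:, 0], degree + 1), y, rcond=None)[0]

def make_predict(degree):
    return lambda w, X: np.vander(X[:, 0], degree + 1) @ w

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(100, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.normal(size=len(X))
risks = {deg: cv_prediction_risk(X, y, make_fit(deg), make_predict(deg)) for deg in range(1, 10)}
best = min(risks, key=risks.get)   # architecture with the lowest estimated prediction risk
```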


Neural Computation | 1990

Spontaneous development of modularity in simple cortical models

Alex Chernjavsky; John E. Moody

The existence of modular structures in the organization of nervous systems (e.g., cortical columns, patches of neostriatum, and olfactory glomeruli) is well known. However, the detailed dynamic mechanisms by which such structures develop remain a mystery. We propose a mechanism for the formation of modular structures that utilizes a combination of intrinsic network dynamics and Hebbian learning. Specifically, we show that under certain conditions, layered networks can support spontaneous localized activity patterns, which we call collective excitations, even in the absence of localized or spatially correlated afferent stimulation. These collective excitations can then induce the formation of modular structures in both the afferent and lateral connections via a Hebbian learning mechanism. The networks we consider are spatially homogeneous before learning, but the spontaneous emergence of localized collective excitations and the consequent development of modules in the connection patterns break translational symmetry. The essential conditions required to support collective excitations include internal units with sufficiently high gains and certain patterns of lateral connectivity. Our proposed mechanism is likely to play a role in understanding more complex (and more biologically realistic) systems.
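
An illustrative toy version of the mechanism, with assumed equation forms rather than the paper's exact model: a ring of units with short-range excitatory and longer-range inhibitory lateral weights is relaxed to a steady state for each spatially uncorrelated input, and a normalized Hebbian rule then updates the afferent weights, which can break the initial translational symmetry into modules.

```python
import numpy as np

n = 64
pos = np.arange(n)
d = np.minimum(np.abs(pos[:, None] - pos[None]), n - np.abs(pos[:, None] - pos[None]))
W_lat = 0.9 * np.exp(-d**2 / 8.0) - 0.4 * np.exp(-d**2 / 50.0)  # excite near, inhibit far
W_aff = np.full((n, n), 1.0 / n)                                # homogeneous before learning
gain = 4.0                        # sufficiently high unit gain supports collective excitations
rng = np.random.default_rng(0)

for step in range(200):
    s = rng.uniform(size=n)                        # spatially uncorrelated afferent input
    a = np.zeros(n)
    for _ in range(100):                           # relax: da/dt = -a + tanh(gain*(W_lat a + W_aff s))
        a += 0.1 * (-a + np.tanh(gain * (W_lat @ a + W_aff @ s)))
    a = np.maximum(a, 0)
    W_aff += 0.01 * np.outer(a, s)                 # Hebbian update on the afferents
    W_aff /= W_aff.sum(axis=1, keepdims=True)      # normalization keeps weights bounded
```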


international symposium on neural networks | 1990

Dynamics of lateral interaction networks

John E. Moody

It is pointed out that recurrent lateral connectivity in a layer of processing units gives rise to a rich variety of nonlinear response properties, such as overall gain control, emergent periodic response on a preferred spatial scale (collective excitations), and distributed winner-take-all response. This diversity of response properties is observed in several different classes of simple network architectures, including the additive linear network, the additive sigmoidal network, and the nonlinear shunting network. When Hebbian learning is coupled with network dynamics, these models have been shown to support the development of modular connectivity structures analogous to cortical columns.
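
A small sketch of the three dynamics classes named above, each written in one assumed standard form and integrated by Euler steps; the lateral weights are kept weak enough that the linear case converges.

```python
import numpy as np

n = 64
pos = np.arange(n)
dist = np.abs(pos[:, None] - pos[None])
W = 0.25 * np.exp(-dist**2 / 8.0) - 0.15 * np.exp(-dist**2 / 50.0)  # local excitation, broader inhibition
s = np.random.default_rng(0).uniform(size=n)                        # afferent input

def run(deriv, steps=500, dt=0.05):
    x = np.zeros(n)
    for _ in range(steps):
        x += dt * deriv(x)
    return x

additive_linear  = run(lambda x: -x + W @ x + s)            # additive linear network
additive_sigmoid = run(lambda x: -x + np.tanh(W @ x + s))   # additive sigmoidal network
# Shunting: excitation drives units toward a ceiling of 1, inhibition toward -1.
E, I = np.maximum(W, 0), np.maximum(-W, 0)
shunting = run(lambda x: -x + (1 - x) * (E @ x + s) - (1 + x) * (I @ x))
```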


neural information processing systems | 1991

The Effective Number of Parameters: An Analysis of Generalization and Regularization in Nonlinear Learning Systems

John E. Moody


Archive | 1988

Learning with localized receptive fields

John E. Moody; Christian J. Darken


neural information processing systems | 1988

Fast Learning in Multi-Resolution Hierarchies

John E. Moody


neural information processing systems | 1990

Note on Learning Rate Schedules for Stochastic Optimization

Christian J. Darken; John E. Moody

Collaboration


Dive into John E. Moody's collaborations.

Top Co-Authors
Alex Chernjavsky

Howard Hughes Medical Institute
