Publication


Featured research published by Neil E. Cotter.


IEEE Transactions on Neural Networks | 1990

The Stone-Weierstrass theorem and its application to neural networks

Neil E. Cotter

The Stone-Weierstrass theorem and its terminology are reviewed, and neural network architectures based on this theorem are presented. Specifically, exponential functions, polynomials, partial fractions, and Boolean functions are used to create networks capable of approximating arbitrary bounded measurable functions. A modified logistic network satisfying the theorem is proposed as an alternative to commonly used networks based on logistic squashing functions.
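The constructive flavor of such networks can be suggested with a small numerical sketch (my own illustration, not code from the paper): linear combinations of exponentials exp(w·x) form an algebra that separates points, so by the Stone-Weierstrass theorem they can uniformly approximate any continuous function on a compact interval. Here a fixed random set of exponents plays the role of hidden units, and the output weights are fit by least squares.

```python
import numpy as np

# Illustrative sketch (exponent range, unit count, and target are mine):
# sums of exponentials exp(w*x) approximating a continuous function.
rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 200)
target = np.abs(x)                      # continuous target on [-1, 1]

w = rng.uniform(-3.0, 3.0, size=20)     # fixed "hidden" exponents
Phi = np.exp(np.outer(x, w))            # design matrix of exponential units
c, *_ = np.linalg.lstsq(Phi, target, rcond=None)  # output-layer weights

err = np.max(np.abs(Phi @ c - target))  # sup-norm approximation error
```

Adding more exponential units shrinks the sup-norm error, mirroring the density argument in the theorem.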


Neural Networks | 1992

Original Contribution: The CMAC and a theorem of Kolmogorov

Neil E. Cotter; Thierry J. Guillerm

This paper shows that the Cerebellar Model Articulation Controller (CMAC) is structurally similar to networks derived from a theorem of Kolmogorov. As a foundation for this comparison, we review a proof of Kolmogorov's theorem. From this proof and an analysis of the CMAC, we derive two lemmas describing functions that cannot be modeled by a CMAC. The first lemma states that such functions have zero average value over the response regions of CMAC association cells. The second lemma states that such functions have local oscillations exceeding a quantifiable percentage of the global maximum absolute value of error. This second lemma gives bounds on errors caused by hash tables used as association cells in the CMAC. We present three examples illustrating the lemmas.
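A minimal one-dimensional CMAC can make the association-cell structure concrete (a sketch with tile counts and step size chosen by me for illustration, not taken from the paper): each of several staggered tilings activates exactly one association cell per input, and the output is the sum of the active cells' weights, trained by LMS.

```python
import numpy as np

# Minimal 1-D CMAC sketch (assumed details; cell width, tiling count, and
# the LMS step size are illustrative).
N_TILINGS = 8          # number of overlapping tilings (association layers)
CELLS = 16             # cells per tiling over the input range [0, 1)
weights = np.zeros((N_TILINGS, CELLS))

def active_cells(x):
    """Index of the one active association cell in each tiling for input x."""
    offsets = np.arange(N_TILINGS) / (N_TILINGS * CELLS)  # staggered tilings
    return np.minimum(((x + offsets) * CELLS).astype(int), CELLS - 1)

def cmac(x):
    idx = active_cells(x)
    return weights[np.arange(N_TILINGS), idx].sum()

# LMS training on a smooth target function
rng = np.random.default_rng(1)
for _ in range(5000):
    x = rng.uniform(0.0, 1.0)
    err = np.sin(2 * np.pi * x) - cmac(x)
    idx = active_cells(x)
    weights[np.arange(N_TILINGS), idx] += 0.1 * err / N_TILINGS

test_x = np.linspace(0.05, 0.95, 50)
mse = np.mean([(np.sin(2 * np.pi * x) - cmac(x)) ** 2 for x in test_x])
```

The lemmas concern the blind spots of this structure: any component of the target that averages to zero over every active cell's response region leaves the weight updates, and hence the output, unchanged.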


International Symposium on Neural Networks | 1990

Fixed-weight networks can learn

Neil E. Cotter; Peter R. Conwell

A theorem describing how fixed-weight recurrent neural networks can approximate adaptive-weight learning algorithms is proved. The theorem applies to most networks and learning algorithms currently in use. It is concluded from the theorem that a system which exhibits learning behavior may exhibit no synaptic weight modifications. This idea is demonstrated by transforming a backward error propagation network into a fixed-weight system.
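The flavor of the result can be suggested with a toy (my own sketch, not the paper's transformation): the system below has no adjustable parameters at all; its update rule is a fixed dynamical law. The quantities an ordinary LMS learner would call "weights" appear instead as state variables, and the (x, y) training pairs arrive as inputs, so learning behavior emerges with no synaptic modification.

```python
import numpy as np

def fixed_dynamics(state, inputs, lr=0.05):
    """One step of the hard-wired dynamics: the state encodes what an
    adaptive learner would store in its weights (LMS step as a fixed law)."""
    x, y = inputs
    return state + lr * (y - state @ x) * x

rng = np.random.default_rng(2)
true_w = np.array([1.5, -0.5])          # ground-truth linear map to be learned
state = np.zeros(2)                     # initial state, not "weights"
for _ in range(2000):
    x = rng.normal(size=2)
    y = true_w @ x                      # noiseless training pair fed as input
    state = fixed_dynamics(state, (x, y))
```

After enough examples the state converges to the underlying linear map, even though nothing in the system's own definition ever changed.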


IEEE Transactions on Neural Networks | 1992

A pulsed neural network capable of universal approximation

Neil E. Cotter; Omar N. Mian

The authors describe a pulsed network version of the cerebellar model articulation controller (CMAC), popularized by Albus (1981). The network produces output pulses whose times of occurrence are a function of input pulse intervals. Within limits imposed by causality conditions, this function can approximate any bounded measurable function on a compact domain. Simulation results demonstrate the viability of training the network with a least mean square algorithm.


International Symposium on Neural Networks | 1991

Learning algorithms and fixed dynamics

Neil E. Cotter; Peter R. Conwell

The authors discuss the equivalence of learning algorithms and nonlinear dynamic systems whose differential equations have fixed coefficients. They show how backpropagation transforms into a fixed-weight recursive neural network suitable for VLSI or optical implementations. The transformation is quite general and implies that understanding physiological networks may require one to determine the values of fixed parameters distributed throughout a network. Equivalently, a particular synaptic weight update mechanism such as Hebbian learning could likely be used to implement many known learning algorithms. The authors use the transformation process to illustrate why a network whose only variable weights are hidden-layer thresholds is capable of universal approximation.


International Symposium on Neural Networks | 1990

Neural network based controllers for a single-degree-of-freedom robotic arm

K. Wilhelmsen; Neil E. Cotter

The properties of different neural network architectures in adaptive nonlinear robotic control are examined. For the comparison of architectures, a specific robotic problem was developed. This robotic system was controlled by three different neural-network-based architectures, and the controllers were analyzed and compared. It was found that significant improvements in control can be made by tailoring the neural network inputs and error structure. Also, temporal shifting of error information in the neural network backward error propagation can modify the spectral density of the controller function.


Neural Computation | 1993

Universal approximation by phase series and fixed-weight networks

Neil E. Cotter; Peter R. Conwell

In this note we show that weak (specified energy bound) universal approximation by neural networks is possible if variable synaptic weights are brought in as network inputs rather than being embedded in a network. We illustrate this idea with a Fourier series network that we transform into what we call a phase series network. The transformation only increases the number of neurons by a factor of two.
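The phase-encoding trick can be checked numerically (my reading of the abstract, with illustrative amplitudes): the identity cos(t + p) + cos(t − p) = 2·cos(p)·cos(t) lets a Fourier term a·cos(kx) with |a| ≤ 2 be realized by two fixed, unit-amplitude cosine units whose phase p = arccos(a/2) enters as a network input. The amplitude bound matches the "specified energy bound" of weak approximation, and the pairing matches the factor-of-two growth in neuron count.

```python
import numpy as np

# Phase series sketch: amplitudes become phases, weights stay fixed.
x = np.linspace(0.0, 2 * np.pi, 400)
a = np.array([1.2, -0.7, 0.4])                     # amplitudes, |a_k| <= 2
k = np.arange(1, len(a) + 1)

# Ordinary Fourier series with variable amplitudes a_k
fourier = sum(a_k * np.cos(k_k * x) for a_k, k_k in zip(a, k))

# Phase series: two unit-amplitude units per term, phases as inputs
phases = np.arccos(a / 2.0)
phase_net = sum(np.cos(k_k * x + p) + np.cos(k_k * x - p)
                for p, k_k in zip(phases, k))

gap = np.max(np.abs(fourier - phase_net))          # identical up to rounding
```

The two series agree to machine precision, since the substitution is an exact trigonometric identity rather than an approximation.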


International Symposium on Neural Networks | 1991

A diffusion process for global optimization in neural networks

Thierry J. Guillerm; Neil E. Cotter

The authors modify the usual gradient descent method so that the process in the weight space has a Gibbs or Boltzmann distribution, in order to find the global minima of the average performance measure of a neural network. The goal is to present a method which guarantees that a global minimum of the average performance measure in the weight space will be located, given sufficient computational time. The method of simulated annealing is a mathematical tool which forces a system to behave like a natural annealing process. The method chosen for the global optimization of continuous networks is based on modifying the differential equation associated with local optimization. The global optimization theory is derived for networks whose learning rules are supervised, whose nodes are bounded Lipschitz continuous functions, and whose performance measure is smooth.
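A hedged sketch of the idea (the cost function, step size, and cooling schedule below are mine, not the paper's): adding annealed Gaussian noise to gradient descent gives the weight process a Boltzmann-like stationary distribution, so with a slow enough cooling schedule it can escape local minima and concentrate near the global minimum of the performance measure.

```python
import numpy as np

rng = np.random.default_rng(3)

def cost(w):
    # Double-well performance measure: local minimum near w = -0.62,
    # global minimum near w = 0.77 (illustrative only).
    return w**4 - w**2 - 0.3 * w

def grad(w):
    return 4 * w**3 - 2 * w - 0.3

w = -0.62                                 # start inside the local basin
w_best = w
lr = 1e-3
for t in range(1, 200001):
    T = 1.0 / np.log(1.0 + t)             # slow (logarithmic) cooling
    # Noisy gradient step: drift down the gradient, diffuse with
    # variance proportional to the current temperature T.
    w += -lr * grad(w) + np.sqrt(2 * lr * T) * rng.normal()
    if cost(w) < cost(w_best):
        w_best = w
```

Plain gradient descent from the same starting point would stay trapped in the local basin; the diffusion term is what lets the process cross the barrier.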


International Symposium on Neural Networks | 1991

Neural network based pattern recognition for sequenced DNA autoradiograms

M. Murdock; Neil E. Cotter; R. Gesteland

Summary form only given. The three-layer, backward error propagation neural network has been applied to the problem of reading sequenced DNA autoradiograms. The network is used for band identification by extracting two features: band intensity level and band intensity gradient. A training set of 16,000 12×12 pixel patterns is generated using an autoradiogram degradation model that accounts for radioisotope source crossfire, background, diffusion, contrast, surface stress artifacts, film grain, and quantum and convolutional noise. Trained with those patterns, the network successfully classified images from five previously unseen autoradiograms according to whether these two low-level features were present and provided the degree to which the features were present or absent.


International Symposium on Neural Networks | 1989

Convergence properties of a pulsed neural network

Omar N. Mian; Neil E. Cotter

Summary form only given. A neural network based on pulse-width-modulated signals is described. Simulation results for the pulsed equivalent of a two-neuron Hopfield network show that proper convergence is achieved without heavy low-pass filtering. This suggests that temporal information may be of little importance in a certain class of pulsed neural network architectures.
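As a point of reference for the convergence claim, here is the analog (non-pulsed) baseline the pulsed network is compared against (my own sketch, not the paper's PWM implementation): a two-neuron continuous Hopfield network with symmetric coupling settles to a stable fixed point of its dynamics.

```python
import numpy as np

# Analog two-neuron Hopfield dynamics: symmetric weights guarantee
# convergence to a fixed point (parameters are illustrative).
W = np.array([[0.0, -1.0],
              [-1.0, 0.0]])                    # symmetric, mutually inhibitory
b = np.array([1.0, -1.0])                      # constant bias inputs
u = np.array([0.5, -0.3])                      # initial neuron states
dt = 0.01
for _ in range(5000):
    u = u + dt * (-u + W @ np.tanh(u) + b)     # Euler step of the dynamics

# At a fixed point the right-hand side of the dynamics vanishes.
residual = np.linalg.norm(-u + W @ np.tanh(u) + b)
```

The pulsed version replaces the continuous outputs tanh(u) with pulse-width-modulated signals; the paper's finding is that convergence to the same kind of fixed point survives even without heavy low-pass filtering of those pulses.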

Collaboration


Dive into Neil E. Cotter's collaboration.

Top Co-Authors

Lee Brinton

Salt Lake Community College
