Network


Latest external collaborations at the country level. Dive into the details by clicking on the dots.

Hotspot


Dive into the research topics where John G. Elias is active.

Publication


Featured research published by John G. Elias.


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2001

Multi-Touch: A New Tactile 2-D Gesture Interface for Human-Computer Interaction

Wayne Carl Westerman; John G. Elias; Alan Hedge

The naturalness and variety of a touch-based hand gesture interface offer new opportunities for human-computer interaction. Using a new type of capacitive sensor array, a Multi-Touch Surface (MTS) can be created that is not limited in size, that can be presented in many configurations, that is robust under a variety of environmental operating conditions, and that is very thin. Typing and gesture recognition built into the Multi-Touch Surface allow users to type and perform bilateral gestures on the same surface area and in a smaller footprint than is required by current keyboard and mouse technologies. The present approach interprets asynchronous touches on the surface as conventional single-finger typing, while motions initiated by chords are interpreted as pointing, clicking, gesture commands, or hand resting. This approach requires learning only a few new chords for graphical manipulation, rather than a vocabulary of new chords for typing the whole alphabet. Graphical manipulation seems a better use of chords in today's computing environment.
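
The interpretation scheme sketched in this abstract, asynchronous single touches treated as keystrokes, chord-initiated motion as pointing or gesturing, and stationary chords as hand resting, can be illustrated with a short Python sketch. This is a hypothetical classifier, not the paper's recognizer; the event format, the 50 ms chord window, and the motion threshold are assumptions chosen for illustration.

    # Hypothetical sketch: classify touch activity as typing vs. chord gestures.
    # The event format, chord window, and motion threshold are illustrative
    # assumptions, not values from the paper.
    from dataclasses import dataclass

    @dataclass
    class Touch:
        finger_id: int
        t_down: float   # time the finger landed (seconds)
        path: list      # (x, y) samples recorded while the finger is down

    CHORD_WINDOW = 0.05    # fingers landing within 50 ms form one chord
    MOTION_THRESH = 5.0    # total travel that counts as deliberate motion

    def total_travel(path):
        return sum(abs(x2 - x1) + abs(y2 - y1)
                   for (x1, y1), (x2, y2) in zip(path, path[1:]))

    def classify(touches):
        """Group touches into chords by arrival time, then label each group."""
        touches = sorted(touches, key=lambda t: t.t_down)
        groups, current = [], [touches[0]]
        for t in touches[1:]:
            if t.t_down - current[-1].t_down <= CHORD_WINDOW:
                current.append(t)
            else:
                groups.append(current)
                current = [t]
        groups.append(current)

        labels = []
        for group in groups:
            moving = any(total_travel(t.path) > MOTION_THRESH for t in group)
            if len(group) == 1 and not moving:
                labels.append("keystroke")                    # asynchronous single touch
            elif moving:
                labels.append(f"{len(group)}-finger gesture")  # chord-initiated motion
            else:
                labels.append("resting hand")                 # stationary chord
        return labels

    if __name__ == "__main__":
        events = [
            Touch(0, 0.00, [(10, 10)]),                      # lone tap
            Touch(1, 0.50, [(40, 40), (60, 40), (80, 40)]),  # two fingers landing
            Touch(2, 0.52, [(50, 50), (70, 50), (90, 50)]),  # together and sliding
        ]
        print(classify(events))   # ['keystroke', '2-finger gesture']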


Neural Computation | 1993

Artificial dendritic trees

John G. Elias

The electronic architecture and dynamic signal-processing capabilities of an artificial dendritic tree that can be used to process and classify dynamic signals are described. The electrical circuit architecture is modeled after neurons that have spatially extensive dendritic trees. The artificial dendritic tree is a hybrid VLSI circuit and is sensitive to both temporal and spatial signal characteristics. It does not use the conventional neural network concept of weights, and as such it does not use multipliers, adders, look-up tables, microprocessors, or other complex computational units to process signals. The weights of conventional neural networks, which take the form of numerical, resistive, voltage, or current values but have no spatial or temporal content, are replaced with connections whose spatial locations have both temporal and scaling significance.
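
A minimal numerical sketch, assuming a chain of leaky compartments as a stand-in for the passive cable, can make the "connections instead of weights" idea concrete: where an input lands on the tree, not a stored weight, determines the amplitude and timing of its effect at the soma. The compartment count, time constants, and coupling strength below are assumed values, not parameters of the paper's circuit.

    # Minimal sketch (not the paper's circuit): a passive dendritic cable as a
    # chain of leaky compartments. Where an input lands on the cable, not a
    # stored weight, sets the amplitude and timing of its effect at the soma
    # end. All constants below are assumptions.
    N_COMP = 20      # compartments in the cable
    DT = 0.01        # time step (ms)
    TAU = 1.0        # leak time constant per compartment (ms)
    COUPLE = 10.0    # coupling between neighbouring compartments

    def soma_response(inject_at, steps=5000):
        """Inject a brief pulse at one compartment; return the peak voltage seen
        at compartment 0 (the soma end) and the time of that peak."""
        v = [0.0] * N_COMP
        peak, t_peak = 0.0, 0.0
        for step in range(steps):
            t = step * DT
            dv = [-v[i] / TAU for i in range(N_COMP)]           # leak
            for i in range(N_COMP):
                if i > 0:
                    dv[i] += COUPLE * (v[i - 1] - v[i])         # axial coupling
                if i < N_COMP - 1:
                    dv[i] += COUPLE * (v[i + 1] - v[i])
            if t < 0.1:
                dv[inject_at] += 10.0                           # 0.1 ms input pulse
            for i in range(N_COMP):
                v[i] += DT * dv[i]
            if v[0] > peak:
                peak, t_peak = v[0], t
        return peak, t_peak

    if __name__ == "__main__":
        for site in (2, 10, 18):          # proximal, mid, and distal input sites
            peak, t_peak = soma_response(site)
            print(f"input at compartment {site:2d}: "
                  f"soma peak {peak:.2e} at {t_peak:5.2f} ms")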


IEEE Transactions on Neural Networks | 1995

Switched-capacitor neuromorphs with wide-range variable dynamics

John G. Elias; David P. M. Northmore

The use of switched capacitors as wide-range, programmable resistive elements in spatially extensive artificial dendritic trees (ADTs) is described. We show that silicon neuromorphs with ADTs can produce impulse responses that last millions of times longer than the initiating impulse and that dynamical responses are tunable in both shape and duration over a wide range. The switched-capacitor resistors forming a dendritic tree are shown indirectly to have a useful programmable resistance range between 500 kΩ and 1000 GΩ. Experimental results are presented that show variable impulse response functions, tunable frequency selectivity, and rate invariance of spatiotemporal pattern responses.
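
The resistance range quoted above follows from the standard switched-capacitor relation R_eff ≈ 1/(f_clk · C): each clock cycle transfers charge C·V, so the average current is C·V·f_clk. The short sketch below evaluates this relation; the 1 pF capacitance is an assumed value chosen only to show that a clock range of roughly 1 Hz to 2 MHz spans 1000 GΩ down to 500 kΩ.

    # Switched-capacitor "resistor": each clock cycle moves charge Q = C * V,
    # giving an average current I = C * V * f_clk, hence R_eff ~= 1 / (f_clk * C).
    # The 1 pF capacitance is an assumed value for illustration only.

    C_FARADS = 1e-12  # assumed capacitance (1 pF)

    def r_eff(f_clk_hz, c=C_FARADS):
        """Effective resistance of a switched capacitor clocked at f_clk_hz."""
        return 1.0 / (f_clk_hz * c)

    if __name__ == "__main__":
        for f in (1.0, 1e3, 1e6, 2e6):
            print(f"f_clk = {f:12,.0f} Hz  ->  R_eff = {r_eff(f):.3g} ohms")
        # With C = 1 pF, clocking from ~1 Hz up to ~2 MHz sweeps R_eff from
        # about 1000 GOhm down to 500 kOhm, the range reported above.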


Archive | 2006

A Neuromorphic System

David P. M. Northmore; John D. Moses; John G. Elias

The essential functions of neurons can be emulated electronically on silicon chips. We describe such a neuron analogue, or neuromorph, that is compact and low power, with sufficient flexibility that it could perform as a general-purpose unit in networks for controlling robots or for use as implantable neural prostheses. We illustrate some possible applications by a dynamical network that recognizes spatiotemporal patterns and by a network that uses a biologically inspired learning rule to develop sensory-guided behavior in a moving robot. Finally, design requirements for neuromorphic systems are discussed.


Neural Computation | 1997

An analog memory circuit for spiking silicon neurons

John G. Elias; David P. M. Northmore; Wayne Carl Westerman

A simple circuit is described that functions as an analog memory whose state and dynamics are directly controlled by pulsatile inputs. The circuit has been incorporated into a silicon neuron with a spatially extensive dendritic tree as a means of controlling the spike firing threshold of an integrate-and-fire soma. Spiking activity generated by the neuron itself and by other units in a network can thereby regulate the neuron's excitability over time periods ranging from milliseconds to many minutes. Experimental results are presented showing applications to temporal edge sharpening, bistable behavior, and a network that learns in the manner of classical conditioning.
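
A functional sketch, not the analog circuit itself, may help: an integrate-and-fire unit whose threshold is held in a slowly decaying memory variable that pulsatile events push up or down, so that spiking activity regulates excitability over time. All constants in the Python below are illustrative assumptions.

    # Functional sketch (not the analog circuit): an integrate-and-fire unit
    # whose threshold is held in a leaky "analog memory" that pulsatile inputs
    # push up or down. All time constants and step sizes are assumptions.
    import random

    class AdaptiveIAF:
        def __init__(self):
            self.v = 0.0              # membrane potential
            self.theta = 1.0          # firing threshold held by the memory
            self.theta_rest = 1.0     # value the memory relaxes toward
            self.tau_v = 20.0         # membrane time constant (ms)
            self.tau_theta = 5000.0   # slow memory decay (ms)

        def memory_pulse(self, up=True, step=0.05):
            """A pulsatile input that raises or lowers the stored threshold."""
            self.theta += step if up else -step

        def tick(self, input_current, dt=1.0):
            # leaky integration of the membrane potential
            self.v += dt * (-self.v / self.tau_v + input_current)
            # the analog memory relaxes slowly back toward its resting value
            self.theta += dt * (self.theta_rest - self.theta) / self.tau_theta
            if self.v >= self.theta:
                self.v = 0.0
                self.memory_pulse(up=True)   # the unit's own spikes raise theta
                return True                  # spike emitted
            return False

    if __name__ == "__main__":
        random.seed(0)
        cell = AdaptiveIAF()
        spikes = sum(cell.tick(0.08 + 0.02 * random.random())
                     for _ in range(2000))   # 2 s of noisy drive at 1 ms steps
        print(f"spikes: {spikes}, final threshold: {cell.theta:.3f}")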


world congress on computational intelligence | 1994

VLSI circuit synthesis using a parallel genetic algorithm

Mike Davis; Luoping Liu; John G. Elias

A parallel implementation of a genetic algorithm used to evolve simple analog VLSI circuits is described. The parallel computer system consisted of twenty distributed SPARC workstations whose computational activity was coordinated by the Linda coordination language. Work in progress on using the parallel GA to realize optimized circuits and to discover new types of equivalent-function circuits is presented. The use of biologically inspired development rules to restrict the circuits generated by recombination operators to those with an increased chance of surviving is briefly discussed.
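
The paper coordinates its workers with Linda on SPARC workstations; as a rough stand-in for that master/worker pattern, the sketch below parallelizes fitness evaluation with a Python process pool. The genome encoding and the fitness function are toy placeholders, not the paper's circuit representation.

    # Rough stand-in for the master/worker pattern the paper implements with
    # Linda: a process pool evaluates candidate "circuits" in parallel while
    # the master performs selection and recombination. Genome and fitness are
    # toy placeholders, not the paper's circuit representation.
    import random
    from multiprocessing import Pool

    GENOME_LEN = 16
    POP_SIZE = 40
    TARGET = [1, 0] * (GENOME_LEN // 2)   # placeholder target behaviour

    def fitness(genome):
        """Toy fitness (stands in for the expensive circuit simulation)."""
        return sum(g == t for g, t in zip(genome, TARGET))

    def crossover(a, b):
        cut = random.randrange(1, GENOME_LEN)
        return a[:cut] + b[cut:]

    def mutate(genome, rate=0.05):
        return [1 - g if random.random() < rate else g for g in genome]

    def evolve(generations=30, workers=4):
        pop = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
               for _ in range(POP_SIZE)]
        with Pool(workers) as pool:              # workers ~ the SPARC nodes
            for _ in range(generations):
                scores = pool.map(fitness, pop)  # parallel fitness evaluation
                ranked = [g for _, g in sorted(zip(scores, pop),
                                               key=lambda sg: -sg[0])]
                parents = ranked[:POP_SIZE // 2]
                children = [mutate(crossover(random.choice(parents),
                                             random.choice(parents)))
                            for _ in range(POP_SIZE - len(parents))]
                pop = parents + children
            return max(pool.map(fitness, pop))

    if __name__ == "__main__":
        random.seed(1)
        print("best fitness found:", evolve())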


[Proceedings] COGANN-92: International Workshop on Combinations of Genetic Algorithms and Neural Networks | 1992

Genetic generation of connection patterns for a dynamic artificial neural network

John G. Elias

Work in progress on the use of a specialized genetic algorithm for training a new type of dynamic artificial neural network is described. The network architecture is completely specified by a list of addresses that connect signal sources to specific artificial synapses, which have both temporal and spatial significance. The number of different connection patterns is a combinatorial problem that grows factorially as the number of artificial synapses in the network and the number of sensor elements increase. The network is implemented primarily in analog electronic hardware and is constructed from artificial dendritic trees that exhibit a spatiotemporal processing capability modeled after morphologically complex biological neurons. The author describes work in progress on using the specialized genetic algorithm, which has an embedded optimizer in place of the standard mutation operator, to train a dynamic neural network to follow the position of a target moving across an image sensor array.
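
To make the growth of the search space concrete, the sketch below counts connection patterns under two simple assumptions: each synapse wired to a distinct source (a falling factorial) or sources allowed to drive several synapses (S to the power N). The specific source and synapse counts are illustrative, not taken from the paper.

    # The genome is a list of source addresses, one per artificial synapse.
    # Two simple ways to count the possible connection patterns; the source and
    # synapse counts below are illustrative, not taken from the paper.
    import math

    def patterns_distinct(num_sources, num_synapses):
        """Each synapse wired to a different source: S*(S-1)*... (falling factorial)."""
        return math.perm(num_sources, num_synapses)

    def patterns_reusable(num_sources, num_synapses):
        """A source may drive several synapses: S ** N."""
        return num_sources ** num_synapses

    if __name__ == "__main__":
        for sources, synapses in [(16, 8), (64, 32), (256, 128)]:
            d = patterns_distinct(sources, synapses)
            r = patterns_reusable(sources, synapses)
            print(f"{sources:3d} sources, {synapses:3d} synapses: "
                  f"~{len(str(d))}-digit (distinct) or "
                  f"~{len(str(r))}-digit (reusable) pattern counts")
        # Exhaustive search is hopeless at this scale, which motivates the
        # specialized genetic algorithm described above.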


international symposium on neural networks | 1992

Silicon implementation of an artificial dendritic tree

John G. Elias; H.-H. Chu; Samer M. Meshreki

The silicon implementation of an artificial passive dendritic tree which can be used to process and classify dynamic signals is described. The electrical circuit architecture is modeled after complex neurons in the vertebrate brain which have spatially extensive dendritic tree structures that support large numbers of synapses. The circuit is primarily analog and, as in the biological model system, is virtually immune to process variations and other factors which often plague more conventional circuits. The nonlinear circuit is sensitive to both temporal and spatial signal characteristics but does not make use of the conventional neural network concept of weights, and as such does not use multipliers, adders, or other complex computational devices. As in biological neuronal circuits, a high degree of local connectivity is required. However, unlike biology, multiplexing of connections is done to reduce the number of conductors to a reasonable level for standard packages.
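
A back-of-the-envelope comparison shows why multiplexing the connections pays off: dedicating one conductor per synapse needs as many pins as synapses, while sharing a binary address bus with an on-chip decoder needs only about log2(N) lines plus a strobe. The pin counts below are illustrative; the paper does not report its exact figures.

    # Back-of-the-envelope: one conductor per synapse vs. a shared binary
    # address bus with an on-chip decoder. Pin counts are illustrative; the
    # paper does not report its exact figures.
    import math

    def pins_needed(num_synapses):
        dedicated = num_synapses                               # one pin per synapse
        multiplexed = math.ceil(math.log2(num_synapses)) + 1   # address bus + strobe
        return dedicated, multiplexed

    if __name__ == "__main__":
        for n in (64, 256, 1024):
            dedicated, multiplexed = pins_needed(n)
            print(f"{n:5d} synapses: {dedicated:5d} dedicated pins "
                  f"vs {multiplexed:2d} multiplexed lines")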


Neural Computation | 1996

Spike train processing by a silicon neuromorph: The role of sublinear summation in dendrites

David P. M. Northmore; John G. Elias

A dendritic tree, as part of a silicon neuromorph, was modeled in VLSI as a multibranched, passive cable structure with multiple synaptic sites that either depolarize or hyperpolarize local membrane patches, thereby raising or lowering the probability of spike generation by an integrate-and-fire soma. As expected from previous theoretical analyses, contemporaneous synaptic activation at widely separated sites on the artificial tree resulted in near-linear summation, as did neighboring excitatory and inhibitory activations. Activation of synapses of the same type close in time and space produced local saturation of potential, resulting in spike train processing capabilities not possible with linear summation alone. The resulting sublinear synaptic summation, as well as being physiologically plausible, is sufficient for a variety of spike train processing functions. With the appropriate arrangement of synaptic inputs on its dendritic tree, a neuromorph was shown to discriminate input pulse intervals and patterns and pulse train frequencies, and to detect correlations between input trains.
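
The sublinear effect can be sketched numerically with a conductance-style toy model: each active synapse pulls its local patch toward a fixed reversal level, so two inputs on the same patch saturate while inputs on separate patches add almost linearly at the soma. The conductance, attenuation, and reversal values below are assumptions.

    # Toy sketch of saturating synaptic summation: each active synapse pulls
    # its local patch toward a fixed reversal level E_SYN, so two inputs on
    # the SAME patch saturate, while inputs on SEPARATE patches add almost
    # linearly at the soma. All values below are assumptions.

    E_SYN = 1.0     # synaptic reversal level (normalized)
    G_LEAK = 1.0    # leak conductance of a patch
    G_SYN = 4.0     # conductance of one active synapse
    ATTEN = 0.5     # attenuation from a patch to the soma

    def patch_potential(n_active):
        """Steady-state potential of a patch with n_active synapses driving it:
        V = (n*G_SYN*E_SYN) / (G_LEAK + n*G_SYN), which saturates toward E_SYN."""
        g = n_active * G_SYN
        return g * E_SYN / (G_LEAK + g)

    def soma(patch_counts):
        """Distal patches attenuate and sum approximately linearly at the soma."""
        return ATTEN * sum(patch_potential(n) for n in patch_counts)

    if __name__ == "__main__":
        one      = soma([1, 0])   # a single active synapse
        same     = soma([2, 0])   # two synapses on the same patch
        separate = soma([1, 1])   # two synapses on separate patches
        print(f"one input:        {one:.3f}")
        print(f"two, same patch:  {same:.3f}  (sublinear, < 2x one input)")
        print(f"two, separate:    {separate:.3f}  (~ 2x one input)")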


world congress on computational intelligence | 1994

Evolving synaptic connections for a silicon neuromorph

David P. M. Northmore; John G. Elias

Our VLSI neuromorphs possess extensive dendritic trees with hundreds of excitatory and inhibitory synaptic sites. Useful signal processing can be achieved by evolving the appropriate connections to the synapses using genetic algorithms. In order to reduce the very large solution space, schemes for specifying connections have been borrowed from neural development. Results show the evolution of connections to a single neuromorph for the discrimination of temporal and spatiotemporal patterns.

Collaboration


Dive into John G. Elias's collaborations.

Top Co-Authors

Ben Chang (University of Delaware)

H.-H. Chu (University of Delaware)

Hsu-Hua Chu (University of Delaware)

John D. Moses (Los Alamos National Laboratory)