Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Charles M. Higgins is active.

Publication


Featured research published by Charles M. Higgins.


International Conference of the IEEE Engineering in Medicine and Biology Society | 2006

A Navigation Aid for the Blind Using Tactile-Visual Sensory Substitution

Lise A. Johnson; Charles M. Higgins

The objective of this study is to improve the quality of life for the visually impaired by restoring their ability to self-navigate. In this paper we describe a compact, wearable device that converts visual information into a tactile signal. This device, constructed entirely from commercially available parts, enables the user to perceive distant objects via a different sensory modality. Preliminary data suggest that this device is useful for object avoidance in simple environments.
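The paper does not give implementation details here; the following is a minimal sketch of the visual-to-tactile conversion idea, in which the actuator grid size, quantization levels, and block-averaging scheme are illustrative assumptions rather than the authors' design.

```python
import numpy as np

def image_to_tactile(frame, grid=(4, 4), levels=8):
    """Map a grayscale camera frame onto a small grid of tactile actuator
    intensities by block-averaging and quantizing. All parameters here are
    illustrative, not taken from the paper."""
    h, w = frame.shape
    gh, gw = grid
    # Crop so the frame divides evenly into grid blocks.
    frame = frame[: h - h % gh, : w - w % gw]
    blocks = frame.reshape(gh, frame.shape[0] // gh, gw, frame.shape[1] // gw)
    means = blocks.mean(axis=(1, 3))               # average brightness per block
    return np.round(means / 255.0 * (levels - 1))  # quantized drive level per tactor

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    frame = rng.integers(0, 256, size=(120, 160)).astype(float)
    print(image_to_tactile(frame))
```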


Neural Computation | 1992

Rule-based neural networks for classification and probability estimation

Rodney M. Goodman; Charles M. Higgins; John Miller; Padhraic Smyth

In this paper we propose a network architecture that combines a rule-based approach with that of the neural network paradigm. Our primary motivation for this is to ensure that the knowledge embodied in the network is explicitly encoded in the form of understandable rules. This enables the network's decision to be understood, and provides an audit trail of how that decision was arrived at. We utilize an information-theoretic approach to learning a model of the domain knowledge from examples. This model takes the form of a set of probabilistic conjunctive rules between discrete input evidence variables and output class variables. These rules are then mapped onto the weights and nodes of a feedforward neural network, resulting in a directly specified architecture. The network acts as a parallel Bayesian classifier, but, more importantly, can also output posterior probability estimates of the class variables. Empirical tests on a number of data sets show that the rule-based classifier performs comparably with standard neural network classifiers, while possessing unique advantages in terms of knowledge representation and probability estimation.
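As a rough illustration of how conjunctive rules can be mapped onto the weights of a single-layer network acting as a parallel classifier, consider the sketch below; the rules, their strengths, and the softmax-style output normalization are illustrative assumptions, not the information-theoretic weighting actually used in the paper.

```python
import numpy as np

# Each rule "IF evidence THEN class" contributes an additive weight to its
# class node; the activation is a parallel weighted sum over fired rules.
rules = [
    # (class index, required evidence indices, assumed rule strength)
    (0, (0, 2), 1.2),
    (1, (1,),   0.8),
    (1, (2, 3), 1.5),
]
n_classes, n_evidence = 2, 4

rule_weights = np.zeros((len(rules), n_evidence))
rule_to_class = np.zeros((n_classes, len(rules)))
for r, (cls, evid, w) in enumerate(rules):
    rule_weights[r, list(evid)] = 1.0   # rule fires only if all its evidence is on
    rule_to_class[cls, r] = w

def classify(evidence):
    """evidence: binary vector of length n_evidence."""
    fired = (rule_weights @ evidence) == rule_weights.sum(axis=1)  # conjunction test
    scores = rule_to_class @ fired.astype(float)                   # per-class sum
    return np.exp(scores) / np.exp(scores).sum()                   # probability estimate

print(classify(np.array([1, 0, 1, 0])))  # evidence satisfies only the first rule
```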


IEEE Transactions on Fuzzy Systems | 1994

Fuzzy rule-based networks for control

Charles M. Higgins; Rodney M. Goodman

The authors present a method for learning fuzzy logic membership functions and rules to approximate a numerical function from a set of examples of the function's independent variables and the resulting function value. This method uses a three-step approach to building a complete function approximation system: first, learning the membership functions and creating a cell-based rule representation; second, simplifying the cell-based rules using an information-theoretic approach for induction of rules from discrete-valued data; and, finally, constructing a computational (neural) network to compute the function value given its independent variables. This function approximation system is demonstrated with a simple control example: learning the truck and trailer backer-upper control system.
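A minimal sketch of the fuzzy rule evaluation step is shown below, assuming triangular membership functions and weighted-average defuzzification; the membership functions and steering consequents are made up for illustration, whereas the paper learns both from examples.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function peaking at b on the interval [a, c]."""
    return np.maximum(0.0, np.minimum((x - a) / (b - a), (c - x) / (c - b)))

# Illustrative membership functions for one input (e.g. an angle error)
# and illustrative rule consequents (steering commands).
memberships = [
    lambda x: tri(x, -90, -45,  0),   # "negative"
    lambda x: tri(x, -45,   0, 45),   # "zero"
    lambda x: tri(x,   0,  45, 90),   # "positive"
]
consequents = np.array([20.0, 0.0, -20.0])

def fuzzy_control(x):
    """Weighted-average defuzzification of the fired rules."""
    w = np.array([m(x) for m in memberships])
    return float(w @ consequents / w.sum()) if w.sum() > 0 else 0.0

print(fuzzy_control(10.0))  # small positive error -> small negative steering
```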


Visual Neuroscience | 2004

The computational basis of an identified neuronal circuit for elementary motion detection in dipterous insects.

Charles M. Higgins; John K. Douglass; Nicholas J. Strausfeld

Based on comparative anatomical studies and electrophysiological experiments, we have identified a conserved subset of neurons in the lamina, medulla, and lobula of dipterous insects that are involved in retinotopic visual motion direction selectivity. Working from the photoreceptors inward, this neuronal subset includes lamina amacrine (alpha) cells, lamina monopolar (L2) cells, the basket T-cell (T1 or beta), the transmedullary cell Tm1, and the T5 bushy T-cell. Two GABA-immunoreactive neurons, the transmedullary cell Tm9 and a local interneuron at the level of T5 dendrites, are also implicated in the motion computation. We suggest that these neurons comprise the small-field elementary motion detector circuits the outputs of which are integrated by wide-field lobula plate tangential cells. We show that a computational model based on the available data about these neurons is consistent with existing models of biological elementary motion detection, and present a comparable version of the Hassenstein-Reichardt (HR) correlation model. Further, by using the model to synthesize a generic tangential cell, we show that it can account for the responses of lobula plate tangential cells to a wide range of transient stimuli, including responses which cannot be predicted using the HR model. This computational model of elementary motion detection is the first which derives specifically from the functional organization of a subset of retinotopic neurons supplying the lobula plate. A key prediction of this model is that elementary motion detector circuits respond quite differently to small-field transient stimulation than do spatially integrated motion processing neurons as observed in the lobula plate. In addition, this model suggests that the retinotopic motion information provided to wide-field motion-sensitive cells in the lobula is derived from a less refined stage of processing than motion inputs to the lobula plate.
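For reference, a minimal discrete-time Hassenstein-Reichardt correlator, the model the abstract compares against, can be sketched as follows; the low-pass time constant and drifting-grating stimulus are illustrative choices, not parameters from the paper.

```python
import numpy as np

def lowpass(signal, tau, dt):
    """First-order low-pass filter acting as the delay stage of the correlator."""
    out, y = [], 0.0
    a = dt / (tau + dt)
    for s in signal:
        y += a * (s - y)
        out.append(y)
    return np.array(out)

# Two neighboring photoreceptor signals viewing a drifting grating; receptor B
# sees the grating slightly later than receptor A.
dt, tau = 1e-3, 40e-3
t = np.arange(0, 1, dt)
phase_shift = 0.25 * 2 * np.pi
A = np.sin(2 * np.pi * 4 * t)
B = np.sin(2 * np.pi * 4 * t - phase_shift)

# Each delayed signal is multiplied with the undelayed neighbor, and the two
# mirror-symmetric products are subtracted.
emd = lowpass(A, tau, dt) * B - lowpass(B, tau, dt) * A
print("mean EMD output:", emd.mean())  # sign indicates preferred vs. null direction
```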


IEEE Transactions on Circuits and Systems II: Analog and Digital Signal Processing | 1999

Pulse-based 2-D motion sensors

Charles M. Higgins; Rainer A. Deutschmann; Christof Koch

We present two compact CMOS integrated circuits for computing the two-dimensional (2-D) local direction of motion of an image focused directly onto the chip. These circuits incorporate onboard photoreceptors and focal plane motion processing. With fully functional 14×13 and 12×13 implementations consuming less than 50 µW per pixel, we conclude that practical pixel resolutions of at least 64×64 are easily achievable. Measurements characterizing the elementary one-dimensional motion detectors are presented along with a discussion of 2-D performance and example 2-D motion vector fields. As an example application of the sensor, it is shown that the array as fabricated can directly compute the focus of expansion of a 2-D motion vector field.
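The focus-of-expansion computation mentioned at the end can be illustrated with a generic least-squares estimate from a flow field; this is a textbook formulation for intuition, not the analog computation performed on the chip.

```python
import numpy as np

def focus_of_expansion(points, flows):
    """For pure expansion every flow vector points away from the focus of
    expansion (FOE), so the FOE is the point that best lies on all flow-vector
    lines. points, flows: (N, 2) arrays of pixel positions and flow vectors."""
    # Each flow line gives one linear constraint: vy*x - vx*y = vy*px - vx*py
    A = np.stack([flows[:, 1], -flows[:, 0]], axis=1)
    b = flows[:, 1] * points[:, 0] - flows[:, 0] * points[:, 1]
    foe, *_ = np.linalg.lstsq(A, b, rcond=None)
    return foe

# Synthetic expanding flow field centered on (12, 5).
grid = np.stack(np.meshgrid(np.arange(24), np.arange(24)), axis=-1).reshape(-1, 2).astype(float)
true_foe = np.array([12.0, 5.0])
flows = 0.1 * (grid - true_foe)
print(focus_of_expansion(grid, flows))   # ≈ [12. 5.]
```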


Analog Integrated Circuits and Signal Processing | 2000

A Modular Multi-Chip Neuromorphic Architecture for Real-Time Visual Motion Processing

Charles M. Higgins; Christof Koch

The extent of pixel-parallel focal plane image processing is limited by pixel area and imager fill factor. In this paper, we describe a novel multi-chip neuromorphic VLSI visual motion processing system which combines analog circuitry with an asynchronous digital interchip communications protocol to allow more complex pixel-parallel motion processing than is possible in the focal plane. This multi-chip system retains the primary advantages of focal plane neuromorphic image processors: low-power consumption, continuous-time operation, and small size. The two basic VLSI building blocks are a photosensitive sender chip which incorporates a 2D imager array and transmits the position of moving spatial edges, and a receiver chip which computes a 2D optical flow vector field from the edge information. The elementary two-chip motion processing system consisting of a single sender and receiver is first characterized. Subsequently, two three-chip motion processing systems are described. The first three-chip system uses two sender chips to compute the presence of motion only at a particular stereoscopic depth from the imagers. The second three-chip system uses two receivers to simultaneously compute a linear and polar topographic mapping of the image plane, resulting in information about image translation, rotation, and expansion. These three-chip systems demonstrate the modularity and flexibility of the multi-chip neuromorphic approach.
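A toy software model of the sender/receiver division of labor is sketched below, assuming a simple (timestamp, pixel-address) event format and a time-of-travel speed estimate at the receiver; the actual chips use an asynchronous digital address-event bus and analog flow circuits, so this is purely conceptual.

```python
from collections import namedtuple

Event = namedtuple("Event", ["t", "x"])

def sender(edge_positions, dt):
    """Emit one event each time the moving edge enters a new pixel."""
    events, last = [], None
    for i, x in enumerate(edge_positions):
        px = int(x)
        if px != last:
            events.append(Event(i * dt, px))
            last = px
    return events

def receiver(events):
    """Estimate speed (pixels/s) from inter-event times at adjacent addresses."""
    speeds = [(b.x - a.x) / (b.t - a.t)
              for a, b in zip(events, events[1:])
              if abs(b.x - a.x) == 1 and b.t > a.t]
    return sum(speeds) / len(speeds) if speeds else 0.0

dt = 1e-3
edge_positions = [50.0 * i * dt for i in range(200)]  # edge moving at 50 px/s
print(receiver(sender(edge_positions, dt)))           # ≈ 50.0
```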


IEEE Sensors Journal | 2002

A biologically inspired modular VLSI system for visual measurement of self-motion

Charles M. Higgins; Shaikh Arif Shams

We introduce a biologically inspired computational architecture for small-field detection and wide-field spatial integration of visual motion based on the general organizing principles of visual motion processing common to organisms from insects to primates. This highly parallel architecture begins with two-dimensional (2-D) image transduction and signal conditioning, performs small-field motion detection with a number of parallel motion arrays, and then spatially integrates the small-field motion units to synthesize units sensitive to complex wide-field patterns of visual motion. We present a theoretical analysis demonstrating the architecture's potential in discrimination of wide-field motion patterns such as those which might be generated by self-motion. A custom VLSI hardware implementation of this architecture is also described, incorporating both analog and digital circuitry. The individual custom VLSI elements are analyzed and characterized, and system-level test results demonstrate the ability of the system to selectively respond to certain motion patterns, such as those that might be encountered in self-motion, at the exclusion of others.
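The wide-field integration stage can be pictured as taking inner products of the small-field flow estimates with template flow fields for candidate self-motion patterns; the templates and normalization below are illustrative assumptions, not the weights implemented in the VLSI system.

```python
import numpy as np

n = 16
ys, xs = np.mgrid[0:n, 0:n].astype(float)
cx = cy = (n - 1) / 2.0

# Template flow fields for three canonical self-motion patterns.
templates = {
    "translation": np.dstack([np.ones((n, n)), np.zeros((n, n))]),
    "rotation":    np.dstack([-(ys - cy), xs - cx]),
    "expansion":   np.dstack([xs - cx, ys - cy]),
}
# Normalize each template so responses are comparable.
templates = {k: v / np.linalg.norm(v) for k, v in templates.items()}

def wide_field_response(flow):
    """Inner product of the measured flow field with each self-motion template."""
    return {k: float(np.sum(flow * v)) for k, v in templates.items()}

# A noisy expanding flow field should drive the "expansion" unit hardest.
rng = np.random.default_rng(1)
flow = templates["expansion"] * 10 + rng.normal(0, 0.1, (n, n, 2))
print(wide_field_response(flow))
```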


The Journal of Experimental Biology | 2010

The spatial frequency tuning of optic-flow-dependent behaviors in the bumblebee Bombus impatiens

Jonathan P. Dyhr; Charles M. Higgins

Insects use visual estimates of flight speed for a variety of behaviors, including visual navigation, odometry, grazing landings and flight speed control, but the neuronal mechanisms underlying speed detection remain unknown. Although many models and theories have been proposed for how the brain extracts the angular speed of the retinal image, termed optic flow, we lack the detailed electrophysiological and behavioral data necessary to conclusively support any one model. One key property by which different models of motion detection can be differentiated is their spatiotemporal frequency tuning. Numerous studies have suggested that optic-flow-dependent behaviors are largely insensitive to the spatial frequency of a visual stimulus, but they have sampled only a narrow range of spatial frequencies, have not always used narrowband stimuli, and have yielded slightly different results between studies based on the behaviors being investigated. In this study, we present a detailed analysis of the spatial frequency dependence of the centering response in the bumblebee Bombus impatiens using sinusoidal and square wave patterns.
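The reasoning behind varying spatial frequency rests on the drifting-grating relation angular speed = temporal frequency / spatial frequency: a genuinely speed-tuned mechanism responds to the ratio, whereas a temporal-frequency-tuned detector responds to the numerator alone, so changing spatial frequency at fixed speed separates the two. A small illustration with made-up numbers:

```python
# For a grating drifting at a fixed angular speed, the temporal frequency seen
# by the eye scales with spatial frequency: f_t = speed * f_s.

def temporal_frequency(speed_deg_s, spatial_freq_cpd):
    """Temporal frequency (Hz) of a grating drifting at speed_deg_s (deg/s)
    with spatial frequency spatial_freq_cpd (cycles/deg)."""
    return speed_deg_s * spatial_freq_cpd

for f_s in (0.05, 0.1, 0.2):                    # cycles/degree
    print(f_s, temporal_frequency(100.0, f_s))  # same speed, different f_t
```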


IEEE Transactions on Circuits and Systems | 2005

Reconfigurable biologically inspired visual motion systems using modular neuromorphic VLSI chips

Erhan Ozalevli; Charles M. Higgins

Visual motion information provides a variety of clues that enable biological organisms from insects to primates to efficiently navigate in unstructured environments. We present modular mixed-signal very large-scale integration (VLSI) implementations of the three most prominent biological models of visual motion detection. A novel feature of these designs is the use of spike integration circuitry to implement the necessary temporal filtering. We show how such modular VLSI building blocks make it possible to build highly powerful and flexible vision systems. These three biomimetic motion algorithms are fully characterized and compared in performance. The visual motion detection models are each implemented on separate VLSI chips, but utilize a common silicon retina chip to transmit changes in contrast, and thus four separate mixed-signal VLSI designs are described. Characterization results of these sensors show that each has a saturating response to the contrast of moving stimuli, and that the direction of motion of a sinusoidal grating can be detected down to less than 5% contrast, and over more than an order of magnitude in velocity, while retaining modest power consumption.
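The role of the spike-integration circuitry as a temporal filter can be sketched as a leaky integrator of incoming spikes; the time constant and spike train below are illustrative, not the circuits' values.

```python
import numpy as np

def leaky_spike_integrator(spike_times, tau, dt, t_end):
    """Accumulate unit spikes on a leaky integrator, turning a spike train
    into a smoothly decaying analog quantity (a first-order temporal filter)."""
    t = np.arange(0.0, t_end, dt)
    spikes = np.zeros_like(t)
    spikes[np.round(np.asarray(spike_times) / dt).astype(int)] = 1.0
    y = np.zeros_like(t)
    for i in range(1, len(t)):
        # Exponential decay between spikes, unit increment on each spike.
        y[i] = y[i - 1] * (1.0 - dt / tau) + spikes[i]
    return t, y

t, y = leaky_spike_integrator([0.01, 0.012, 0.014, 0.05], tau=0.02, dt=1e-3, t_end=0.1)
print(y.max())   # a burst of spikes builds up a larger filtered value
```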


Conference on Advanced Research in VLSI | 1999

Multi-chip neuromorphic motion processing

Charles M. Higgins; Christof Koch

We describe a multi-chip CMOS VLSI visual motion processing system which combines analog circuitry with an asynchronous digital interchip communications protocol to allow more complex motion processing than is possible with all the circuitry in the focal plane. The two basic VLSI building blocks are a sender chip which incorporates a 2D imager array and transmits the position of moving spatial edges, and a receiver chip which computes a 2D optical flow vector field from the edge information. The elementary two-chip motion processing system consisting of a single sender and receiver is first characterized. Subsequently, two three-chip motion processing systems are described. The first such system uses two sender chips to compute the presence of motion only at a particular stereoscopic disparity. The second such system uses two receivers to simultaneously compute a linear and polar topographic mapping of the image plane, resulting in information about image translation, rotation, and expansion. These three-chip systems demonstrate the modularity and flexibility of the multi-chip neuromorphic approach.
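One plausible reading of the two-sender depth configuration is coincidence detection between left and right edge events at a fixed address offset (the target disparity); the sketch below assumes that interpretation, along with a made-up event format and timing window, and is not the chips' actual handshaking protocol.

```python
def motion_at_disparity(left_events, right_events, disparity, window=2e-3):
    """left/right events: lists of (time, x). Return the x positions where a
    left event has a matching right event at x - disparity within the window."""
    hits = []
    for tl, xl in left_events:
        for tr, xr in right_events:
            if xr == xl - disparity and abs(tl - tr) <= window:
                hits.append(xl)
    return hits

left  = [(0.010, 5), (0.020, 6), (0.030, 7)]
right = [(0.011, 2), (0.021, 3), (0.045, 9)]
print(motion_at_disparity(left, right, disparity=3))   # [5, 6]
```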

Collaboration


Dive into Charles M. Higgins's collaborations.

Top Co-Authors

Rodney M. Goodman

California Institute of Technology


Christof Koch

Allen Institute for Brain Science


Rainer A. Deutschmann

California Institute of Technology
