David Blackman
University of Pennsylvania
Publication
Featured research published by David Blackman.
IEEE Journal of Solid-state Circuits | 1992
J. Van der Spiegel; P. Mueller; David Blackman; P. Chance; C. Donham; Ralph Etienne-Cummings; Peter R. Kinget
A multichip analog parallel neural network whose architecture, neuron characteristics, synaptic connections, and time constants are modifiable is described. The system has several important features, such as time constants for time-domain computations, interchangeable chips allowing a modifiable gross architecture, and expandability to arbitrary size. Such an approach allows the exploration of different network architectures for a wide range of applications, in particular dynamic real-world computations. Four different modules (neuron, synapse, time constant, and switch units) have been designed and fabricated in a 2-µm CMOS technology. About 100 of these modules have been assembled in a fully functional prototype neural computer. An integrated software package for setting the network configuration and characteristics, and for monitoring the neuron outputs, has been developed as well. The performance of the individual modules, as well as the overall system response for several applications, was tested successfully. Results of a network for real-time decomposition of acoustical patterns are discussed.
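The abstracts on this page repeatedly describe neurons whose weighted synaptic inputs pass through a programmable first-order time constant before a saturating output stage. A minimal software sketch of that behavior (the function name, parameter values, and the Euler integration step are illustrative assumptions, not the chips' actual circuit equations):

```python
# Sketch of one "neuron + synaptic time constant" unit as described:
# each synapse is a programmable gain, the summed drive is low-pass
# filtered with a programmable time constant, and the neuron applies
# a saturating (clipped) output transfer function.

def step_neuron(state, inputs, weights, tau, dt=1e-4, vmax=1.0):
    """One Euler step of a first-order (leaky) neuron model.
    state   -- current filtered activation
    inputs  -- presynaptic outputs
    weights -- synaptic gains (set digitally from the host on the machine)
    tau     -- programmable synaptic time constant, in seconds
    """
    drive = sum(w * x for w, x in zip(weights, inputs))
    state += dt / tau * (drive - state)      # RC low-pass dynamics
    out = max(-vmax, min(vmax, state))       # saturating output stage
    return state, out

# Drive the unit with constant inputs; it relaxes toward the weighted sum.
s = 0.0
for _ in range(10000):
    s, y = step_neuron(s, inputs=[0.5, 0.2], weights=[1.0, 2.0], tau=0.01)
```

With the constant drive 0.5·1.0 + 0.2·2.0 = 0.9 the state settles at 0.9, illustrating how the time constant governs the unit's temporal response rather than its steady state.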
Archive | 1989
P. Mueller; Jan Van der Spiegel; David Blackman; Timothy Chiu; Thomas Clare; C. Donham; Tzu Pu Hsieh; Marc Loinaz
The design of components for a programmable analog neural computer and simulator is described. The machine can be scaled to any size and is composed of three types of interconnected modules, each containing, on a VLSI chip, arrays of neurons, modifiable synapses, and routing switches. It runs entirely in analog mode, but the connection architecture, synaptic gains, and time constants, as well as neuron parameters, are set digitally from a digital host computer. Each neuron has a limited number of inputs and can be connected to any, but not to all, other neurons.
Neurocomputing | 1992
P. Mueller; Jan Van der Spiegel; Vincent Agami; David Blackman; Peter Chance; C. Donham; Ralph Etienne; Jason Flinn; Jinsoo Kim; Mike Massa; Supun Samarasekera
A prototype programmable analog neural computer and selected applications are described. The machine is assembled from over 100 custom VLSI modules containing neurons, synapses, routing switches, and programmable synaptic time constants. Connection symmetry and modular construction allow expansion to arbitrary size. The network runs in real-time analog mode; however, connection architecture as well as neuron and synapse parameters are controlled by a digital host that also monitors the network performance through an A/D interface. Programming and monitoring software has been developed, and several application examples, including the dynamic decomposition of acoustical patterns, are described. The machine is intended for real-time, real-world computations, including ATR. In its current configuration its maximal speed is equivalent to that of a digital machine capable of more than 10^11 FLOPS. A much larger machine is currently under development.
international symposium on neural networks | 1994
J. Van der Spiegel; C. Donham; Ralph Etienne-Cummings; S. Fernando; P. Mueller; David Blackman
A large-scale analog neural network has been built and tested for a number of applications. The network contains 1024 neurons with 94 synaptic inputs per neuron and is able to receive up to 3072 analog inputs. Among its unique features are the programmability of the architecture and of the neural components, allowing it to be used as a research and development tool for a large variety of applications. The network also incorporates programmable time constants, which make it well suited for real-time or compressed-time computations on temporal patterns. The operation of the neural network is fully analog and parallel, and runs independently of the digital host. The neural elements are custom designed and fabricated in a 1.5-µm CMOS technology. They are grouped into three kinds of chips: a neuron chip, a synapse chip, and a time constant and switch matrix chip. These chips, or modules, are placed on 16 printed circuit boards, each containing 48 chips. Up to 3,072 buffered analog inputs and outputs are available.
international symposium on neural networks | 1991
P. Mueller; J. Van der Spiegel; V. Agami; P. Aziz; David Blackman; P. Chance; A. Choudhury; C. Donham; R. Etienne; L. Jones; Peter R. Kinget; W. von Koch; Jane T. Kim; J. Xin
A programmable analog neural computer and selected applications are described. The machine was assembled from over 100 custom VLSI modules containing neurons, synapses, routing switches, and programmable synaptic time constants. Connection symmetry and modular construction allow expansion to arbitrary size. The network runs in real-time analog mode. Connection architecture as well as neuron and synapse parameters are controlled by a digital host that monitors the network performance through a digital/analog interface. Programming and monitoring software has been developed. Several application examples, including the dynamic decomposition of acoustical patterns, are described. The machine is intended for real-time, real-world computations. In its current configuration its maximal speed is equivalent to that of a digital machine capable of more than 10^12 FLOPS.
1993 Computer Architectures for Machine Perception | 1993
Ralph Etienne-Cummings; J. Van der Spiegel; C. Donham; S. Fernando; R. Hathaway; P. Mueller; David Blackman
The application of a general purpose analog neural computer (GPANC) and a smart silicon retina to target recognition, acquisition, and tracking is discussed. The GPANC is designed as a general purpose tool for the implementation of real-time neural-based solutions to real-world problems. It is composed of modules that mimic biological neurons, synapses, and axons/dendrites. The modules are fully programmable and are arranged in macro cells to facilitate gross expansion of the computer. The presented version is composed of 10^3 neurons, 10^5 synapses, 10^4 synaptic time constants, and 6 × 10^5 interconnection switches. Its computation rate is 10^11 CPS, or it can solve 10^3 nonlinear functions of 10^4 coupled first-order differential equations in real time. Except for a digital host used for programming, the GPANC operates in fully continuous-time analog mode and offers temporal computational capabilities. The silicon retina is designed for autonomous target acquisition and tracking and serves as the front end to the GPANC. It features a space-variant layout of photoreceptors, logarithmic compression of incident light intensity, edge detection, motion detection in the fovea, and temporal modulation detection in the periphery. All computation circuits are implemented at the focal plane. The peripheral pixels report the location of arriving targets, which are then foveated. The fovea is composed of an array of densely packed photoreceptors where full 2D velocity, spanning three orders of magnitude, is computed. Using a closed-loop velocity error correction technique, the target's velocity relative to the retina is zeroed. Therefore, using the periphery of the retina for acquiring a target, the GPANC for target recognition and system control, and the fovea of the retina for tracking, a fully autonomous targeting system can be realized.
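The tracking scheme described above zeroes the target's velocity relative to the retina by feeding the measured velocity error back into the tracking motion. A toy proportional-control sketch of such a closed loop (the function name, gain, and step count are assumed values for illustration, not figures from the paper):

```python
# Toy closed-loop velocity error correction: the "retina" measures the
# target's velocity relative to itself, and the controller adjusts the
# retina's own velocity until that relative velocity is driven to zero.

def track(target_v, steps=200, gain=0.3):
    retina_v = 0.0
    for _ in range(steps):
        error = target_v - retina_v   # relative velocity seen by the fovea
        retina_v += gain * error      # proportional correction
    return retina_v

v = track(5.0)  # retina velocity converges toward the target's 5.0
```

The error shrinks geometrically each step, so the relative velocity is "zeroed" in the sense the abstract describes: the retina ends up moving with the target.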
5th Congress of the Brazilian Society of Microelectronics | 1990
Jan Van der Spiegel; P. Mueller; David Blackman; C. Donham; Ralph Etienne-Cummings; P. Aziz; A. Choudhury; L. Jones; J. Xin
This paper gives an overview of the principles and hardware realizations of artificial neural networks. The first section describes the operation of neural networks, using simple examples to illustrate some of their key properties. Next, the different architectures are described, including single- and multi-layer perceptron networks, Hopfield nets, and Kohonen nets. A brief discussion of the learning rules employed in feedforward and feedback networks follows. The final section discusses hardware implementations of neural systems with emphasis on analog VLSI. Different approaches to the realization of neurons and synapses are described. A brief comparison between analog and digital techniques is given.
Proceedings of SPIE | 1992
P. Mueller; Jan Van der Spiegel; David Blackman; C. Donham; Ralph Cummings
A prototype programmable analog neural computer has been assembled from over 100 custom VLSI modules containing neurons, synapses, routing switches, and programmable synaptic time constants. The modules are directly interconnected and arbitrary network configurations can be programmed. Connection symmetry and modular construction allow expansion of the network to any size. The network runs in real-time analog mode, but connection architecture as well as neuron and synapse parameters are controlled by a digital host. Network performance is monitored by the host through an A/D interface and used in the implementation of learning algorithms. The machine is intended for real-time, real-world computations. In its current configuration, maximal speed is equivalent to that of a digital machine capable of 10^11 FLOPS. The programmable synaptic time constants permit the real-time computation of temporal patterns as they occur in speech and other acoustic signals. Several applications involving the dynamic decomposition and recognition of acoustical patterns, including speech signals (phonemes), are described. The decomposition network is loosely based on the primary auditory system of higher vertebrates. It extracts, and represents by the activity in different neuron arrays, the following pattern primitives: frequency, bandwidth, amplitude, amplitude modulation, amplitude modulation frequency, frequency modulation, frequency modulation frequency, duration, and sequence. The frequency-tuned units are the first stage and form the input space for subsequent stages that extract the other primitives, e.g., bandwidth, amplitude modulation, etc., for different frequency bands. Acoustic input generates highly specific, relatively sparse distributed activity in this feature space, which is decoded and recognized by units trained on specific input patterns such as phonemes, diphones, or active sonar patterns.
Through simple feedback connections in conjunction with synaptic time constants, the neurons can be transformed into spiking units resembling biological neurons, and networks of such units can be used in simulations of small biological neural assemblies. A larger machine, with much higher component count, speed, and density, as well as higher resolution of synaptic weights and time constants, is currently under development. Some specific design issues for the construction of larger machines, including selection of optimal component parameters, high-density interconnect methods, and control software, are discussed.
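The spiking behavior mentioned above comes from combining leaky integration (a synaptic time constant) with a feedback/reset loop. A hypothetical leaky integrate-and-fire sketch in the same spirit (the threshold, time constant, and reset rule are made-up illustrative values, not the machine's configured parameters):

```python
# Leaky integrate-and-fire sketch: a unit with an RC time constant and a
# reset feedback loop emits discrete spikes from a constant analog drive,
# loosely mimicking the spiking units described above.

def run_lif(i_in, tau=0.02, thresh=1.0, dt=1e-3, t_end=0.5):
    """Return the spike times produced by constant input current i_in."""
    v, spikes, t = 0.0, [], 0.0
    while t < t_end:
        v += dt / tau * (i_in - v)   # leaky integration toward i_in
        if v >= thresh:              # threshold crossing -> spike
            spikes.append(t)
            v = 0.0                  # reset, playing the role of feedback
        t += dt
    return spikes

spike_times = run_lif(2.0)  # drive above threshold -> regular spiking
```

A drive that asymptotes below the threshold (e.g. `run_lif(0.5)`) never spikes, while a stronger drive produces a regular spike train whose rate grows with the input, the qualitative behavior of the biological neurons being imitated.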
international symposium on vlsi technology systems and applications | 1991
J. Van der Spiegel; P. Mueller; V. Agami; P. Aziz; David Blackman; P. Chance; A. Choudhury; C. Donham; Ralph Etienne-Cummings; L. Jones; Jp Kim; Peter R. Kinget; M. Massa; W. von Koch; J. Xin
A multichip, programmable analog neural network for real-time dynamic computations is described. The network's interconnection structure, neuron characteristics, synaptic connections, and synaptic time constants are modifiable. The chips are designed to allow a modular and expandable gross architecture that can be adjusted to the complexity of the task. The network operates fully in analog mode in real time; however, a digital host is used to set the network parameters and monitor the neuron outputs. A prototype neural computer consisting of 72 neurons has been assembled and tested. The network has been successfully configured for several applications and found to have a performance equivalent to that of a digital machine of 10^11 FLOPS.
An introduction to neural and electronic networks | 1990
P. Mueller; David Blackman; Roy Furman