David H. Goldberg
Johns Hopkins University
Publications
Featured research published by David H. Goldberg.
Neuroinformatics | 2008
Daniel Gardner; Huda Akil; Giorgio A. Ascoli; Douglas M. Bowden; William J. Bug; Duncan E. Donohue; David H. Goldberg; Bernice Grafstein; Jeffrey S. Grethe; Amarnath Gupta; Maryam Halavi; David N. Kennedy; Luis N. Marenco; Maryann E. Martone; Perry L. Miller; Hans-Michael Müller; Adrian Robert; Gordon M. Shepherd; Paul W. Sternberg; David C. Van Essen; Robert W. Williams
With support from the Institutes and Centers forming the NIH Blueprint for Neuroscience Research, we have designed and implemented a new initiative for integrating access to and use of Web-based neuroscience resources: the Neuroscience Information Framework. The Framework arises from the expressed need of the neuroscience community for neuroinformatic tools and resources to aid scientific inquiry, builds upon prior development of neuroinformatics by the Human Brain Project and others, and directly derives from the Society for Neuroscience’s Neuroscience Database Gateway. Partnered with the Society, its Neuroinformatics Committee, and volunteer consultant-collaborators, our multi-site consortium has developed: (1) a comprehensive, dynamic inventory of Web-accessible neuroscience resources, (2) an extended and integrated terminology describing resources and contents, and (3) a framework accepting and aiding concept-based queries. Evolving instantiations of the Framework may be viewed at http://nif.nih.gov, http://neurogateway.org, and other sites as they come on line.
Neural Networks | 2001
David H. Goldberg; Gert Cauwenberghs; Andreas G. Andreou
We present a scheme for implementing highly-connected, reconfigurable networks of integrate-and-fire neurons in VLSI. Neural activity is encoded by spikes, where the address of an active neuron is communicated through an asynchronous request and acknowledgement cycle. We employ probabilistic transmission of spikes to implement continuous-valued synaptic weights, and memory-based look-up tables to implement arbitrary interconnection topologies. The scheme is modular and scalable, and lends itself to the implementation of multi-chip network architectures. Results from a prototype system with 1024 analog VLSI integrate-and-fire neurons, each with up to 128 probabilistic synapses, demonstrate these concepts in an image processing task.
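The scheme described above lends itself to a compact software model. The sketch below, a minimal Python illustration rather than the VLSI implementation itself, shows how a memory-based lookup table can route address-events to arbitrary targets and how probabilistic transmission yields an effectively continuous synaptic weight from all-or-none spikes; the table layout, neuron model, and names (routing_table, membrane, THRESHOLD) are illustrative assumptions.

```python
import random

# Hypothetical sketch of address-event routing with probabilistic synapses:
# the lookup table maps a presynaptic neuron's address to its fan-out, and
# each connection forwards the spike with a probability that plays the role
# of a continuous-valued synaptic weight. Sizes and the integrate-and-fire
# model are illustrative assumptions.

NUM_NEURONS = 1024
THRESHOLD = 1.0

# routing_table[src] -> list of (dst, transmission_probability, efficacy)
routing_table = {src: [] for src in range(NUM_NEURONS)}
membrane = [0.0] * NUM_NEURONS


def connect(src, dst, weight, efficacy=0.1):
    """Encode a continuous weight magnitude as a transmission probability."""
    routing_table[src].append(
        (dst, min(abs(weight), 1.0), efficacy if weight >= 0 else -efficacy))


def route_spike(src):
    """Deliver one address-event; return addresses of neurons that fire in turn."""
    fired = []
    for dst, p, eff in routing_table[src]:
        if random.random() < p:          # probabilistic transmission
            membrane[dst] += eff         # all-or-none event, graded mean effect
            if membrane[dst] >= THRESHOLD:
                membrane[dst] = 0.0      # reset after firing
                fired.append(dst)
    return fired

# example: a 0.6 weight is realized as a 60% transmission probability
# connect(0, 1, weight=0.6)
# route_spike(0)
```

Averaged over many events, a connection with transmission probability p and per-event efficacy w drives its target by p·w, which is how a graded weight emerges from binary spike events.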
IEEE Transactions on Circuits and Systems I-regular Papers | 2004
Pedro Julián; Andreas G. Andreou; Laurence Riddle; Shihab A. Shamma; David H. Goldberg; Gert Cauwenberghs
Sound localization using energy-aware hardware for sensor network nodes is a problem with many applications in surveillance and security. In this paper, we evaluate four algorithms for sound localization using signals recorded in a natural environment with an array of commercial off-the-shelf microelectromechanical systems microphones and a specially designed compact acoustic enclosure. We evaluate the performance of the algorithms and their hardware complexity, which relates directly to energy consumption.
Neuroinformatics | 2009
David H. Goldberg; Jonathan D. Victor; Esther P. Gardner; Daniel Gardner
Conventional methods widely available for the analysis of spike trains and related neural data include various time- and frequency-domain analyses, such as peri-event and interspike interval histograms, spectral measures, and probability distributions. Information theoretic methods are increasingly recognized as significant tools for the analysis of spike train data. However, developing robust implementations of these methods can be time-consuming, and determining applicability to neural recordings can require expertise. In order to facilitate more widespread adoption of these informative methods by the neuroscience community, we have developed the Spike Train Analysis Toolkit. STAToolkit is a software package which implements, documents, and guides application of several information-theoretic spike train analysis techniques, thus minimizing the effort needed to adopt and use them. This implementation behaves like a typical Matlab toolbox, but the underlying computations are coded in C for portability, optimized for efficiency, and interfaced with Matlab via the MEX framework. STAToolkit runs on any of three major platforms: Windows, Mac OS, and Linux. The toolkit reads input from files with an easy-to-generate text-based, platform-independent format. STAToolkit, including full documentation and test cases, is freely available open source via http://neuroanalysis.org, maintained as a resource for the computational neuroscience and neuroinformatics communities. Use cases drawn from somatosensory and gustatory neurophysiology, and community use of STAToolkit, demonstrate its utility and scope.
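For readers unfamiliar with the class of computation the toolkit packages, the Python fragment below sketches the simplest such estimator, a plug-in ("direct") entropy estimate over binned spike words. It is purely illustrative: STAToolkit itself is a Matlab/C package with its own interfaces and bias-correction methods, and the function name, bin width, and word length here are assumptions, not the toolkit's API.

```python
import numpy as np
from collections import Counter

def word_entropy(spike_times, t_start, t_stop, bin_ms=3.0, word_len=8):
    """Plug-in (direct) entropy estimate, in bits per word, of binary spike
    'words' formed from a binned spike train. Illustrative only; practical
    estimators correct for sampling bias, which this sketch omits."""
    edges = np.arange(t_start, t_stop, bin_ms / 1000.0)
    binned = np.histogram(spike_times, bins=edges)[0] > 0   # binary bins
    words = [tuple(binned[i:i + word_len])
             for i in range(0, len(binned) - word_len + 1, word_len)]
    counts = Counter(words)
    p = np.array(list(counts.values()), dtype=float)
    p /= p.sum()
    return float(-(p * np.log2(p)).sum())
```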
Journal of Neuroscience Methods | 2007
Jonathan D. Victor; David H. Goldberg; Daniel Gardner
Cost-based metrics formalize notions of distance, or dissimilarity, between two spike trains, and are applicable to single- and multineuronal responses. As such, these metrics have been used to characterize neural variability and neural coding. By examining the structure of an efficient algorithm [Aronov D, 2003. Fast algorithm for the metric-space analysis of simultaneous responses of multiple single neurons. J Neurosci Methods 124(2), 175-79] implementing a metric for multineuronal responses, we determine criteria for its generalization, and identify additional efficiencies that are applicable when related dissimilarity measures are computed in parallel. The generalized algorithm provides the means to test a wide range of coding hypotheses.
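As background, the sketch below gives a minimal Python implementation of the single-unit cost-based metric (Victor-Purpura style) via dynamic programming; the paper's subject, the Aronov algorithm, generalizes and accelerates this to simultaneous multineuronal responses, which this sketch does not attempt.

```python
import numpy as np

def spike_time_distance(a, b, q):
    """Cost-based distance between two spike trains a and b (sorted spike
    times, in seconds). Moving a spike by dt costs q*dt; inserting or
    deleting a spike costs 1. Single-unit case only."""
    na, nb = len(a), len(b)
    D = np.zeros((na + 1, nb + 1))
    D[:, 0] = np.arange(na + 1)          # delete all spikes of a
    D[0, :] = np.arange(nb + 1)          # insert all spikes of b
    for i in range(1, na + 1):
        for j in range(1, nb + 1):
            D[i, j] = min(D[i - 1, j] + 1,
                          D[i, j - 1] + 1,
                          D[i - 1, j - 1] + q * abs(a[i - 1] - b[j - 1]))
    return D[na, nb]
```

Computing the metric for a range of cost parameters q probes which temporal precisions carry information, which is the sense in which such dissimilarity measures are often evaluated in parallel.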
ACM Transactions on Sensor Networks | 2006
David H. Goldberg; Andreas G. Andreou; Pedro Julián; Philippe O. Pouliquen; Laurence Riddle; Rich Rosasco
We present a low-power VLSI wake-up detector for a sensor network that uses acoustic signals to localize ground-based vehicles. The detection criterion is the degree of low-frequency periodicity in the acoustic signal, and the periodicity is computed from the “bumpiness” of the autocorrelation of a one-bit version of the signal. We then describe a CMOS ASIC that implements the periodicity estimation algorithm. The ASIC is fully functional and its core consumes 835 nanowatts. It was integrated into an acoustic enclosure and deployed in field tests with synthesized sounds and ground-based vehicles.
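A rough software model of the detection criterion may help make it concrete. The Python sketch below one-bit quantizes a signal, autocorrelates it, and scores the "bumpiness" of the result; the particular bumpiness measure (total variation of the autocorrelation), the lag range, and the quantization threshold are illustrative assumptions, not the ASIC's exact algorithm.

```python
import numpy as np

def periodicity_score(x, max_lag=512):
    """Toy software model of a one-bit autocorrelation periodicity detector:
    quantize the signal to one bit, autocorrelate, and score how bumpy the
    autocorrelation is. Periodic sounds (e.g., engine harmonics) produce
    repeated autocorrelation peaks and a larger score than broadband noise."""
    bits = np.where(x >= np.median(x), 1.0, -1.0)        # one-bit version
    ac = np.array([np.mean(bits[:-k] * bits[k:]) if k else 1.0
                   for k in range(max_lag)])
    # bumpiness: total variation of the autocorrelation beyond lag zero
    return float(np.sum(np.abs(np.diff(ac[1:]))))
```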
international symposium on circuits and systems | 2003
Pedro Julián; Andreas G. Andreou; Pablo Sergio Mandolesi; David H. Goldberg
We present the design and testing of a micropower integrated circuit for the estimation of the bearing angle of a sound source with respect to a pair of microphones. The algorithm is based on a modified binary cross-correlation approach suitable for low-power operation. The circuit has been tested and operates at 600 µW at performance levels matching theoretical simulations.
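For orientation, the Python sketch below estimates a bearing angle from one microphone pair by one-bit (binary) cross-correlation followed by a time-difference-of-arrival to angle conversion; the sign conventions, lag search, and constants are illustrative assumptions rather than the circuit's algorithm.

```python
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s, assumed

def bearing_from_pair(x1, x2, fs, mic_spacing):
    """Estimate the bearing of a source from one microphone pair using a
    binary (one-bit) cross-correlation over physically plausible lags."""
    b1 = np.sign(x1 - np.median(x1))
    b2 = np.sign(x2 - np.median(x2))
    max_shift = int(np.ceil(mic_spacing / SPEED_OF_SOUND * fs))
    shifts = range(-max_shift, max_shift + 1)
    # pick the lag that maximizes agreement between the two one-bit signals
    scores = [np.mean(b1[max_shift:-max_shift or None] *
                      np.roll(b2, s)[max_shift:-max_shift or None])
              for s in shifts]
    best_lag = list(shifts)[int(np.argmax(scores))]
    delay = best_lag / fs
    # time difference of arrival -> bearing, clipped to a valid arcsine argument
    return float(np.degrees(np.arcsin(
        np.clip(delay * SPEED_OF_SOUND / mic_spacing, -1.0, 1.0))))
```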
IEEE Transactions on Very Large Scale Integration Systems | 2006
Pedro Julián; Andreas G. Andreou; David H. Goldberg
We present a CMOS integrated circuit (IC) for bearing estimation in the low-audio range that implements a correlation derivative approach in a 0.35-µm technology. The IC calculates the bearing angle of a sound source with a mean variance of one degree in a 360° range using four microphones: one pair is used to produce the indication and the other to define the quadrant. An adaptive algorithm decides which pair to use depending on the direction of the incoming signal, in such a way as to obtain the best estimate. The IC contains two blocks with 104 stages each. Every stage has a delay unit, a block to reduce the clock speed, and a 10-bit UP/DN counter. The IC measures 2 mm by 2.4 mm, and dissipates 600 µW at 3.3 V and 200 kHz. It is purely digital and uses a one-bit quantization of the input signals.
Neuroinformatics | 2008
Daniel Gardner; David H. Goldberg; Bernice Grafstein; Adrian Robert; Esther P. Gardner
The Neuroscience Information Framework (NIF), developed for the NIH Blueprint for Neuroscience Research and available at http://nif.nih.gov and http://neurogateway.org, is built upon a set of coordinated terminology components enabling data and web-resource description and selection. Core NIF terminologies use a straightforward syntax designed for ease of use and for navigation by familiar web interfaces, and readily exportable to aid development of relational-model databases for neuroscience data sharing. Datasets, data analysis tools, web resources, and other entities are characterized by multiple descriptors, each addressing core concepts, including data type, acquisition technique, neuroanatomy, and cell class. Terms for each concept are organized in a tree structure, providing is-a and has-a relations. Broad general terms near each root span the category or concept and spawn more detailed entries for specificity. Related but distinct concepts (e.g., brain area and depth) are specified by separate trees, for easier navigation than would be required by graph representation. Semantics enabling NIF data discovery were selected at one or more workshops by investigators expert in particular systems (vision, olfaction, behavioral neuroscience, neurodevelopment), brain areas (cerebellum, thalamus, hippocampus), preparations (molluscs, fly), diseases (neurodegenerative disease), or techniques (microscopy, computation and modeling, neurogenetics). Workshop-derived integrated term lists are available Open Source at http://brainml.org; a complete list of participants is at http://brainml.org/workshops.
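As a rough illustration of the data structure described above, the Python sketch below builds a small term tree with is-a children, has-a attributes, and a depth-first lookup; the class and the example terms are hypothetical, not NIF's actual schema or vocabulary.

```python
# Hypothetical sketch of tree-structured vocabularies: each concept (data
# type, technique, brain area, ...) is its own tree, broad terms near the
# root spawn more specific children (is-a), and entries may carry has-a
# attributes. Term names here are illustrative, not NIF entries.
from dataclasses import dataclass, field

@dataclass
class Term:
    name: str
    children: list = field(default_factory=list)    # is-a: child is-a parent
    attributes: dict = field(default_factory=dict)  # has-a relations

    def add(self, child):
        self.children.append(child)
        return child

    def find(self, name):
        """Depth-first lookup, returning the subtree rooted at `name`."""
        if self.name == name:
            return self
        for c in self.children:
            hit = c.find(name)
            if hit:
                return hit
        return None

# Related but distinct concepts (e.g., brain area vs. depth) get separate trees.
brain_area = Term("brain area")
cortex = brain_area.add(Term("cerebral cortex"))
cortex.add(Term("somatosensory cortex", attributes={"has-a": "layers I-VI"}))
```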
Neurocomputing | 2003
David H. Goldberg; Arun P. Sripati; Andreas G. Andreou
We examine the spiking axon as a communication channel. We develop a first-principles channel model that encompasses the noise in the axon, which manifests itself as spike jitter, and the power consumption, which arises from the activity of the Na⁺–K⁺ pump. This model enables us to examine the trade-off between the information rate and power consumption. Using parameters from the frog myelinated axon, we determine the spike rate that corresponds to the maximum energy efficiency. This spike rate is consistent with experimental observations, which suggests that neural communication may have developed to maximize energy efficiency rather than information rate alone.
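The trade-off can be illustrated with a deliberately simple numerical model: treat the axon as a timing channel whose entropy rate at spike rate r and timing precision dt is r·log2(e/(r·dt)) bits/s, and whose power is a fixed resting cost plus an energy per spike. The constants and the channel model below are illustrative assumptions, not the paper's first-principles axon model; the point is only that bits per joule peaks at an intermediate spike rate.

```python
import numpy as np

# Toy sketch of the information-rate vs. power trade-off. All numbers are
# illustrative assumptions, not measured axon parameters.
dt = 1e-3            # effective timing precision set by spike jitter (s)
e_spike = 1e-11      # energy per spike (J)
p_rest = 1e-10       # resting (pump) power (W)

rates = np.linspace(1.0, 500.0, 1000)                 # spike rates (Hz)
info = rates * np.log2(np.e / (rates * dt))           # bits per second
power = p_rest + e_spike * rates                      # watts
efficiency = info / power                             # bits per joule

best = rates[np.argmax(efficiency)]
print(f"most energy-efficient spike rate in this toy model: {best:.0f} Hz")
```

Because the information per spike falls as the rate rises while the energy per spike stays fixed, the bits-per-joule curve has an interior maximum rather than increasing with rate, which is the qualitative conclusion the abstract draws.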