
Publications


Featured research published by David B. Kastner.


Cell | 2004

Structure of Ero1p, source of disulfide bonds for oxidative protein folding in the cell.

Einav Gross; David B. Kastner; Chris A. Kaiser; Deborah Fass

The flavoenzyme Ero1p produces disulfide bonds for oxidative protein folding in the endoplasmic reticulum. Disulfides generated de novo within Ero1p are transferred to protein disulfide isomerase and then to substrate proteins by dithiol-disulfide exchange reactions. Despite this key role of Ero1p, little is known about the mechanism by which this enzyme catalyzes thiol oxidation. Here, we present the X-ray crystallographic structure of Ero1p, which reveals the molecular details of the catalytic center, the role of a CXXCXXC motif, and the spatial relationship between functionally significant cysteines and the bound cofactor. Remarkably, the Ero1p active site closely resembles that of the versatile thiol oxidase module of Erv2p, a protein with no sequence homology to Ero1p. Furthermore, both Ero1p and Erv2p display essential dicysteine motifs on mobile polypeptide segments, suggesting that shuttling electrons to a rigid active site using a flexible strand is a fundamental feature of disulfide-generating flavoenzymes.


Nature Neuroscience | 2011

Coordinated dynamic encoding in the retina using opposing forms of plasticity.

David B. Kastner; Stephen A. Baccus

The range of natural inputs encoded by a neuron often exceeds its dynamic range. To overcome this limitation, neural populations divide their inputs among different cell classes, as with rod and cone photoreceptors, and adapt by shifting their dynamic range. We report that the dynamic behavior of retinal ganglion cells in salamanders, mice and rabbits is divided into two opposing forms of short-term plasticity in different cell classes. One population of cells exhibited sensitization—a persistent elevated sensitivity following a strong stimulus. This newly observed dynamic behavior compensates for the information loss caused by the known process of adaptation occurring in a separate cell population. The two populations divide the dynamic range of inputs, with sensitizing cells encoding weak signals and adapting cells encoding strong signals. In the two populations, the linear, threshold and adaptive properties are linked to preserve responsiveness when stimulus statistics change, with one population maintaining the ability to respond when the other fails.


Neuron | 2013

Spatial segregation of adaptation and predictive sensitization in retinal ganglion cells.

David B. Kastner; Stephen A. Baccus

Sensory systems change their sensitivity based on recent stimuli to adjust their response range to the range of inputs and to predict future sensory input. Here, we report the presence of retinal ganglion cells that have antagonistic plasticity, showing central adaptation and peripheral sensitization. Ganglion cell responses were captured by a spatiotemporal model with independently adapting excitatory and inhibitory subunits, and sensitization requires GABAergic inhibition. Using a simple theory of signal detection, we show that the sensitizing surround conforms to an optimal inference model that continually updates the prior signal probability. This indicates that small receptive field regions have dual functionality--to adapt to the local range of signals but sensitize based upon the probability of the presence of that signal. Within this framework, we show that sensitization predicts the location of a nearby object, revealing prediction as a functional role for adapting inhibition in the nervous system.
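The optimal-inference account lends itself to a toy illustration. The sketch below is not the paper's model: the Gaussian observation means and noise SD are invented placeholders. It recursively applies Bayes' rule to update the prior probability that a signal is present; a run of strong observations raises that prior, which in a signal-detection framework would lower the optimal detection threshold, qualitatively mirroring sensitization.

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Density of a normal distribution at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def update_prior(prior, observation, mu_signal=1.0, mu_noise=0.0, sigma=1.0):
    """One Bayesian update of the probability that a signal is present.
    The signal/noise means and the noise SD are illustrative, not measured."""
    like_s = gaussian_pdf(observation, mu_signal, sigma)
    like_n = gaussian_pdf(observation, mu_noise, sigma)
    return like_s * prior / (like_s * prior + like_n * (1 - prior))

# A strong local stimulus (observations near the signal mean) drives the
# prior up; an ideal observer would then detect weaker signals afterwards.
prior = 0.1
for obs in [1.2, 0.9, 1.1]:   # samples consistent with "signal present"
    prior = update_prior(prior, obs)
print(prior)
```

The recursion is the standard odds-form Bayes update; the point of the toy is only that a sustained strong stimulus raises the inferred signal probability, the quantity the sensitizing surround is proposed to track.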


Proceedings of the National Academy of Sciences of the United States of America | 2015

Critical and maximally informative encoding between neural populations in the retina

David B. Kastner; Stephen A. Baccus; Tatyana O. Sharpee

Significance: It is unknown what functional properties influence the number of cell types in the brain. Here we show how one can use a powerful framework from physics that describes the transitions between different phases of matter, such as between liquid and gas, to specify under what conditions it becomes optimal to split neural populations into new subtypes to maximize information transmission. These results outline a conceptual framework that spans both physical and biological systems and can be used to explain the emergence of different functional classes of neuronal types.

Computation in the brain involves multiple types of neurons, yet the organizing principles for how these neurons work together remain unclear. Information theory has offered explanations for how different types of neurons can maximize the transmitted information by encoding different stimulus features. However, recent experiments indicate that separate neuronal types exist that encode the same filtered version of the stimulus, but then the different cell types signal the presence of that stimulus feature with different thresholds. Here we show that the emergence of these neuronal types can be quantitatively described by the theory of transitions between different phases of matter. The two key parameters that control the separation of neurons into subclasses are the mean and standard deviation (SD) of noise affecting neural responses. The average noise across the neural population plays the role of temperature in the classic theory of phase transitions, whereas the SD is equivalent to pressure or magnetic field, in the case of liquid–gas and magnetic transitions, respectively. Our results account for properties of two recently discovered types of salamander Off retinal ganglion cells, as well as the absence of multiple types of On cells. We further show that, across visual stimulus contrasts, retinal circuits continued to operate near the critical point whose quantitative characteristics matched those expected near a liquid–gas critical point and described by the nearest-neighbor Ising model in three dimensions. By operating near a critical point, neural circuits can maximize information transmission in a given environment while retaining the ability to quickly adapt to a new environment.
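As a toy illustration of why splitting a population into subtypes can pay off, the sketch below estimates the information that two binary threshold neurons transmit about a Gaussian stimulus, comparing two redundant copies of one threshold against split thresholds. The stimulus model, threshold values, and noise level are invented for illustration and are not fits to the retinal data.

```python
import numpy as np

rng = np.random.default_rng(1)

def info_bits(thresholds, noise_sd, n=400_000, n_bins=40):
    """Plug-in Monte Carlo estimate (bits) of the mutual information between
    a standard-Gaussian stimulus and the joint binary response of threshold
    neurons corrupted by independent Gaussian noise."""
    s = rng.standard_normal(n)
    code = np.zeros(n, dtype=np.int64)            # joint response as a bit code
    for i, th in enumerate(thresholds):
        spikes = (s + noise_sd * rng.standard_normal(n)) > th
        code |= spikes.astype(np.int64) << i
    s_bin = np.digitize(s, np.linspace(-3.0, 3.0, n_bins - 1))
    joint = np.zeros((n_bins, 2 ** len(thresholds)))
    np.add.at(joint, (s_bin, code), 1.0)          # joint histogram of (s, r)
    p = joint / n
    ps = p.sum(axis=1, keepdims=True)             # stimulus marginal
    pr = p.sum(axis=0, keepdims=True)             # response marginal
    with np.errstate(divide="ignore", invalid="ignore"):
        return float(np.where(p > 0, p * np.log2(p / (ps * pr)), 0.0).sum())

# At low noise, distinct thresholds tile the stimulus range and transmit
# more information than two redundant copies of the same threshold.
same = info_bits([0.0, 0.0], noise_sd=0.3)
split = info_bits([-0.5, 0.5], noise_sd=0.3)
print(split > same)
```

In the paper's framework the noise level plays the role of temperature, so raising `noise_sd` should eventually shift the balance back toward the redundant configuration; this toy estimator can be used to probe where that crossover occurs under these assumed parameters.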


Current Opinion in Neurobiology | 2014

Insights from the retina into the diverse and general computations of adaptation, detection, and prediction.

David B. Kastner; Stephen A. Baccus

The retina performs a diverse set of complex, nonlinear computations, beyond the simple linear photoreceptor weighting assumed in the classical understanding of ganglion cell receptive fields. Here we attempt to organize these computations and extract rules that correspond to three distinct goals of early sensory systems. First, the retina acts efficiently to transmit information to the higher brain for further processing. We observe that although the retina adapts to a number of complex statistics, many of these may be explained by local adaptation to the mean signal strength at that stage. Second, ganglion cells signal the detection of a diverse set of features. Recent results indicate that feature selectivity arises through the action of specific inhibition, rather than through the convergence of excitation as in classical cortical models. Finally, the retina conveys predictions about the stimulus, a function usually attributed only to the higher brain. We expect that computational and mechanistic rules associated with these classes of functions will be an important guide in the study of other neural circuits.


Frontiers in Neuroscience | 2016

A Model of Synaptic Reconsolidation

David B. Kastner; Tilo Schwalger; Lorric Ziegler; Wulfram Gerstner

Reconsolidation of memories has mostly been studied at the behavioral and molecular level. Here, we put forward a simple extension of existing computational models of synaptic consolidation to capture hippocampal slice experiments that have been interpreted as reconsolidation at the synaptic level. The model implements reconsolidation through stabilization of consolidated synapses by stabilizing entities combined with an activity-dependent reservoir of stabilizing entities that are immune to protein synthesis inhibition (PSI). We derive a reduced version of our model to explore the conditions under which synaptic reconsolidation does or does not occur, often referred to as the boundary conditions of reconsolidation. We find that our computational model of synaptic reconsolidation displays complex boundary conditions. Our results suggest that a limited resource of hypothetical stabilizing molecules or complexes, which may be implemented by protein phosphorylation or different receptor subtypes, can underlie the phenomenon of synaptic reconsolidation.
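The reservoir mechanism can be caricatured in a few lines. The toy Euler simulation below is not the published model: every equation and parameter is invented for illustration. It assumes protein synthesis inhibition (PSI) throughout, so no new stabilizing entities are produced and only a finite PSI-immune reservoir can re-stabilize the synapse; the size of that reservoir then decides whether the weight survives reactivation, a boundary-condition-like behavior.

```python
def simulate(reservoir0, n_steps=1000, dt=0.1):
    """Toy Euler integration of a reconsolidation-like system (all dynamics
    invented).  w: consolidated weight; s: bound stabilizing entities;
    r: PSI-immune reservoir.  PSI is on throughout, so r is never refilled."""
    w, s, r = 1.0, 1.0, reservoir0
    for step in range(n_steps):
        t = step * dt
        if t < 5.0:                        # reactivation degrades bound stabilizers
            s += dt * (-0.2 * s)
        rebind = r * (1.0 - s)             # reservoir re-stabilizes the synapse
        s += dt * rebind
        r -= dt * rebind
        w += dt * (-0.05 * w * (1.0 - s))  # an unstabilized weight decays
    return w

# A large PSI-immune reservoir lets the synapse reconsolidate despite PSI;
# a depleted reservoir produces amnesia-like decay of the weight.
w_big = simulate(reservoir0=2.0)
w_small = simulate(reservoir0=0.05)
print(w_big > w_small)
```

The qualitative point matches the abstract's conclusion: a limited pool of stabilizing entities, not the weight dynamics themselves, sets whether reconsolidation occurs in this caricature.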


PLOS Computational Biology | 2018

Inferring hidden structure in multilayered neural circuits

Niru Maheswaranathan; David B. Kastner; Stephen A. Baccus; Surya Ganguli

A central challenge in sensory neuroscience involves understanding how neural circuits shape computations across cascaded cell layers. Here we attempt to reconstruct the response properties of experimentally unobserved neurons in the interior of a multilayered neural circuit, using cascaded linear-nonlinear (LN-LN) models. We combine non-smooth regularization with proximal consensus algorithms to overcome difficulties in fitting such models that arise from the high dimensionality of their parameter space. We apply this framework to retinal ganglion cell processing, learning LN-LN models of retinal circuitry consisting of thousands of parameters, using 40 minutes of responses to white noise. Our models demonstrate a 53% improvement in predicting ganglion cell spikes over classical linear-nonlinear (LN) models. Internal nonlinear subunits of the model match properties of retinal bipolar cells in both receptive field structure and number. Subunits have consistently high thresholds, suppressing all but a small fraction of inputs, leading to sparse activity patterns in which only one subunit drives ganglion cell spiking at any time. From the model’s parameters, we predict that the removal of visual redundancies through stimulus decorrelation across space, a central tenet of efficient coding theory, originates primarily from bipolar cell synapses. Furthermore, the composite nonlinear computation performed by retinal circuitry corresponds to a Boolean OR function applied to bipolar cell feature detectors. Our methods are statistically and computationally efficient, enabling us to rapidly learn hierarchical nonlinear models as well as efficiently compute widely used descriptive statistics such as the spike-triggered average (STA) and covariance (STC) for high dimensional stimuli. This general computational framework may aid in extracting principles of nonlinear hierarchical sensory processing across diverse modalities from limited data.
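A minimal sketch of the LN-LN cascade may make the architecture concrete. The filters are random placeholders and the thresholds are guesses, not the fitted models from the paper: each subunit linearly filters the stimulus, a high-threshold rectifier keeps subunit activity sparse, and the weighted subunit outputs pass through an output nonlinearity to give a firing rate.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x, threshold):
    """Rectifying subunit nonlinearity; a high threshold yields sparse output."""
    return np.maximum(x - threshold, 0.0)

# Hypothetical dimensions: 4 subunits with 40-sample temporal filters.
n_subunits, filter_len, n_time = 4, 40, 1000
subunit_filters = rng.standard_normal((n_subunits, filter_len)) / np.sqrt(filter_len)
subunit_thresholds = np.full(n_subunits, 1.5)    # consistently high thresholds
output_weights = rng.random(n_subunits)

stimulus = rng.standard_normal(n_time)           # white-noise stimulus

# First LN stage: linear filtering then thresholding (bipolar-cell-like subunits).
drive = np.array([np.convolve(stimulus, f, mode="valid") for f in subunit_filters])
subunit_out = relu(drive, subunit_thresholds[:, None])

# Second LN stage: weighted sum through a softplus output nonlinearity.
generator = output_weights @ subunit_out
rate = np.log1p(np.exp(generator))               # non-negative firing rate

print(rate.shape)
```

With thresholds this high, most subunit outputs are exactly zero, so at most one subunit typically drives the output at a time, illustrating the sparse-activity regime the fitted models recovered.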


bioRxiv | 2017

Synchronous inhibitory pathways create both efficiency and diversity in the retina

Mihai Manu; Lane T. McIntosh; David B. Kastner; Benjamin Naecker; Stephen A. Baccus

Visual information is conveyed from the retina to the brain by a diverse set of retinal ganglion cells. Although they have differing nonlinear properties, nearly all ganglion cell receptive fields on average compute a difference in intensity across space and time using a region known as the classical or linear surround [1,2], a property that improves information transmission about natural visual scenes [3,4]. The spatiotemporal visual features that create this fundamental property have not been quantitatively assigned to specific interneurons. Here we describe a generalizable causal approach using simultaneous intracellular and multielectrode recording to directly measure and manipulate the sensory feature conveyed by a neural pathway to a downstream neuron. Analyzing two inhibitory cell classes, horizontal cells and linear amacrine cells, we find that rather than transmitting different temporal features, the two inhibitory pathways act synchronously to create the salamander ganglion cell surround at different spatial scales. Using these measured visual features and theories of efficient coding, we computed a fitness landscape representing the information transmitted using different weightings of the two inhibitory pathways. This theoretical landscape revealed a ridge that maintains near-optimal information transmission while allowing for receptive field diversity. The ganglion cell population showed a striking match to this prediction, concentrating along this ridge across a wide range of positions using different weightings of amacrine or horizontal cell visual features. These results show how parallel neural pathways synthesize a sensory computation, and why this architecture achieves the potentially competing objectives of high information transmission of individual ganglion cells, and diversity among receptive fields.


bioRxiv | 2017

Adaptive feature detection from differential processing in parallel retinal pathways

Yusuf Ozuysal; David B. Kastner; Stephen A. Baccus

To transmit information efficiently in a changing environment, the retina adapts to visual contrast by adjusting its gain, latency and mean response. Additionally, the temporal frequency selectivity, or bandwidth, changes to encode the absolute intensity when the stimulus environment is noisy, and intensity differences when noise is low. We show that the On pathway of On-Off retinal amacrine and ganglion cells is required to change temporal bandwidth but not other adaptive properties. This remarkably specific adaptive mechanism arises from differential effects of contrast on the On and Off pathways. We analyzed a biophysical model fit only to a cell's membrane potential, and verified pharmacologically that it accurately revealed the two pathways. We conclude that changes in bandwidth arise mostly from differences in synaptic threshold in the two pathways, rather than differences in synaptic release dynamics. Different efficient codes are selected by different thresholds in two independently adapting neural pathways.


bioRxiv | 2017

Two-stage adaptation of inhibition mediates predictive sensitization in the retina

David B. Kastner; Georgia Panagiotakos; Yusuf Ozuysal; Stephen A. Baccus

A critical function of the nervous system is the prediction of future sensory input. One such predictive computation is retinal sensitization, a form of short-term plasticity seen in multiple species that elevates local sensitivity following strong local stimulation. Here we perform a causal circuit analysis of retinal sensitization using simultaneous intracellular and multielectrode recording in the salamander. We show, using direct current injection into inhibitory sustained amacrine cells, that a decrease in amacrine transmission is necessary and sufficient, and occurs at the right time and in the right manner, to cause sensitization in ganglion cells. Because of neural dynamics and nonlinear pathways, a computational model is essential to explain how a change in steady inhibitory transmission causes sensitization. Whereas adaptation of excitation removes an expected result in order to transmit novelty, adaptation of inhibition provides a general mechanism to enhance the sensitivity to the sensory feature conveyed by an inhibitory pathway, creating a prediction of future input.

Collaboration


Explore David B. Kastner's collaborations.

Top Co-Authors

Tatyana O. Sharpee

Salk Institute for Biological Studies

Chris A. Kaiser

Massachusetts Institute of Technology
