
Publications


Featured research published by Susanne Schreiber.


Biophysical Journal | 2002

Transbilayer movement of phospholipids at the main phase transition of lipid membranes: implications for rapid flip-flop in biological membranes.

Karin John; Susanne Schreiber; Janek Kubelt; Andreas Herrmann; Peter Müller

The transbilayer movement of fluorescent phospholipid analogs in liposomes was studied at the lipid phase transition of phospholipid membranes. Two NBD-labeled analogs were used, one bearing the fluorescent moiety at a short fatty acid chain in the sn-2 position (C6-NBD-PC) and one headgroup-labeled analog having two long fatty acyl chains (N-NBD-PE). The transbilayer redistribution of the analogs was assessed by a dithionite-based assay. We observed a drastic increase of the transbilayer movement of both analogs at the lipid phase transition of DPPC (Tc = 41°C) and DMPC (Tc = 23°C). The flip-flop of analogs was fast at the Tc of DPPC, with a half-time (t1/2) of ~6-10 min, and even faster at the Tc of DMPC, with a t1/2 below 2 min, as shown for C6-NBD-PC. When the phase transition was suppressed by the addition of cholesterol, the rapid transbilayer movement was abolished. Molecular packing defects at the phase transition are assumed to be responsible for the rapid transbilayer movement. The relevance of these defects for the understanding of flippase activity is discussed.
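
The reported half-times translate into first-order relaxation rates via k = ln 2 / t1/2. The following minimal Python sketch (illustrative only; the half-times and leaflet fractions are placeholder values chosen to match the ranges in the abstract, not the paper's fitted parameters) shows how the outer-leaflet fraction of an analog would relax toward the symmetric 50/50 equilibrium in a dithionite-type readout:

```python
import numpy as np

def outer_leaflet_fraction(t_min, t_half_min, f0=1.0, f_eq=0.5):
    """Fraction of analog in the outer leaflet over time, modeled as a
    single-exponential approach to the symmetric equilibrium (f_eq = 0.5).
    t_half_min is the observed redistribution half-time in minutes."""
    k_obs = np.log(2) / t_half_min            # observed relaxation rate (1/min)
    return f_eq + (f0 - f_eq) * np.exp(-k_obs * t_min)

t = np.linspace(0, 30, 7)                     # minutes
# Illustrative half-times in the ranges reported at the two phase transitions
for t_half in (8.0, 2.0):                     # ~DPPC-like vs ~DMPC-like (min)
    f = outer_leaflet_fraction(t, t_half)
    print(f"t1/2 = {t_half:>4.1f} min:", np.round(f, 3))
```

With the shorter (DMPC-like) half-time, the outer-leaflet fraction reaches the 50/50 equilibrium within a few minutes, mirroring the contrast drawn in the abstract.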


Frontiers in Computational Neuroscience | 2015

State-dependencies of learning across brain scales

Petra Ritter; Jan Born; Michael Brecht; Hubert R. Dinse; Uwe Heinemann; Burkhard Pleger; Dietmar Schmitz; Susanne Schreiber; Arno Villringer; Richard Kempter

Learning is a complex brain function operating on different time scales, from milliseconds to years, which induces enduring changes in brain dynamics. The brain also undergoes continuous “spontaneous” shifts in states, which, amongst others, are characterized by rhythmic activity of various frequencies. Besides the most obvious distinct modes of waking and sleep, wake-associated brain states comprise modulations of vigilance and attention. Recent findings show that certain brain states, particularly during sleep, are essential for learning and memory consolidation. Oscillatory activity plays a crucial role on several spatial scales, for example in plasticity at a synaptic level or in communication across brain areas. However, the underlying mechanisms and computational rules linking brain states and rhythms to learning, though relevant for our understanding of brain function and therapeutic approaches in brain disease, have not yet been elucidated. Here we review known mechanisms of how brain states mediate and modulate learning by their characteristic rhythmic signatures. To understand the critical interplay between brain states, brain rhythms, and learning processes, a wide range of experimental and theoretical work in animal models and human subjects from the single synapse to the large-scale cortical level needs to be integrated. By discussing results from experiments and theoretical approaches, we illuminate new avenues for utilizing neuronal learning mechanisms in developing tools and therapies, e.g., for stroke patients and to devise memory enhancement strategies for the elderly.


Neural Computation | 2002

Energy-efficient coding with discrete stochastic events

Susanne Schreiber; Christian K. Machens; Andreas V. M. Herz; Simon B. Laughlin

We investigate the energy efficiency of signaling mechanisms that transfer information by means of discrete stochastic events, such as the opening or closing of an ion channel. Using a simple model for the generation of graded electrical signals by sodium and potassium channels, we find optimum numbers of channels that maximize energy efficiency. The optima depend on several factors: the relative magnitudes of the signaling cost (current flow through channels), the fixed cost of maintaining the system, the reliability of the input, additional sources of noise, and the relative costs of upstream and downstream mechanisms. We also analyze how the statistics of input signals influence energy efficiency. We find that energy-efficient signal ensembles favor a bimodal distribution of channel activations and contain only a very small fraction of large inputs when energy is scarce. We conclude that when energy use is a significant constraint, trade-offs between information transfer and energy can strongly influence the number of signaling molecules and synapses used by neurons and the manner in which these mechanisms represent information.
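
As an illustration of the trade-off described here, the sketch below computes bits per unit energy for a toy readout in which a binary stimulus sets the opening probability of N independent channels. The cost model (baseline, per-channel, and per-opening terms), the opening probabilities, and the helper name mutual_info_bits are hypothetical choices, not the paper's model of graded sodium/potassium signaling:

```python
import numpy as np
from scipy.stats import binom

def mutual_info_bits(N, p0, p1):
    """I(S;K) for an equiprobable binary stimulus S read out as the number K
    of open channels out of N, with K ~ Binomial(N, p_S)."""
    k = np.arange(N + 1)
    pk0, pk1 = binom.pmf(k, N, p0), binom.pmf(k, N, p1)
    pk = 0.5 * (pk0 + pk1)
    h = lambda p: -np.sum(p[p > 0] * np.log2(p[p > 0]))   # entropy in bits
    return h(pk) - 0.5 * (h(pk0) + h(pk1))

# Hypothetical cost model: a fixed maintenance cost for the whole system,
# a per-channel cost, and a signaling cost per channel opening
baseline, per_channel, per_opening = 20.0, 0.1, 1.0
p0, p1 = 0.1, 0.6                                         # opening probabilities

def efficiency(N):
    energy = baseline + per_channel * N + per_opening * N * 0.5 * (p0 + p1)
    return mutual_info_bits(N, p0, p1) / energy

best_N = max(range(1, 101), key=efficiency)
print("channel number maximizing bits per unit energy:", best_N)
```

With a sizable fixed cost, information per energy first rises and then falls with N, giving an interior optimum in the spirit of the optima described above.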


Proceedings of the National Academy of Sciences of the United States of America | 2011

Efficient transformation of an auditory population code in a small sensory system

Jan Clemens; Olaf Kutzki; Bernhard Ronacher; Susanne Schreiber; Sandra Wohlgemuth

Optimal coding principles are implemented in many large sensory systems. They include the systematic transformation of external stimuli into a sparse and decorrelated neuronal representation, enabling a flexible readout of stimulus properties. Are these principles also applicable to size-constrained systems, which have to rely on a limited number of neurons and may only have to fulfill specific and restricted tasks? We studied this question in an insect system—the early auditory pathway of grasshoppers. Grasshoppers use genetically fixed songs to recognize mates. The first steps of neural processing of songs take place in a small three-layer feed-forward network comprising only a few dozen neurons. We analyzed the transformation of the neural code within this network. Indeed, grasshoppers create a decorrelated and sparse representation, in accordance with optimal coding theory. Whereas the neuronal input layer is best read out as a summed population, a labeled-line population code for temporal features of the song is established after only two processing steps. At this stage, information about song identity is maximal for a population decoder that preserves neuronal identity. We conclude that optimal coding principles do apply to the early auditory system of the grasshopper, despite its size constraints. The inputs, however, are not encoded in a systematic, map-like fashion as in many larger sensory systems. Already at its periphery, part of the grasshopper auditory system seems to focus on behaviorally relevant features, and is in this property more reminiscent of higher sensory areas in vertebrates.
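
A hedged sketch of the kind of population metrics such an analysis rests on: lifetime sparseness (in the Vinje-Gallant form) and mean pairwise response correlations, computed here on random placeholder firing rates rather than on the grasshopper data:

```python
import numpy as np

def lifetime_sparseness(r):
    """Vinje-Gallant sparseness of one neuron's responses r across stimuli:
    0 for a dense (uniform) response profile, 1 for a maximally sparse one."""
    r = np.asarray(r, float)
    n = r.size
    return (1 - (r.mean() ** 2) / np.mean(r ** 2)) / (1 - 1.0 / n)

rng = np.random.default_rng(0)
# Toy response matrix (neurons x stimuli); stands in for measured firing rates
rates = rng.gamma(shape=1.5, scale=10.0, size=(20, 50))

sparseness = np.array([lifetime_sparseness(row) for row in rates])
corr = np.corrcoef(rates)                      # neuron-by-neuron correlations
off_diag = corr[~np.eye(len(corr), dtype=bool)]
print("mean sparseness:", sparseness.mean().round(3))
print("mean pairwise correlation:", off_diag.mean().round(3))
```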


Proceedings of the National Academy of Sciences of the United States of America | 2012

Grid cells in rat entorhinal cortex encode physical space with independent firing fields and phase precession at the single-trial level

Eric T. Reifenstein; Richard Kempter; Susanne Schreiber; Martin Stemmler; Andreas V. M. Herz

When a rat moves, grid cells in its entorhinal cortex become active in multiple regions of the external world that form a hexagonal lattice. As the animal traverses one such “firing field,” spikes tend to occur at successively earlier theta phases of the local field potential. This phenomenon is called phase precession. Here, we show that spike phases provide 80% more spatial information than spike counts and that they improve position estimates from single neurons down to a few centimeters. To understand what limits the resolution and how variable spike phases are across different field traversals, we analyze spike trains run by run. We find that the multiple firing fields of a grid cell operate as independent elements for encoding physical space. In addition, phase precession is significantly stronger than the pooled-run data suggest. Despite the inherent stochasticity of grid-cell firing, phase precession is therefore a robust phenomenon at the single-trial level, making a theta-phase code for spatial navigation feasible.
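
Single-run phase precession is typically quantified by a circular-linear fit of spike theta phase against position within the firing field. The sketch below implements a simple grid-search version of such a fit on synthetic data; the parameter values and the helper name circular_linear_slope are illustrative assumptions, not the paper's exact procedure:

```python
import numpy as np

def circular_linear_slope(x, phase, slope_range=(-2.0, 2.0), n_grid=2001):
    """Fit phase ~ 2*pi*a*x + phi0 by maximizing the mean resultant length
    over candidate slopes a (in cycles per unit of x); returns (a, phi0)."""
    slopes = np.linspace(*slope_range, n_grid)
    R = [np.abs(np.mean(np.exp(1j * (phase - 2 * np.pi * a * x)))) for a in slopes]
    a = slopes[int(np.argmax(R))]
    phi0 = np.angle(np.mean(np.exp(1j * (phase - 2 * np.pi * a * x))))
    return a, phi0

# Synthetic single-run data: spike phases precess across a normalized field
rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0, 1, 40))                 # position within field (0..1)
phase = (-2 * np.pi * 0.8 * x + 1.0 + rng.normal(0, 0.5, x.size)) % (2 * np.pi)
a, phi0 = circular_linear_slope(x, phase)
print(f"fitted slope: {a:.2f} cycles per field traversal")
```

A negative fitted slope indicates spikes arriving at progressively earlier theta phases as the field is traversed.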


PLOS Computational Biology | 2016

Inhibition as a Binary Switch for Excitatory Plasticity in Pyramidal Neurons

Katharina Anna Wilmes; Henning Sprekeler; Susanne Schreiber

Synaptic plasticity is thought to induce memory traces in the brain that are the foundation of learning. To ensure the stability of these traces in the presence of further learning, however, a regulation of plasticity appears beneficial. Here, we take up the recent suggestion that dendritic inhibition can switch plasticity of excitatory synapses on and off by gating backpropagating action potentials (bAPs) and calcium spikes, i.e., by gating the coincidence signals required for Hebbian forms of plasticity. We analyze temporal and spatial constraints of such a gating and investigate whether it is possible to suppress bAPs without a simultaneous annihilation of the forward-directed information flow via excitatory postsynaptic potentials (EPSPs). In a computational analysis of conductance-based multi-compartmental models, we demonstrate that a robust control of bAPs and calcium spikes is possible in an all-or-none manner, enabling a binary switch of coincidence signals and plasticity. The position of inhibitory synapses on the dendritic tree determines the spatial extent of the effect and allows a pathway-specific regulation of plasticity. With appropriate timing, EPSPs can still trigger somatic action potentials, although backpropagating signals are abolished. An annihilation of bAPs requires precisely timed inhibition, while the timing constraints are less stringent for distal calcium spikes. We further show that a widespread motif of local circuits, feedforward inhibition, is well suited to provide the temporal precision needed for the control of bAPs. Altogether, our model provides experimentally testable predictions and demonstrates that the inhibitory switch of plasticity can be a robust and attractive mechanism, hence assigning an additional function to the inhibitory elements of neuronal microcircuits beyond modulation of excitability.
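
To make the gating idea concrete, here is a deliberately reduced two-compartment caricature (passive soma and dendrite, Euler integration, arbitrary units) showing how a timed shunting dendritic conductance attenuates a backpropagated depolarization while the somatic event itself is unaffected. This is an assumption-laden toy, not the paper's conductance-based multi-compartmental models:

```python
import numpy as np

def dendritic_peak(inh_on, dt=0.01, T=50.0):
    """Peak dendritic depolarization (above rest) in a passive two-compartment
    caricature; a brief somatic current pulse stands in for the action
    potential, and dendritic shunting inhibition is switched on or off."""
    C, gL, gc = 1.0, 0.1, 0.05           # capacitance, leak, coupling (arb. units)
    g_inh = 1.0 if inh_on else 0.0       # shunting conductance (reversal at rest)
    vs = vd = 0.0                        # voltages relative to rest
    peak = 0.0
    for step in range(int(T / dt)):
        t = step * dt
        i_soma = 5.0 if 5.0 <= t < 7.0 else 0.0     # "spike" current pulse
        dvs = (-gL * vs + gc * (vd - vs) + i_soma) / C
        dvd = (-gL * vd + gc * (vs - vd) - g_inh * vd) / C
        vs += dt * dvs
        vd += dt * dvd
        peak = max(peak, vd)
    return peak

print("dendritic peak without inhibition:", round(dendritic_peak(False), 2))
print("dendritic peak with shunting inhibition:", round(dendritic_peak(True), 2))
```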


PLOS ONE | 2013

Somatic versus Dendritic Resonance: Differential Filtering of Inputs through Non-Uniform Distributions of Active Conductances

Ekaterina Zhuchkova; Michiel W. H. Remme; Susanne Schreiber

Synaptic inputs to neurons are processed in a frequency-dependent manner, with either low-pass or resonant response characteristics. These types of filtering play a key role in the frequency-specific information flow in neuronal networks. While the generation of resonance by specific ionic conductances is well investigated, less attention has been paid to the spatial distribution of the resonance-generating conductances across a neuron. In pyramidal neurons, one of the major excitatory cell types in the mammalian brain, a steep gradient of resonance-generating h-conductances with a 60-fold increase towards distal dendrites has been demonstrated experimentally. Because the dendritic trees of these cells are large, spatial compartmentalization of resonant properties can be expected. Here, we use mathematical descriptions of spatially extended neurons to investigate the consequences of such a distal, dendritic localization of h-conductances for signal processing. While neurons with short dendrites do not exhibit a pronounced compartmentalization of resonance, i.e., the filter properties of dendrites and soma are similar, we find that neurons with longer dendrites (on the order of the electrotonic space constant or longer) can show distinct filtering of dendritic and somatic inputs due to electrotonic segregation. Moreover, we show that for such neurons, experimental classification as resonant versus nonresonant can be misleading when based on somatic recordings, because for these morphologies a dendritic resonance could easily be undetectable when using somatic input. Nevertheless, noise-driven membrane-potential oscillations caused by dendritic resonance can propagate to the soma where they can be recorded, hence contrasting with the low-pass filtering at the soma. We conclude that non-uniform distributions of active conductances can underlie differential filtering of synaptic input in neurons with spatially extended dendrites, like pyramidal neurons, bearing relevance for the localization-dependent targeting of synaptic input pathways to these cells.
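
The resonance discussed here can be illustrated with the standard linearized single-compartment membrane model, in which a slow h-type feedback variable shapes the impedance profile. The sketch below evaluates |Z(f)| for such a model with illustrative parameters (arbitrary units, not fitted to pyramidal-neuron data, and without the spatial structure that is the focus of the paper):

```python
import numpy as np

def impedance(f_hz, C=1.0, gL=0.05, g_h=0.3, tau_h=100.0):
    """|Z(f)| of a linearized membrane with a slow resonant (h-type) feedback:
    C dV/dt = -gL*V - g_h*w + I,  tau_h dw/dt = V - w  (arbitrary units,
    tau_h in ms)."""
    w = 2 * np.pi * np.asarray(f_hz) / 1000.0     # angular frequency in rad/ms
    return np.abs(1.0 / (1j * w * C + gL + g_h / (1 + 1j * w * tau_h)))

f = np.linspace(0.1, 20, 500)                     # Hz
z = impedance(f)
print(f"resonance peak near {f[int(np.argmax(z))]:.1f} Hz")
```

The impedance peaks at an intermediate frequency because the slow feedback suppresses low-frequency responses while the capacitance shunts high frequencies.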


eLife | 2014

Cell-intrinsic mechanisms of temperature compensation in a grasshopper sensory receptor neuron

Frederic A Roemschied; Monika J. B. Eberhard; Jan-Hendrik Schleimer; Bernhard Ronacher; Susanne Schreiber

Changes in temperature affect biochemical reaction rates and, consequently, neural processing. The nervous systems of poikilothermic animals must have evolved mechanisms enabling them to retain their functionality under varying temperatures. Auditory receptor neurons of grasshoppers respond to sound in a surprisingly temperature-compensated manner: firing rates depend moderately on temperature, with average Q10 values around 1.5. Analysis of conductance-based neuron models reveals that temperature compensation of spike generation can be achieved solely relying on cell-intrinsic processes and despite a strong dependence of ion conductances on temperature. Remarkably, this type of temperature compensation need not come at an additional metabolic cost of spike generation. Firing rate-based information transfer is likely to increase with temperature and we derive predictions for an optimal temperature dependence of the tympanal transduction process fostering temperature compensation. The example of auditory receptor neurons demonstrates how neurons may exploit single-cell mechanisms to cope with multiple constraints in parallel. DOI: http://dx.doi.org/10.7554/eLife.02078.001
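
The Q10 measure used here follows the standard relation rate(T) = rate(T_ref) * Q10^((T - T_ref)/10). A small sketch with hypothetical rate values (only the firing-rate Q10 of roughly 1.5 is taken from the abstract; the gating-rate Q10 of 3 is a generic textbook-style assumption):

```python
def q10_scale(rate_ref, q10, T, T_ref):
    """Scale a rate from T_ref to T using the Q10 relation."""
    return rate_ref * q10 ** ((T - T_ref) / 10.0)

def q10_from_rates(r1, T1, r2, T2):
    """Estimate Q10 from rates measured at two temperatures."""
    return (r2 / r1) ** (10.0 / (T2 - T1))

# Hypothetical numbers: channel kinetics speed up steeply (Q10 ~ 3), while the
# receptor's firing rate changes only moderately (Q10 ~ 1.5, as in the paper)
print("gating rate at 30°C :", round(q10_scale(100.0, 3.0, 30.0, 20.0), 1), "1/s")
print("firing-rate Q10     :", round(q10_from_rates(80.0, 20.0, 120.0, 30.0), 2))
```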


Cell Reports | 2016

Cell Type-Specific Differences in Spike Timing and Spike Shape in the Rat Parasubiculum and Superficial Medial Entorhinal Cortex

Christian Laut Ebbesen; Eric T. Reifenstein; Qiusong Tang; Andrea Burgalossi; Saikat Ray; Susanne Schreiber; Richard Kempter; Michael Brecht

The medial entorhinal cortex (MEC) and the adjacent parasubiculum are known for their elaborate spatial discharges (grid cells, border cells, etc.) and the precession of spikes relative to the local field potential. We know little, however, about how spatio-temporal firing patterns map onto cell types. We find that cell type is a major determinant of spatio-temporal discharge properties. Parasubicular neurons and MEC layer 2 (L2) pyramids have shorter spikes, discharge spikes in bursts, and are theta-modulated (rhythmic, locking, skipping), but spikes phase-precess only weakly. MEC L2 stellates and layer 3 (L3) neurons have longer spikes, do not discharge in bursts, and are weakly theta-modulated (non-rhythmic, weakly locking, rarely skipping), but spikes steeply phase-precess. The similarities between MEC L3 neurons and MEC L2 stellates on one hand and parasubicular neurons and MEC L2 pyramids on the other hand suggest two distinct streams of temporal coding in the parahippocampal cortex.


Biophysical Journal | 2001

Stochastic Simulation of Hemagglutinin-Mediated Fusion Pore Formation

Susanne Schreiber; Kai Ludwig; Andreas Herrmann; Hermann-Georg Holzhütter

Studies on fusion between cell pairs have provided evidence that opening and subsequent dilation of a fusion pore are stochastic events. Therefore, adequate modeling of fusion pore formation requires a stochastic approach. Here we present stochastic simulations of hemagglutinin (HA)-mediated fusion pore formation between HA-expressing cells and erythrocytes based on numerical solutions of a master equation. The following elementary processes are taken into account: 1) lateral diffusion of HA-trimers and receptors, 2) aggregation of HA-trimers to immobilized clusters, 3) reversible formation of HA-receptor contacts, and 4) irreversible conversion of HA-receptor contacts into stable links between HA and the target membrane. The contact sites between fusing cells are modeled as superimposed square lattices. The model simulates well the statistical distribution of time delays measured for the various intermediates of fusion pore formation between cell-cell fusion complexes. In particular, these are the formation of small ion-permissive and subsequent lipid-permissive fusion pores detected experimentally (R. Blumenthal, D. P. Sarkar, S. Durell, D. E. Howard, and S. J. Morris, 1996, J. Cell Biol. 135:63-71). Moreover, by averaging the simulated individual stochastic time courses across a larger population of cell-cell complexes, the model also provides a reasonable description of kinetic measurements on lipid mixing in cell suspensions (T. Danieli, S. L. Pelletier, Y. I. Henis, and J. M. White, 1996, J. Cell Biol. 133:559-569).
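
The elementary processes listed above lend themselves to exact stochastic simulation. The sketch below runs a Gillespie-type simulation of just two of them, reversible HA-receptor contact formation and irreversible conversion of contacts into stable links, with made-up molecule numbers and rate constants; the paper itself solves a master equation for the full set of processes, including lateral diffusion and cluster aggregation:

```python
import numpy as np

rng = np.random.default_rng(2)

def gillespie_contacts(n_ha=50, n_rec=50, k_on=0.01, k_off=0.1, k_conv=0.02,
                       t_end=200.0):
    """Gillespie simulation of reversible HA-receptor contact formation and
    irreversible conversion of contacts into stable links (illustrative rates).
    State: free HA, free receptors, reversible contacts, stable links."""
    ha, rec, contact, link, t = n_ha, n_rec, 0, 0, 0.0
    while t < t_end:
        rates = np.array([k_on * ha * rec, k_off * contact, k_conv * contact])
        total = rates.sum()
        if total == 0:
            break
        t += rng.exponential(1.0 / total)          # time to next event
        event = rng.choice(3, p=rates / total)     # which event occurs
        if event == 0:      # contact forms
            ha, rec, contact = ha - 1, rec - 1, contact + 1
        elif event == 1:    # contact dissociates
            ha, rec, contact = ha + 1, rec + 1, contact - 1
        else:               # contact converts into a stable link
            contact, link = contact - 1, link + 1
    return link

links = [gillespie_contacts() for _ in range(20)]
print("stable links after t_end (20 runs):", links)
```

Run-to-run variability in the number of stable links reflects the stochasticity that motivates the master-equation treatment in the paper.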

Collaboration


Dive into Susanne Schreiber's collaborations.

Top Co-Authors

Jan-Hendrik Schleimer | Humboldt University of Berlin
Bernhard Ronacher | Humboldt University of Berlin
Eric T. Reifenstein | Humboldt University of Berlin
Janina Hesse | Humboldt University of Berlin
Richard Kempter | Humboldt University of Berlin
Michiel W. H. Remme | Humboldt University of Berlin
Monika J. B. Eberhard | Humboldt University of Berlin
Andreas Herrmann | Humboldt University of Berlin
Ekaterina Zhuchkova | Humboldt University of Berlin
Frederic A Roemschied | Humboldt University of Berlin