Network


Latest external collaborations at the country level. Dive into the details by clicking on the dots.

Hotspot


Dive into the research topics where Jannis Schuecker is active.

Publication


Featured research published by Jannis Schuecker.


Physical Review E | 2015

Modulated escape from a metastable state driven by colored noise

Jannis Schuecker; Markus Diesmann; Moritz Helias

Many phenomena in nature are described by excitable systems driven by colored noise. The temporal correlations in the fluctuations hinder an analytical treatment. We here present a general method of reduction to a white-noise system, capturing the color of the noise by effective and time-dependent boundary conditions. We apply the formalism to a model of the excitability of neuronal membranes, the leaky integrate-and-fire neuron model, revealing an analytical expression for the linear response of the system valid up to moderate frequencies. The closed-form analytical expression enables the characterization of the response properties of such excitable units and the assessment of oscillations emerging in networks thereof.
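The setting above can be sketched numerically. The following minimal example, a sketch with assumed parameter values rather than the paper's model or method, simulates a leaky integrate-and-fire neuron driven by Ornstein-Uhlenbeck (colored) noise and counts threshold crossings:

```python
import numpy as np

# Minimal sketch (illustrative parameters, not taken from the paper):
# a leaky integrate-and-fire neuron driven by Ornstein-Uhlenbeck noise,
# whose correlation time tau_s gives the fluctuations their "color".
def simulate_lif_colored_noise(t_max=1.0, dt=1e-4, tau_m=0.02, tau_s=0.005,
                               mu=0.9, sigma=0.5, v_th=1.0, v_reset=0.0,
                               seed=0):
    rng = np.random.default_rng(seed)
    n_steps = int(round(t_max / dt))
    v, eta = v_reset, 0.0
    spike_times = []
    for step in range(n_steps):
        # Euler-Maruyama update of the OU noise (stationary std ~ sigma)
        eta += -eta / tau_s * dt + sigma * np.sqrt(2 * dt / tau_s) * rng.standard_normal()
        # leaky integration of the mean drive plus the colored fluctuation
        v += (-v + mu + eta) / tau_m * dt
        if v >= v_th:  # threshold crossing: record spike and reset
            spike_times.append(step * dt)
            v = v_reset
    return np.array(spike_times)

spikes = simulate_lif_colored_noise()
rate = spikes.size / 1.0  # mean firing rate in spikes per second
```

With the mean drive slightly below threshold, spikes here are noise-driven escapes from the subthreshold (metastable) state, which is the regime the abstract addresses.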


PLOS Computational Biology | 2017

Fundamental Activity Constraints Lead to Specific Interpretations of the Connectome

Jannis Schuecker; Maximilian Schmidt; Sacha J. van Albada; Markus Diesmann; Moritz Helias

The continuous integration of experimental data into coherent models of the brain is an increasing challenge of modern neuroscience. Such models provide a bridge between structure and activity, and identify the mechanisms giving rise to experimental observations. Nevertheless, structurally realistic network models of spiking neurons are necessarily underconstrained even if experimental data on brain connectivity are incorporated to the best of our knowledge. Guided by physiological observations, any model must therefore explore the parameter ranges within the uncertainty of the data. Based on simulation results alone, however, the mechanisms underlying stable and physiologically realistic activity often remain obscure. We here employ a mean-field reduction of the dynamics, which allows us to include activity constraints into the process of model construction. We shape the phase space of a multi-scale network model of the vision-related areas of macaque cortex by systematically refining its connectivity. Fundamental constraints on the activity, i.e., prohibiting quiescence and requiring global stability, prove sufficient to obtain realistic layer- and area-specific activity. Only small adaptations of the structure are required, showing that the network operates close to an instability. The procedure identifies components of the network critical to its collective dynamics and creates hypotheses for structural data and future experiments. The method can be applied to networks involving any neuron model with a known gain function.
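The core step, incorporating activity constraints through a mean-field reduction, can be illustrated at toy scale. The sketch below assumes a tanh gain function and a hypothetical two-population coupling matrix (the paper uses the LIF gain function and a multi-area macaque model): it finds self-consistent rates r = phi(W r + h) and checks the linear stability of the fixed point, the two constraints named in the abstract.

```python
import numpy as np

def mean_field_fixed_point(W, h, phi=np.tanh, n_iter=500):
    # simple fixed-point iteration for the self-consistency r = phi(W r + h)
    r = np.zeros(len(h))
    for _ in range(n_iter):
        r = phi(W @ r + h)
    return r

def is_stable(W, h, r, dphi=lambda x: 1 - np.tanh(x) ** 2):
    # linearize around the fixed point: Jacobian -I + diag(phi'(x)) W;
    # stability requires all eigenvalues to have negative real part
    x = W @ r + h
    jac = -np.eye(len(h)) + np.diag(dphi(x)) @ W
    return np.max(np.linalg.eigvals(jac).real) < 0

W = np.array([[0.5, -1.0],   # assumed toy excitatory/inhibitory couplings
              [0.8, -0.3]])
h = np.array([0.2, 0.1])     # assumed external drive
r = mean_field_fixed_point(W, h)
stable = is_stable(W, h, r)
```

In the spirit of the abstract, one would reject connectivities for which the rates are quiescent or the fixed point unstable, and adapt W until both constraints hold. The method carries over to any gain function phi with known derivative.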


Frontiers in Neuroinformatics | 2017

Integration of Continuous-Time Dynamics in a Spiking Neural Network Simulator

Jan Hahne; David Dahmen; Jannis Schuecker; Andreas Frommer; Matthias Bolten; Moritz Helias; Markus Diesmann

Contemporary modeling approaches to the dynamics of neural networks include two important classes of models: biologically grounded spiking neuron models and functionally inspired rate-based units. We present a unified simulation framework that supports the combination of the two for multi-scale modeling, enables the quantitative validation of mean-field approaches by spiking network simulations, and provides an increase in reliability by usage of the same simulation code and the same network model specifications for both model classes. While most spiking simulations rely on the communication of discrete events, rate models require time-continuous interactions between neurons. Exploiting the conceptual similarity to the inclusion of gap junctions in spiking network simulations, we arrive at a reference implementation of instantaneous and delayed interactions between rate-based models in a spiking network simulator. The separation of rate dynamics from the general connection and communication infrastructure ensures flexibility of the framework. In addition to the standard implementation we present an iterative approach based on waveform-relaxation techniques to reduce communication and increase performance for large-scale simulations of rate-based models with instantaneous interactions. Finally we demonstrate the broad applicability of the framework by considering various examples from the literature, ranging from random networks to neural-field models. The study provides the prerequisite for interactions between rate-based and spiking models in a joint simulation.
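The waveform-relaxation technique mentioned above can be sketched generically. The following toy example, an illustration of the idea rather than the NEST implementation, compares a Jacobi-type waveform-relaxation iteration on a two-unit linear rate network against a directly coupled forward-Euler reference: each sweep integrates every unit over the whole interval using the previous sweep's trajectories in the coupling term, so units exchange whole waveforms once per sweep instead of communicating at every time step.

```python
import numpy as np

def direct_euler(W, h, tau=0.01, t_max=0.1, dt=1e-4):
    # reference: forward-Euler with instantaneous coupling at every step
    n_steps = int(round(t_max / dt))
    r = np.zeros((n_steps + 1, len(h)))
    for k in range(n_steps):
        r[k + 1] = r[k] + dt / tau * (-r[k] + r[k] @ W.T + h)
    return r

def waveform_relaxation(W, h, tau=0.01, t_max=0.1, dt=1e-4, sweeps=20):
    # Jacobi waveform relaxation: the coupling term uses the PREVIOUS
    # sweep's trajectory, decoupling the units within each sweep
    n_steps = int(round(t_max / dt))
    traj = np.zeros((n_steps + 1, len(h)))  # initial guess: zero waveforms
    for _ in range(sweeps):
        new = np.zeros_like(traj)
        for k in range(n_steps):
            new[k + 1] = new[k] + dt / tau * (-new[k] + traj[k] @ W.T + h)
        traj = new
    return traj

W = np.array([[0.0, 0.4], [0.3, 0.0]])  # assumed toy coupling between two units
h = np.array([1.0, 0.5])                # constant external drive
r_wr = waveform_relaxation(W, h)
r_ref = direct_euler(W, h)
```

The iteration's fixed point is exactly the directly coupled solution, and for a finite interval the sweeps converge rapidly, which is what makes the scheme attractive for reducing communication in distributed simulations.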


BMC Neuroscience | 2015

Identifying and exploiting the anatomical origin of population rate oscillations in multi-layered spiking networks

Hannah Bos; Jannis Schuecker; Markus Diesmann; Moritz Helias

Fast oscillations of the population firing rate in the high gamma range (50-200 Hz), where individual neurons fire slowly and irregularly, are observed in the living brain and in network models of leaky integrate-and-fire (LIF) neurons, which have also been studied analytically [1]. However, a systematic approach for identifying the sub-circuits responsible for specific oscillations in a structured network of neural populations is currently unavailable. We consider a large-scale neural network consisting of 4 layers, each composed of an excitatory and an inhibitory population of LIF neurons, with connectivity determined by electrophysiological and anatomical studies [2]. In simulations we observe a peak in the power spectrum around 83 Hz in all populations and low-frequency oscillations with smaller power in a subset of the populations. Mapping the dynamics of the fluctuations to an effective linear rate model, using the recently derived transfer function for LIF neurons with synaptic filtering [3], we derive the spectra of the population firing rates analytically. Decomposing the noise-driven fluctuations into eigenmodes of the effective connectivity, we identify the modes responsible for peaks in the spectra. Applying perturbation theory, we quantify the influence of individual anatomical connections on the spectrum at given frequencies and identify a sub-circuit, localized in the supragranular and granular layers, that generates the oscillation. These findings agree with layer-specific local field potential measurements in macaque primary visual cortex, where gamma-frequency oscillations were most pronounced in layers 2/3 and 4B [4]. We exploit this method i) to identify the connectivity loops responsible for the observed peaks and ii) to alter the circuitry in a targeted manner to control the position and amplitude of the peaks and the generation of low-frequency fluctuations. This requires removal and addition of only small numbers of synapses.
The analytical framework moreover explains the suppression of higher frequencies by distributed delays and the amplification of population-specific oscillatory input. Mapping the stimulus vector onto the eigenmodes of the system shows how the components of the input vector are processed in the network. Thus one can derive the sensitivity of the population rate dynamics to the direction and frequency of stimuli. Our method finds application in the identification of the connectivity loops that determine emergent and externally driven global measures of activity observable in experiments, as well as in engineering circuits that exhibit desired correlations on the population level.
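The eigenmode decomposition underlying the analysis can be illustrated with a generic linear rate model (a sketch with an assumed toy excitatory-inhibitory coupling matrix, not the paper's LIF transfer function): for tau*dr/dt = -r + W r + noise, each eigenvalue lambda of the effective connectivity W contributes a mode with power spectrum proportional to 1/|1 - lambda + 2*pi*i*f*tau|^2, so a complex eigenvalue close to one produces a spectral peak near f = Im(lambda)/(2*pi*tau).

```python
import numpy as np

tau = 0.005                                # assumed 5 ms effective time constant
W = np.array([[1.0, -2.1],                 # assumed toy excitatory-inhibitory loop
              [1.4, -0.9]])
lams = np.linalg.eigvals(W)                # complex pair -> oscillatory mode

f = np.linspace(1.0, 200.0, 2000)          # frequency axis in Hz
# per-mode contribution to the rate power spectrum of the linear model
spectra = 1.0 / np.abs(1.0 - lams[:, None] + 2j * np.pi * f[None, :] * tau) ** 2
total = spectra.sum(axis=0)                # sum over eigenmodes
f_peak = f[np.argmax(total)]               # gamma-range peak from the complex mode
```

Inspecting which mode dominates a peak, and which anatomical connections that mode loads on, is the logic by which the abstract localizes the oscillation to a specific sub-circuit.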


BMC Neuroscience | 2014

The transfer function of the LIF model: from white to filtered noise.

Jannis Schuecker; Markus Diesmann; Moritz Helias

The theory describing correlated activity emerging in recurrent networks relies on the single-neuron response to a modulation of its input, i.e., the transfer function. For the leaky integrate-and-fire neuron model exposed to unfiltered synaptic noise the transfer function can be derived analytically [1,2]. In this context the effect of synaptic filtering on the response properties was also studied intensively at the beginning of the last decade [3,4]. Analytical results were derived in the low- as well as the high-frequency limit. The main finding is that the linear response amplitude of model neurons exposed to filtered synaptic noise does not decay to zero in the high-frequency limit. A numerical method has also been developed to study the influence of synaptic noise on the response properties [5]. Here we first revisit the transfer function for neuron models without synaptic filtering and simplify the derivation by exploiting analogies between the one-dimensional Fokker-Planck equation and the quantum harmonic oscillator. We treat the problem of synaptic filtering with short time constants by reducing the corresponding two-dimensional Fokker-Planck equation to one dimension with effective boundary conditions [6]. To this end we use the static and dynamic boundary conditions derived earlier by a perturbative treatment of the arising boundary-layer problem [4]. Finally we compare the analytical results to direct simulations (Fig. 1) and observe that the approximations are valid up to frequencies in the gamma range (60-80 Hz). Deviations are explained by the nature of the approximations.
Figure 1: (A) Linear response amplitude for neurons exposed to colored (red) and white (black) noise; simulations (dots) and analytical results (curves). (B) Phase shift of the linear response.
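The analogy between the Fokker-Planck equation and the quantum harmonic oscillator mentioned above can be stated compactly. The following is a hedged sketch of the standard Ornstein-Uhlenbeck correspondence; conventions and prefactors may differ from those used in the paper.

```latex
% In dimensionless units $u = \sqrt{2}\,(V-\mu)/\sigma$ the Fokker-Planck
% equation of the free (no-threshold) LIF dynamics reads
\tau\,\partial_t P(u,t) = \partial_u\left(u + \partial_u\right) P(u,t).
% The similarity transform $P = e^{-u^2/4}\,q$ maps the operator to (minus)
% the number operator of the quantum harmonic oscillator,
e^{u^2/4}\,\partial_u\left(u + \partial_u\right)\,e^{-u^2/4} = -a^\dagger a,
\qquad a = \tfrac{u}{2} + \partial_u,\quad a^\dagger = \tfrac{u}{2} - \partial_u,
% so the eigenfunctions are Hermite functions, the spectrum is $0,-1,-2,\dots$,
% and the absorbing threshold enters only through the boundary conditions.
```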


Archive | 2017

NEST 2.12.0

Susanne Kunkel; Rajalekshmi Deepu; Hans E. Plesser; Bruno Golosio; Mikkel Elle Lepperød; Jochen Martin Eppler; Sepehr Mahmoudian; Jan Hahne; Dimitri Plotnikov; Claudia Bachmann; Alexander Peyser; Tanguy Fardet; Till Schumann; Jakob Jordan; Ankur Sinha; Oliver Breitwieser; Abigail Morrison; Tammo Ippen; Hendrik Rothe; Steffen Graber; Hesam Setareh; Jesús Garrido; Dennis Terhorst; Alexey Shusharin; Hannah Bos; Arjun Rao; Alex Seeholzer; Mikael Djurfeldt; Maximilian Schmidt; Stine Brekke Vennemo


arXiv: Neurons and Cognition | 2016

Noise dynamically suppresses chaos in neural networks

Sven Goedeke; Jannis Schuecker; Moritz Helias




arXiv: Disordered Systems and Neural Networks | 2016

Functional methods for disordered neural networks

Jannis Schuecker; Sven Goedeke; David Dahmen; Moritz Helias


Archive | 2015

NEST 2.8.0

Jochen Martin Eppler; Rajalekshmi Deepu; Claudia Bachmann; Tiziano Zito; Alexander Peyser; Jakob Jordan; Robin Pauli; Luis Riquelme; Sacha J. van Albada; Abigail Morrison; Tammo Ippen; Moritz Helias; Hesam Setareh; Marc-Oliver Gewaltig; Hannah Bos; Frank Michler; Ali Shirvani; Renato Duarte; Maximilian Schmidt; Espen Hagen; Jannis Schuecker; Wolfram Schenck; Moritz Deger; Hans E. Plesser; Susanne Kunkel; Johanna Senk

Collaboration


Dive into Jannis Schuecker's collaborations.

Top Co-Authors


David Dahmen

Allen Institute for Brain Science

Jan Hahne

University of Wuppertal

Hannah Bos

Royal Institute of Technology

Jakob Jordan

Allen Institute for Brain Science

Sacha J. van Albada

Allen Institute for Brain Science
