Marc-Oliver Gewaltig
École Polytechnique Fédérale de Lausanne
Publications
Featured research published by Marc-Oliver Gewaltig.
Nature | 1999
Markus Diesmann; Marc-Oliver Gewaltig; Ad Aertsen
The classical view of neural coding has emphasized the importance of information carried by the rate at which neurons discharge action potentials. More recent proposals that information may be carried by precise spike timing have been challenged by the argument that cortical neurons operate in a noisy fashion, presumably reflecting fluctuations in synaptic input, and are thus incapable of transmitting signals with millisecond fidelity. Here we show that precisely synchronized action potentials can propagate within a model of cortical network activity that recapitulates many of the features of biological systems. An attractor, yielding a stable spiking precision in the (sub)millisecond range, governs the dynamics of synchronization. Our results indicate that a combinatorial neural code, based on rapid associations of groups of neurons co-ordinating their activity at the single spike level, is possible within a cortical-like network.
PLOS Computational Biology | 2009
Eilen Nordlie; Marc-Oliver Gewaltig; Hans E. Plesser
Progress in science depends on the effective exchange of ideas among scientists. New ideas can be assessed and criticized in a meaningful manner only if they are formulated precisely. This applies to simulation studies as well as to experiments and theories. But after more than 50 years of neuronal network simulations, we still lack a clear and common understanding of the role of computational models in neuroscience as well as established practices for describing network models in publications. This hinders the critical evaluation of network models as well as their re-use. We analyze here 14 research papers proposing neuronal network models of different complexity and find widely varying approaches to model descriptions, with regard to both the means of description and the ordering and placement of material. We further observe great variation in the graphical representation of networks and the notation used in equations. Based on our observations, we propose a good model description practice, composed of guidelines for the organization of publications, a checklist for model descriptions, templates for tables presenting model structure, and guidelines for diagrams of networks. The main purpose of this good practice is to trigger a debate about the communication of neuronal network models in a manner comprehensible to humans, as opposed to machine-readable model description languages. We believe that the good model description practice proposed here, together with a number of other recent initiatives on data-, model-, and software-sharing, may lead to a deeper and more fruitful exchange of ideas among computational neuroscientists in years to come. We further hope that work on standardized ways of describing—and thinking about—complex neuronal networks will lead the scientific community to a clearer understanding of high-level concepts in network dynamics, and will thus lead to deeper insights into the function of the brain.
Frontiers in Neuroinformatics | 2008
Jochen Martin Eppler; Moritz Helias; Eilif Muller; Markus Diesmann; Marc-Oliver Gewaltig
The neural simulation tool NEST (http://www.nest-initiative.org) is a simulator for heterogeneous networks of point neurons or neurons with a small number of compartments. It aims at simulations of large neural systems with more than 10^4 neurons and 10^7 to 10^9 synapses. NEST is implemented in C++ and can be used on a wide range of architectures, from single-core laptops and multi-core desktop computers to supercomputers with thousands of processor cores. Python (http://www.python.org) is a modern programming language that has recently received considerable attention in Computational Neuroscience. Python is easy to learn and has many extension modules for scientific computing (e.g. http://www.scipy.org). In this contribution we describe PyNEST, the new user interface to NEST. PyNEST combines NEST's efficient simulation kernel with the simplicity and flexibility of Python. Compared to NEST's native simulation language SLI, PyNEST makes it easier to set up simulations, generate stimuli, and analyze simulation results. We describe how PyNEST connects NEST and Python and how it is implemented. With a number of examples, we illustrate how it is used.
Neural Networks | 1999
Edgar Körner; Marc-Oliver Gewaltig; Ursula Körner; Andreas Richter; Tobias Rodemann
We propose that the specific architecture of the neocortex reflects the organization principles of neocortical computation. In this paper, we place the anatomically defined concept of columns into a functional context. It is provided by a large-scale computational hypothesis on visual recognition, which includes both rapid parallel forward recognition, independent of any feedback prediction, and a feedback-controlled refinement system. Short epochs of periodic clocking define a global reference time and introduce a discrete time for cortical processing which enables the combination of parallel categorization and sequential refinement. The presented model differs significantly from conventional neural network architectures and suggests a novel interpretation of the role of gamma oscillations and cognitive binding.
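The two-stage scheme (one-shot forward categorization, then clocked feedback refinement) can be caricatured in a few lines. The toy below is not the authors' model; the prototype matching, the softmax sharpening, and all numbers are invented purely to show the control flow of "fast guess, then refine over discrete epochs".

```python
import numpy as np

# Toy caricature of fast feedforward categorization followed by clocked
# feedback refinement.  All structure and numbers are invented.

rng = np.random.default_rng(0)
prototypes = rng.standard_normal((5, 16))
prototypes /= np.linalg.norm(prototypes, axis=1, keepdims=True)
stimulus = prototypes[2] + 0.1 * rng.standard_normal(16)  # noisy view of class 2

# Fast forward pass: one-shot similarity gives an immediate coarse guess,
# independent of any feedback.
scores = prototypes @ stimulus
guess = int(np.argmax(scores))

# Clocked refinement: in each discrete epoch the current hypothesis is
# fed back as a top-down reconstruction, and only the unexplained
# residual updates the match scores.
for epoch in range(5):
    p = np.exp(scores - scores.max())
    p /= p.sum()                              # soft hypothesis over classes
    prediction = p @ prototypes               # top-down reconstruction
    residual = stimulus - prediction          # input not yet explained
    scores = scores + prototypes @ residual   # sharpen the hypothesis

refined = int(np.argmax(scores))
```

The point of the caricature is only the timing structure: a usable answer exists after the forward pass, and the clocked loop sequentially refines it.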
european conference on parallel processing | 2007
Hans E. Plesser; Jochen Martin Eppler; Abigail Morrison; Markus Diesmann; Marc-Oliver Gewaltig
To understand the principles of information processing in the brain, we depend on models with more than 10^5 neurons and 10^9 connections. These networks can be described as graphs of threshold elements that exchange point events over their connections. From the computer science perspective, the key challenges are to represent the connections succinctly; to transmit events and update neuron states efficiently; and to provide a comfortable user interface. We present here the neural simulation tool NEST, a neuronal network simulator which addresses all these requirements. To simulate very large networks with acceptable time and memory requirements, NEST uses a hybrid strategy, combining distributed simulation across cluster nodes (MPI) with thread-based simulation on each computer. Benchmark simulations of a computationally hard biological neuronal network model demonstrate that hybrid parallelization yields significant performance benefits on clusters of multi-core computers, compared to purely MPI-based distributed simulation.
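The hybrid MPI-plus-threads strategy rests on distributing neurons over "virtual processes" (MPI ranks times threads). NEST documents a round-robin distribution over virtual processes; the sketch below illustrates that scheme in plain Python and is a simplification, not NEST's implementation.

```python
# Round-robin distribution of neurons over virtual processes (VPs),
# where a VP is one thread on one MPI rank.  Simplified sketch of the
# hybrid-parallel ownership scheme, not NEST's actual code.

def owner_of(gid, n_ranks, n_threads):
    """Map a global neuron id to (rank, thread) by round robin."""
    n_vp = n_ranks * n_threads
    vp = gid % n_vp              # virtual process owning this neuron
    rank = vp % n_ranks          # VPs are interleaved across ranks
    thread = vp // n_ranks
    return rank, thread

def local_neurons(gids, rank, n_ranks, n_threads):
    """Neuron ids simulated on a given MPI rank (over all its threads)."""
    return [g for g in gids if owner_of(g, n_ranks, n_threads)[0] == rank]
```

The attraction of this scheme is that ownership is computable from the id alone: no rank needs a global table, which keeps the connection representation succinct as networks grow.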
Neural Networks | 2001
Marc-Oliver Gewaltig; Markus Diesmann; Ad Aertsen
The synfire hypothesis states that under appropriate conditions volleys of synchronized spikes (pulse packets) can propagate through the cortical network by traveling along chains of groups of cortical neurons. Here, we present results from network simulations, taking full account of the variability in pulse packet realizations. We repeatedly stimulated a synfire chain of model neurons and estimated activity (a) and temporal jitter (sigma) of the spike response for each neuron group in the chain in many trials. The survival probability of the activity was assessed for each point in (a, sigma)-space. The results confirm and extend our earlier predictions based on single neuron properties and a deterministic state-space analysis [Diesmann, M., Gewaltig, M.-O., & Aertsen, A. (1999). Stable propagation of synchronous spiking in cortical neural networks. Nature, 402, 529-533].
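A heavily simplified version of such a simulation can be written in a few lines: each neuron in a group fires one spike if enough spikes from the previous group arrive within its integration window, and output spike times inherit the packet timing plus membrane noise. All parameters below are illustrative placeholders, not the values used in the paper, and the neuron model is a caricature rather than a full integrate-and-fire simulation.

```python
import numpy as np

# Caricature of pulse-packet propagation along a synfire chain, tracking
# activity a (number of spikes) and temporal jitter sigma (ms) per group.
# All parameters are illustrative placeholders.

rng = np.random.default_rng(42)

GROUP_SIZE = 100     # neurons per group
N_GROUPS = 10        # length of the chain
THRESHOLD = 40       # input spikes required within the window to fire
WINDOW = 3.0         # integration window (ms)
NOISE_SD = 0.3       # membrane noise on each output spike time (ms)
DELAY = 5.0          # synaptic delay between groups (ms)

def propagate(spike_times):
    """Advance the packet by one group; return the output spike times."""
    out = []
    center = np.median(spike_times)   # packet arrival time
    # input spikes falling inside the integration window
    n_in = np.sum(np.abs(spike_times - center) <= WINDOW / 2)
    for _ in range(GROUP_SIZE):
        if n_in >= THRESHOLD:
            out.append(center + DELAY + NOISE_SD * rng.standard_normal())
    return np.array(out)

# stimulate group 0 with a packet: a = 80 spikes, sigma = 1 ms
packet = rng.normal(0.0, 1.0, size=80)
for k in range(1, N_GROUPS):
    packet = propagate(packet)
    if packet.size == 0:             # packet died out
        break

a, sigma = packet.size, packet.std()
```

With these placeholder settings the initial packet survives and the jitter contracts toward the noise floor, the qualitative attractor behavior the abstract describes; lowering the initial activity or broadening the initial jitter makes the packet die out instead.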
Neuroinformatics | 2007
Robert C. Cannon; Marc-Oliver Gewaltig; Padraig Gleeson; Upinder S. Bhalla; Hugo Cornelis; Michael L. Hines; Fredrick W. Howell; Eilif Muller; Joel R. Stiles; Stefan Wils; Erik De Schutter
Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to the duplication of effort and hinders model reuse. Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible including the development of portable model description standards, the adoption of common simulation languages or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. However more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “Neuro-IT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19–20 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance.
Frontiers in Computational Neuroscience | 2008
Moritz Helias; Stefan Rotter; Marc-Oliver Gewaltig; Markus Diesmann
Hebbian learning in cortical networks during development and adulthood relies on the presence of a mechanism to detect correlation between the presynaptic and the postsynaptic spiking activity. Recently, the calcium concentration in spines was experimentally shown to be a correlation sensitive signal with the necessary properties: it is confined to the spine volume, it depends on the relative timing of pre- and postsynaptic action potentials, and it is independent of the spine's location along the dendrite. NMDA receptors are a candidate mediator for the correlation dependent calcium signal. Here, we present a quantitative model of correlation detection in synapses based on the calcium influx through NMDA receptors under realistic conditions of irregular pre- and postsynaptic spiking activity with pairwise correlation. Our analytical framework captures the interaction of the learning rule and the correlation dynamics of the neurons. We find that a simple thresholding mechanism can act as a sensitive and reliable correlation detector at physiological firing rates. Furthermore, the mechanism is sensitive to correlation among afferent synapses by cooperation and competition. In our model this mechanism controls synapse formation and elimination. We explain how synapse elimination leads to firing rate homeostasis and show that the connectivity structure is shaped by the correlations between neighboring inputs.
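The thresholding idea can be caricatured in a few lines: a spine calcium variable jumps when pre- and postsynaptic spikes nearly coincide, decays otherwise, and a threshold on the trace reports correlation. Every constant below (coincidence window, time constant, jump size, threshold) is an invented placeholder, not the paper's calibrated model.

```python
import numpy as np

# Toy correlation detector: NMDA-like calcium transients require near
# coincidence of pre- and postsynaptic spikes; a simple threshold on the
# accumulated calcium trace then signals correlated input.
# All constants are illustrative placeholders.

def threshold_crossings(pre, post, coincidence_ms=5.0,
                        tau_ms=50.0, jump=1.5, theta=2.5, dt=1.0):
    """Count suprathreshold calcium events for spike trains given in ms."""
    t_max = max(pre.max(), post.max()) + tau_ms
    pre = set(np.round(pre / dt))
    post = set(np.round(post / dt))
    ca, crossings = 0.0, 0
    w = coincidence_ms / dt
    for step in range(int(t_max / dt)):
        ca *= np.exp(-dt / tau_ms)               # passive decay
        # calcium jumps only when a pre spike has a post partner nearby
        if step in pre and any(step - w <= p <= step + w for p in post):
            above_before = ca >= theta
            ca += jump
            if ca >= theta and not above_before:
                crossings += 1
    return crossings

pre = np.arange(0.0, 1000.0, 40.0)   # regular presynaptic train, 25 Hz
post_corr = pre + 3.0                # correlated: post echoes pre at 3 ms lag
post_unc = pre + 20.0                # uncorrelated: lag outside the window
n_corr = threshold_crossings(pre, post_corr)
n_unc = threshold_crossings(pre, post_unc)
```

Correlated trains repeatedly drive the calcium trace across the threshold, while the shifted trains never do, which is the behavior a thresholding correlation detector needs.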
Neurocomputing | 2001
Markus Diesmann; Marc-Oliver Gewaltig; Stefan Rotter; Ad Aertsen
Recent proposals that information in cortical neurons may be encoded by precise spike timing have been challenged by the assumption that neurons in vivo can only operate in a noisy fashion, due to large fluctuations in synaptic input activity. Here, we show that despite this noisy background, volleys of precisely synchronized action potentials can stably propagate within a model network of basic integrate-and-fire neurons. The construction of an iterative mapping for the transmission of synchronized spikes between groups of neurons allows for a two-dimensional state space analysis. An attractor, yielding stable spiking precision in the (sub-)millisecond range, governs the synchronization dynamics.
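The flavor of such a state-space analysis can be conveyed with a toy two-dimensional iteration in activity a and jitter sigma. The update rule below is an invented stand-in chosen only to reproduce the qualitative picture (an attractor at high activity and low jitter, with weak or broad packets dying out); it is not the mapping derived in the paper.

```python
import math

# Toy (a, sigma) state-space iteration for a pulse packet: a is the
# number of spikes per group (0..100), sigma its temporal jitter in ms.
# The map is an invented stand-in with qualitatively similar behavior.

def step(a, sigma):
    drive = a / (1.0 + sigma)                  # jitter dilutes the drive
    a_next = 100.0 / (1.0 + math.exp(-(drive - 30.0) / 5.0))  # sigmoid gain
    sigma_next = 0.1 + 0.4 * sigma             # jitter contracts toward ~0.17 ms
    return a_next, sigma_next

def iterate(a, sigma, n=20):
    """Follow a packet through n groups of the chain."""
    for _ in range(n):
        a, sigma = step(a, sigma)
    return a, sigma

strong = iterate(80.0, 1.0)   # strong, precise packet: pulled into the attractor
weak = iterate(30.0, 3.0)     # weak, broad packet: activity dies out
```

Iterating from different initial points traces out the two basins: one flowing to the high-activity, sub-millisecond fixed point, the other to extinction, which is the structure the state-space analysis makes visible.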
Frontiers in Computational Neuroscience | 2012
Andreas Knoblauch; Florian Hauser; Marc-Oliver Gewaltig; Edgar Körner; Günther Palm
Spike synchronization is thought to have a constructive role for feature integration, attention, associative learning, and the formation of bidirectionally connected Hebbian cell assemblies. By contrast, theoretical studies on spike-timing-dependent plasticity (STDP) report an inherently decoupling influence of spike synchronization on synaptic connections of coactivated neurons. For example, bidirectional synaptic connections as found in cortical areas could be reproduced only by assuming realistic models of STDP and rate coding. We resolve this conflict by theoretical analysis and simulation of various simple and realistic STDP models that provide a more complete characterization of the conditions under which STDP leads to either coupling or decoupling of neurons firing in synchrony. In particular, we show that STDP consistently couples synchronized neurons if key model parameters are matched to physiological data: First, synaptic potentiation must be significantly stronger than synaptic depression for small (positive or negative) time lags between presynaptic and postsynaptic spikes. Second, spike synchronization must be sufficiently imprecise, for example, within a time window of 5-10 ms instead of 1 ms. Third, axonal propagation delays should not be much larger than dendritic delays. Under these assumptions synchronized neurons will be strongly coupled leading to a dominance of bidirectional synaptic connections even for simple STDP models and low mean firing rates at the level of spontaneous activity.
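The first condition can be checked directly with a textbook pair-based STDP window: when potentiation dominates near zero lag, jittered synchrony yields a net positive weight drift, and when depression dominates the same synchrony decouples the pair. The amplitudes and time constants below are illustrative choices, not fitted values from the paper.

```python
import numpy as np

# Pair-based STDP: potentiation for post-after-pre lags (dt >= 0),
# depression for pre-after-post lags (dt < 0).  We estimate the mean
# weight drift between two synchronously firing neurons whose pairwise
# lag is Gaussian-jittered.  Parameters are illustrative only.

TAU = 20.0  # window time constant (ms), same for both branches

def mean_drift(a_plus, a_minus, jitter_ms, n=100_000, seed=1):
    """Expected weight change per spike pair under Gaussian lag jitter."""
    rng = np.random.default_rng(seed)
    dt = rng.normal(0.0, jitter_ms, size=n)
    dw = np.where(dt >= 0,
                  a_plus * np.exp(-dt / TAU),    # potentiation branch
                  -a_minus * np.exp(dt / TAU))   # depression branch
    return float(dw.mean())

# potentiation dominant near zero lag -> synchrony couples the pair
coupling = mean_drift(a_plus=1.2, a_minus=1.0, jitter_ms=7.0)
# depression dominant -> the same synchrony decouples it
decoupling = mean_drift(a_plus=0.9, a_minus=1.1, jitter_ms=7.0)
```

Because the jitter distribution is symmetric, the sign of the drift is set entirely by the asymmetry of the window near zero lag, which is exactly the first condition the abstract states.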