Lyle Muller
Centre national de la recherche scientifique
Publications
Featured research published by Lyle Muller.
Biological Cybernetics | 2011
Daniel Brüderle; Mihai A. Petrovici; Bernhard Vogginger; Matthias Ehrlich; Thomas Pfeil; Sebastian Millner; Andreas Grübl; Karsten Wendt; Eric Müller; Marc-Olivier Schwartz; Dan Husmann de Oliveira; Sebastian Jeltsch; Johannes Fieres; Moritz Schilling; Paul Müller; Oliver Breitwieser; Venelin Petkov; Lyle Muller; Andrew P. Davison; Pradeep Krishnamurthy; Jens Kremkow; Mikael Lundqvist; Eilif Muller; Johannes Partzsch; Stefan Scholze; Lukas Zühl; Christian Mayr; Alain Destexhe; Markus Diesmann; Tobias C. Potjans
In this article, we present a methodological framework that meets novel requirements emerging from upcoming types of accelerated and highly configurable neuromorphic hardware systems. We describe in detail a device with 45 million programmable and dynamic synapses that is currently under development, and we sketch the conceptual challenges that arise from taking this platform into operation. More specifically, we aim at the establishment of this neuromorphic system as a flexible and neuroscientifically valuable modeling tool that can be used by non-hardware experts. We consider various functional aspects to be crucial for this purpose, and we introduce a consistent workflow with detailed descriptions of all involved modules that implement the suggested steps: The integration of the hardware interface into the simulator-independent model description language PyNN; a fully automated translation between the PyNN domain and appropriate hardware configurations; an executable specification of the future neuromorphic system that can be seamlessly integrated into this biology-to-hardware mapping process as a test bench for all software layers and possible hardware design modifications; an evaluation scheme that deploys models from a dedicated benchmark library, compares the results generated by virtual or prototype hardware devices with reference software simulations and analyzes the differences. The integration of these components into one hardware–software workflow provides an ecosystem for ongoing preparative studies that support the hardware design process and represents the basis for the maturity of the model-to-hardware mapping software. The functionality and flexibility of the latter is proven with a variety of experimental results.
Nature Communications | 2014
Lyle Muller; Alexandre Reynaud; Frédéric Chavane; Alain Destexhe
Propagating waves occur in many excitable media and were recently found in neural systems from retina to neocortex. While propagating waves are clearly present under anaesthesia, whether they also appear during awake and conscious states remains unclear. One possibility is that these waves are systematically missed in trial-averaged data, due to variability. Here we present a method for detecting propagating waves in noisy multichannel recordings. Applying this method to single-trial voltage-sensitive dye imaging data, we show that the stimulus-evoked population response in primary visual cortex of the awake monkey propagates as a travelling wave, with consistent dynamics across trials. A network model suggests that this reliability is the hallmark of the horizontal fibre network of superficial cortical layers. Propagating waves with similar properties occur independently in secondary visual cortex, but maintain precise phase relations with the waves in primary visual cortex. These results show that, in response to a visual stimulus, propagating waves are systematically evoked in several visual areas, generating a consistent spatiotemporal frame for further neuronal interactions.
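The core detection principle, that a travelling wave produces a systematic phase gradient across recording channels, can be sketched in a few lines. This is a simplified 1-D illustration on synthetic data with made-up parameter values, not the authors' actual single-trial pipeline:

```python
import numpy as np
from scipy.signal import hilbert

rng = np.random.default_rng(0)
fs = 1000.0                      # sampling rate (Hz)
t = np.arange(0, 1.0, 1 / fs)    # 1 s of data
f = 10.0                         # oscillation frequency (Hz)
n_chan = 20
spacing_mm = 0.5                 # inter-channel distance (mm)
v_mm_per_s = 100.0               # true propagation speed

# Each channel sees the oscillation delayed by distance / speed, plus noise.
x = np.arange(n_chan) * spacing_mm
delays = x / v_mm_per_s
signals = np.array([np.sin(2 * np.pi * f * (t - d)) for d in delays])
signals += 0.2 * rng.standard_normal(signals.shape)

# Instantaneous phase on each channel via the analytic signal.
phases = np.angle(hilbert(signals, axis=1))

# At one time point, unwrap the phase across space and fit a linear gradient;
# a slope of k rad/mm implies a speed v = 2*pi*f / |k|.
phi = np.unwrap(phases[:, 500])
k = np.polyfit(x, phi, 1)[0]
v_est = 2 * np.pi * f / abs(k)   # recovered speed in mm/s
print(round(v_est))
```

The fitted speed recovers the true 100 mm/s despite the per-channel noise, which is the sense in which a phase-based, pixel-by-pixel analysis can see waves that averaging across trials would wash out.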
Proceedings of the National Academy of Sciences of the United States of America | 2016
Michel Le Van Quyen; Lyle Muller; Bartosz Telenczuk; Eric Halgren; Sydney S. Cash; Nicholas G. Hatsopoulos; Nima Dehghani; Alain Destexhe
Significance: We show in humans that, in comparison to excitatory cells, inhibitory neurons have stronger spiking activity during γ-oscillations across the wake–sleep cycle. During β-oscillations in monkey neocortex, inhibitory cells likewise fire more actively. Unlike excitatory cells, inhibitory cells show correlated firing during the fast oscillations of slow-wave sleep over several millimeters of neocortex. During both wake and sleep, β- and γ-waves systematically propagate with a dominant trajectory across the array, with similar velocities. These findings suggest that inhibition-driven β- and γ-oscillations may contribute to the reactivation of information during sleep by orchestrating highly coherent spiking activity patterns. Beta (β)- and gamma (γ)-oscillations are present in different cortical areas and are thought to be inhibition-driven, but it is not known whether these properties also apply to γ-oscillations in humans. Here, we analyze such oscillations in high-density microelectrode array recordings in human and monkey during the wake–sleep cycle. In these recordings, units were classified as excitatory or inhibitory cells. We find that γ-oscillations in human and β-oscillations in monkey are characterized by a strong involvement of inhibitory neurons, both in terms of their firing rate and their phasic firing within the oscillation cycle. The β- and γ-waves systematically propagate across the array, with similar velocities, during both wake and sleep. However, only in slow-wave sleep (SWS) are β- and γ-oscillations associated with highly coherent and functional interactions across several millimeters of the neocortex. This interaction is especially pronounced between inhibitory cells. These results suggest that inhibitory cells are dominantly involved in the genesis of β- and γ-oscillations, as well as in the organization of their large-scale coherence in the awake and sleeping brain. The high oscillation coherence found during SWS suggests that fast oscillations implement a highly coherent reactivation of wake patterns, which may support memory consolidation during SWS.
PLOS ONE | 2014
Mihai A. Petrovici; Bernhard Vogginger; Paul Müller; Oliver Breitwieser; Mikael Lundqvist; Lyle Muller; Matthias Ehrlich; Alain Destexhe; Anders Lansner; René Schüffny; Johannes Schemmel; K. Meier
Advancing the size and complexity of neural network models leads to an ever increasing demand for computational resources for their simulation. Neuromorphic devices offer a number of advantages over conventional computing architectures, such as high emulation speed or low power consumption, but this usually comes at the price of reduced configurability and precision. In this article, we investigate the consequences of several such factors that are common to neuromorphic devices, more specifically limited hardware resources, limited parameter configurability and parameter variations due to fixed-pattern noise and trial-to-trial variability. Our final aim is to provide an array of methods for coping with such inevitable distortion mechanisms. As a platform for testing our proposed strategies, we use an executable system specification (ESS) of the BrainScaleS neuromorphic system, which has been designed as a universal emulation back-end for neuroscientific modeling. We address the most essential limitations of this device in detail and study their effects on three prototypical benchmark network models within a well-defined, systematic workflow. For each network model, we start by defining quantifiable functionality measures by which we then assess the effects of typical hardware-specific distortion mechanisms, both in idealized software simulations and on the ESS. For those effects that cause unacceptable deviations from the original network dynamics, we suggest generic compensation mechanisms and demonstrate their effectiveness. Both the suggested workflow and the investigated compensation mechanisms are largely back-end independent and do not require additional hardware configurability beyond the one required to emulate the benchmark networks in the first place. We hereby provide a generic methodological environment for configurable neuromorphic devices that are targeted at emulating large-scale, functional neural networks.
eLife | 2016
Lyle Muller; Giovanni Piantoni; Dominik Koller; Sydney S. Cash; Eric Halgren; Terrence J. Sejnowski
During sleep, the thalamus generates a characteristic pattern of transient, 11-15 Hz sleep spindle oscillations, which synchronize the cortex through large-scale thalamocortical loops. Spindles have been increasingly demonstrated to be critical for sleep-dependent consolidation of memory, but the specific neural mechanism for this process remains unclear. We show here that cortical spindles are spatiotemporally organized into circular wave-like patterns, organizing neuronal activity over tens of milliseconds, within the timescale for storing memories in large-scale networks across the cortex via spike-time dependent plasticity. These circular patterns repeat over hours of sleep with millisecond temporal precision, allowing reinforcement of the activity patterns through hundreds of reverberations. These results provide a novel mechanistic account for how global sleep oscillations and synaptic plasticity could strengthen networks distributed across the cortex to store coherent and integrated memories. DOI: http://dx.doi.org/10.7554/eLife.17267.001
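The circular organization can be illustrated with a toy phase-singularity test: a rotating wave winds the instantaneous phase by ±2π around its centre, whereas a plane wave winds by zero. This synthetic sketch is our illustration of the general principle, not the detection method used in the paper:

```python
import numpy as np
from scipy.signal import hilbert

fs, f = 1000.0, 12.0                    # sampling rate; spindle-band frequency
t = np.arange(0, 0.5, 1 / fs)
n = 9                                   # 9 x 9 grid of channels
yy, xx = np.mgrid[0:n, 0:n] - (n - 1) / 2.0

def winding_number(signals):
    """Net phase winding along a closed square loop around the grid centre."""
    # Instantaneous phase at the middle time sample, via the analytic signal.
    phase = np.angle(hilbert(signals, axis=-1))[..., len(t) // 2]
    c = (n - 1) // 2
    loop = ([(c - 2 + i, c - 2) for i in range(4)] +
            [(c + 2, c - 2 + i) for i in range(4)] +
            [(c + 2 - i, c + 2) for i in range(4)] +
            [(c - 2, c + 2 - i) for i in range(4)])
    vals = np.array([phase[i, j] for i, j in loop])
    # Wrapped phase differences between successive loop points sum to 2*pi*w.
    diffs = np.angle(np.exp(1j * np.diff(np.r_[vals, vals[0]])))
    return int(round(diffs.sum() / (2 * np.pi)))

# A rotating wave (phase circulates around the centre) vs. a plane wave.
rotating = np.cos(2 * np.pi * f * t - np.arctan2(yy, xx)[..., None])
plane = np.cos(2 * np.pi * f * t - 0.5 * xx[..., None])
print(winding_number(rotating), winding_number(plane))
```

A nonzero winding number flags a rotational pattern; repeating this test across time would be one (hypothetical) way to track how long such circular patterns persist.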
Nature Reviews Neuroscience | 2018
Lyle Muller; Frédéric Chavane; John H. Reynolds; Terrence J. Sejnowski
Multichannel recording technologies have revealed travelling waves of neural activity in multiple sensory, motor and cognitive systems. These waves can be spontaneously generated by recurrent circuits or evoked by external stimuli. They travel along brain networks at multiple scales, transiently modulating spiking and excitability as they pass. Here, we review recent experimental findings that have found evidence for travelling waves at single-area (mesoscopic) and whole-brain (macroscopic) scales. We place these findings in the context of the current theoretical understanding of wave generation and propagation in recurrent networks. During the large low-frequency rhythms of sleep or the relatively desynchronized state of the awake cortex, travelling waves may serve a variety of functions, from long-term memory consolidation to processing of dynamic visual stimuli. We explore new avenues for experimental and computational understanding of the role of spatiotemporal activity patterns in the cortex.
New Journal of Physics | 2014
Lyle Muller; Alain Destexhe; Michelle Rudolph-Lilith
Since its introduction, the ‘small-world’ effect has played a central role in network science, particularly in the analysis of the complex networks of the nervous system. From the cellular level to that of interconnected cortical regions, many analyses have revealed small-world properties in the networks of the brain. In this work, we revisit the quantification of small-worldness in neural graphs. We find that neural graphs fall into the ‘borderline’ regime of small-worldness, residing close to that of a random graph, especially when the degree sequence of the network is taken into account. We then apply recently introduced analytical expressions for clustering and distance measures to study this borderline small-worldness regime. We derive theoretical bounds for the minimal and maximal small-worldness index for a given graph, and by semi-analytical means, study the small-worldness index itself. With this approach, we find that graphs with small-worldness equivalent to that observed in experimental data are dominated by their random component. These results provide the first thorough analysis suggesting that neural graphs may reside far away from the maximally small-world regime.
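For readers unfamiliar with the index, a minimal numerical sketch of the standard small-worldness measure σ = (C/C_rand)/(L/L_rand) is shown below. This uses the common analytic random-graph estimates C_rand ≈ k/n and L_rand ≈ ln(n)/ln(k), not the paper's semi-analytical machinery:

```python
import math
import networkx as nx

def small_worldness(G):
    """sigma = (C / C_rand) / (L / L_rand) with analytic random-graph references."""
    n = G.number_of_nodes()
    k = 2 * G.number_of_edges() / n          # mean degree
    C = nx.average_clustering(G)
    L = nx.average_shortest_path_length(G)
    C_rand = k / n                           # clustering of a comparable random graph
    L_rand = math.log(n) / math.log(k)       # path length of a comparable random graph
    return (C / C_rand) / (L / L_rand)

# A ring lattice with light rewiring (Watts-Strogatz) is strongly small-world,
# while heavy rewiring pushes the graph toward the random regime, sigma -> 1.
small = nx.connected_watts_strogatz_graph(400, 10, 0.1, seed=1)
random_like = nx.connected_watts_strogatz_graph(400, 10, 0.9, seed=1)
print(small_worldness(small) > small_worldness(random_like) > 0)
```

A graph "dominated by its random component", in the abstract's sense, sits near the low end of this index even when σ is formally greater than one.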
BMC Neuroscience | 2013
Lyle Muller; Alexandre Reynaud; Frédéric Chavane; Alain Destexhe
Propagating waves of activity are seen in many types of excitable media, and in recent years, were found in the neocortex of anesthetized animals [1,2]. To date, however, it still remains unclear whether propagating waves appear during awake and conscious states [3,4]. One possibility is that these waves are systematically missed in trial-averaged data, because of their well-known variability from trial to trial [1]. To test this hypothesis, we developed a phase-based analysis technique, which works on a pixel-by-pixel basis in the unsmoothed data, and provides a quantitative means to distinguish between spatiotemporal forms of the population response. We then applied this to single-trial voltage-sensitive dye imaging (VSDI) data, denoised specifically for this purpose [5], and in this work, we show definitively that spontaneous and stimulus-evoked propagating waves occur in the visual cortex of the awake monkey. Furthermore, when looking at the multiple visual areas within the imaging field in these experiments, we observe correlated propagations across primary and secondary visual cortex, illustrating a strong spatiotemporal organization of these waves across cortical areas. These results demonstrate that propagating waves are systematically and reliably evoked by sensory stimulation, and suggest that they have the potential to affect large-scale information processing by generating a consistent spatiotemporal frame for neuronal interactions. The horizontal fiber network mediating these activity patterns has previously been implicated in active computational roles, as ascending input at a given point in cortex is known to affect the processing of future stimuli across the cortical plane [6,7]. In this work, we implicate these propagations in a specific functional role.
These internally generated propagating waves provide a specific structure for the spatiotemporal activity in visual cortex, uniquely encoding both stimulus identity and time of presentation in the amplitude and phase of the population response [8]. With these results in mind, we go on to discuss the computational paradigms towards which our observations point, elucidating these with numerical models and further analysis.
Discrete Mathematics | 2014
Michelle Rudolph-Lilith; Lyle Muller
One of the simplest polynomial recursions exhibiting chaotic behavior is the logistic map x_{n+1} = a·x_n(1 − x_n), with x_n, a ∈ ℚ, x_n ∈ [0, 1] for all n ∈ ℕ and a ∈ (0, 4], the discrete-time model of the differential growth introduced by Verhulst almost two centuries ago (Verhulst, 1838) [12]. Despite the importance of this discrete map for the field of nonlinear science, explicit solutions are known only for the special cases a = 2 and a = 4. In this article, we propose a representation of the Verhulst logistic map in terms of a finite power series in the map's growth parameter a and initial value x_0, whose coefficients are given by the solution of a system of linear equations. Although the proposed representation cannot be viewed as a closed-form solution of the logistic map, it may help to reveal the sensitivity of the map to its initial value and, thus, could provide insights into the mathematical description of chaotic dynamics.
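The two known explicit solutions mentioned in the abstract can be checked numerically. This sketch is our addition; the substitutions used are the standard ones (y = 1 − 2x for a = 2, and x = sin²θ for a = 4):

```python
import math

def iterate(a, x0, n):
    """Direct iteration of the logistic map x_{n+1} = a * x_n * (1 - x_n)."""
    x = x0
    for _ in range(n):
        x = a * x * (1 - x)
    return x

def closed_form_a2(x0, n):
    # a = 2: substituting y = 1 - 2x gives y_{n+1} = y_n**2, so
    # x_n = (1 - (1 - 2*x0)**(2**n)) / 2.
    return (1 - (1 - 2 * x0) ** (2 ** n)) / 2

def closed_form_a4(x0, n):
    # a = 4: x_n = sin(2**n * theta)**2 with theta = arcsin(sqrt(x0)).
    theta = math.asin(math.sqrt(x0))
    return math.sin(2 ** n * theta) ** 2

x0 = 0.3
print(abs(iterate(2.0, x0, 10) - closed_form_a2(x0, 10)) < 1e-12)
print(abs(iterate(4.0, x0, 8) - closed_form_a4(x0, 8)) < 1e-6)
```

The a = 4 case makes the sensitivity to initial conditions explicit: the angle θ is doubled at each step, so any uncertainty in x_0 grows exponentially, which is why no comparable closed form is known for generic a.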
Neurophotonics | 2017
Sandrine Chemla; Lyle Muller; Alexandre Reynaud; Sylvain Takerkart; Alain Destexhe; Frédéric Chavane
Abstract. Voltage-sensitive dye imaging (VSDI) is a key neurophysiological recording tool because it reaches brain scales that remain inaccessible to other techniques. The development of this technique from in vitro preparations to the behaving nonhuman primate has only been made possible thanks to the long-lasting, visionary work of Amiram Grinvald. This work has opened new scientific perspectives, to the great benefit of the neuroscience community. However, this unprecedented technique remains largely under-utilized, and many possibilities remain for VSDI to reveal new functional operations. One reason why this tool has not been used more extensively is the inherent complexity of the signal. For instance, the signal mainly reflects the subthreshold neuronal population response and is not linked to spiking activity in a straightforward manner. Second, VSDI gives access to intracortical recurrent dynamics that are intrinsically complex and therefore nontrivial to process. Computational approaches are thus necessary to promote our understanding and optimal use of this powerful technique. Here, we review such approaches, from computational models that dissect the mechanisms and origin of the recorded signal, to advanced signal processing methods that unravel new neuronal interactions at the mesoscopic scale. Only a stronger development of interdisciplinary approaches can bridge micro- to macroscales.