Network


Latest external collaborations at the country level. Dive into the details by clicking on the dots.

Hotspot


Dive into the research topics where Guillaume Lajoie is active.

Publication


Featured research published by Guillaume Lajoie.


Physical Review E | 2013

Chaos and reliability in balanced spiking networks with temporal drive.

Guillaume Lajoie; Kevin K. Lin; Eric Shea-Brown

Biological information processing is often carried out by complex networks of interconnected dynamical units. A basic question about such networks is that of reliability: If the same signal is presented many times with the network in different initial states, will the system entrain to the signal in a repeatable way? Reliability is of particular interest in neuroscience, where large, complex networks of excitatory and inhibitory cells are ubiquitous. These networks are known to autonomously produce strongly chaotic dynamics, an obvious threat to reliability. Here, we show that such chaos persists in the presence of weak and strong stimuli, but that even in the presence of chaos, intermittent periods of highly reliable spiking often coexist with unreliable activity. We elucidate the local dynamical mechanisms involved in this intermittent reliability, and investigate the relationship between this phenomenon and certain time-dependent attractors arising from the dynamics. A conclusion is that chaotic dynamics do not have to be an obstacle to precise spike responses, a fact with implications for signal coding in large networks.
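The reliability question above can be illustrated in miniature with a toy driven system. This is an illustrative stand-in, not the paper's spiking network: a single phase variable on the circle receives a frozen common input, and reliability appears as two trials started from different initial states collapsing onto the same input-driven trajectory. All parameter values here are assumptions chosen for the demonstration.

```python
import math
import random

def driven_step(theta, drive, omega=0.37, k=0.15):
    """One step of a noise-driven phase variable on the circle [0, 1).

    `drive` is the common input shared across trials; omega and k are
    illustrative constants, not taken from the paper.
    """
    return (theta + omega + k * math.sin(2 * math.pi * theta) + drive) % 1.0

random.seed(1)
drive = [0.3 * random.gauss(0, 1) for _ in range(500)]  # frozen common input

# Two "trials": identical input sequence, different initial states.
a, b = 0.11, 0.73
gap = []
for d in drive:
    a, b = driven_step(a, d), driven_step(b, d)
    gap.append(min(abs(a - b), 1.0 - abs(a - b)))  # circular distance

# Reliable entrainment shows up as gap[-1] << gap[0]: both trials have
# converged onto the same stimulus-locked trajectory.
```

With these (contracting) parameters the trials converge; in the chaotic regime studied in the paper, the analogous distance would instead stay large, except during the intermittent reliable epochs the abstract describes.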


Siam Journal on Applied Dynamical Systems | 2011

Shared Inputs, Entrainment, and Desynchrony in Elliptic Bursters: From Slow Passage to Discontinuous Circle Maps

Guillaume Lajoie; Eric Shea-Brown

What input signals will lead to synchrony vs. desynchrony in a group of biological oscillators? This question connects with both classical dynamical systems analyses of entrainment and phase locking and with emerging studies of stimulation patterns for controlling neural network activity. Here, we focus on the response of a population of uncoupled, elliptically bursting neurons to a common pulsatile input. We extend a phase reduction from the literature to capture inputs of varied strength, leading to a circle map with discontinuities of various orders. In a combined analytical and numerical approach, we apply our results to both a normal form model for elliptic bursting and to a biophysically based neuron model from the basal ganglia. We find that, depending on the period and amplitude of inputs, the response can either appear chaotic (with provably positive Lyapunov exponent for the associated circle maps), or periodic with a broad range of phase-locked periods. Throughout, we discuss the critical underlying mechanisms, including slow-passage effects through Hopf bifurcation, the role and origin of discontinuities, and the impact of noise.
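A Lyapunov exponent for a circle map can be estimated numerically as the orbit average of log |f'(θ)|. The sketch below uses the standard sine circle map as a smooth illustration of the idea; the paper's maps have discontinuities, and the parameter values here are illustrative, not taken from the paper.

```python
import math

def circle_map(theta, K, Omega):
    """Standard sine circle map on [0, 1)."""
    return (theta + Omega - (K / (2 * math.pi)) * math.sin(2 * math.pi * theta)) % 1.0

def lyapunov(K, Omega, theta0=0.1, transient=1_000, n=200_000):
    """Estimate the Lyapunov exponent as the orbit average of log |f'(theta)|.

    For this map, f'(theta) = 1 - K * cos(2 * pi * theta).
    """
    theta = theta0
    for _ in range(transient):  # discard the transient
        theta = circle_map(theta, K, Omega)
    total = 0.0
    for _ in range(n):
        total += math.log(abs(1.0 - K * math.cos(2 * math.pi * theta)))
        theta = circle_map(theta, K, Omega)
    return total / n

lam_invertible = lyapunov(K=0.9, Omega=0.5)  # K < 1: invertible, non-chaotic
lam_steep = lyapunov(K=3.0, Omega=0.5)       # K > 1: chaos becomes possible
```

A positive estimate signals sensitive dependence on initial conditions, which is how the "appears chaotic" regimes of the abstract would show up in the associated circle map.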


PLOS Computational Biology | 2016

Encoding in Balanced Networks: Revisiting Spike Patterns and Chaos in Stimulus-Driven Systems

Guillaume Lajoie; Kevin K. Lin; Jean-Philippe Thivierge; Eric Shea-Brown

Highly connected recurrent neural networks often produce chaotic dynamics, meaning their precise activity is sensitive to small perturbations. What are the consequences of chaos for how such networks encode streams of temporal stimuli? On the one hand, chaos is a strong source of randomness, suggesting that small changes in stimuli will be obscured by intrinsically generated variability. On the other hand, recent work shows that the type of chaos that occurs in spiking networks can have a surprisingly low-dimensional structure, suggesting that there may be room for fine stimulus features to be precisely resolved. Here we show that strongly chaotic networks produce patterned spikes that reliably encode time-dependent stimuli: using a decoder sensitive to spike times on timescales of tens of milliseconds, one can easily distinguish responses to very similar inputs. Moreover, recurrence serves to distribute signals throughout chaotic networks so that small groups of cells can encode substantial information about signals arriving elsewhere. A conclusion is that the presence of strong chaos in recurrent networks need not exclude precise encoding of temporal stimuli via spike patterns.


PLOS Computational Biology | 2017

Correlation-based model of artificially induced plasticity in motor cortex by a bidirectional brain-computer interface

Guillaume Lajoie; Nedialko I. Krouchev; John F. Kalaska; Adrienne L. Fairhall; Eberhard E. Fetz

Experiments show that spike-triggered stimulation performed with Bidirectional Brain-Computer-Interfaces (BBCI) can artificially strengthen connections between separate neural sites in motor cortex (MC). When spikes from a neuron recorded at one MC site trigger stimuli at a second target site after a fixed delay, the connections between sites eventually strengthen. It was also found that effective spike-stimulus delays are consistent with experimentally derived spike-timing-dependent plasticity (STDP) rules, suggesting that STDP is the key mechanism driving these changes. However, the impact of STDP at the level of circuits, and the mechanisms governing its modification with neural implants, remain poorly understood. The present work describes a recurrent neural network model with probabilistic spiking mechanisms and plastic synapses capable of capturing both neural and synaptic activity statistics relevant to BBCI conditioning protocols. Our model successfully reproduces key experimental results, both established and new, and offers mechanistic insights into spike-triggered conditioning. Using analytical calculations and numerical simulations, we derive optimal operational regimes for BBCIs, and formulate predictions concerning the efficacy of spike-triggered conditioning in different regimes of cortical activity.
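The STDP rules invoked above are commonly written as a double-exponential window in the pre-post spike-time difference. The sketch below uses that textbook form with illustrative parameter values (not the paper's fitted ones) to show why the spike-stimulus delay matters: a spike-triggered stimulus evokes a post-synaptic spike at a fixed lag, and the sign and size of the window at that lag predict whether the loop strengthens the connection.

```python
import math

def stdp_dw(dt, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """Weight change for a spike-time difference dt = t_post - t_pre (ms).

    Positive dt (pre before post) potentiates; negative dt depresses.
    Amplitudes and time constants are illustrative assumptions.
    """
    if dt > 0:
        return a_plus * math.exp(-dt / tau_plus)
    elif dt < 0:
        return -a_minus * math.exp(dt / tau_minus)
    return 0.0

# A spike-triggered protocol delivers the stimulus (and hence a post spike)
# at a fixed delay after each recorded pre spike; evaluating the window at
# candidate delays shows potentiation fading as the delay grows.
delays = [5.0, 10.0, 20.0, 50.0]
changes = {d: stdp_dw(d) for d in delays}
```

Short positive delays sit high on the potentiation branch, which is consistent with the abstract's observation that effective spike-stimulus delays match experimentally derived STDP rules.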


Journal of Computational Neuroscience | 2016

Driving reservoir models with oscillations: a solution to the extreme structural sensitivity of chaotic networks

Philippe Vincent-Lamarre; Guillaume Lajoie; Jean-Philippe Thivierge

A large body of experimental and theoretical work on neural coding suggests that the information stored in brain circuits is represented by time-varying patterns of neural activity. Reservoir computing, where the activity of a recurrently connected pool of neurons is read by one or more units that provide an output response, successfully exploits this type of neural activity. However, the question of system robustness to small structural perturbations, such as failing neurons and synapses, has been largely overlooked. This contrasts with well-studied dynamical perturbations that lead to divergent network activity in the presence of chaos, as is the case for many reservoir networks. Here, we distinguish between two types of structural network perturbations, namely local (e.g., individual synaptic or neuronal failure) and global (e.g., network-wide fluctuations). Surprisingly, we show that while global perturbations have a limited impact on the ability of reservoir models to perform various tasks, local perturbations can produce drastic effects. To address this limitation, we introduce a new architecture where the reservoir is driven by a layer of oscillators that generate stable and repeatable trajectories. This model outperforms previous implementations while being resistant to relatively large local and global perturbations. This finding has implications for the design of reservoir models that capture the capacity of brain circuits to perform cognitively and behaviorally relevant tasks while remaining robust to various forms of perturbations. Further, our work proposes a novel role for neuronal oscillations found in cortical circuits, where they may serve as a collection of inputs from which a network can robustly generate complex dynamics and implement rich computations.


Neural Computation | 2016

Dynamic signal tracking in a simple V1 spiking model

Guillaume Lajoie; Lai-Sang Young

This work is part of an effort to understand the neural basis for our visual system’s ability, or failure, to accurately track moving visual signals. We consider here a ring model of spiking neurons, intended as a simplified computational model of a single hypercolumn of the primary visual cortex of primates. Signals that consist of edges with time-varying orientations localized in space are considered. Our model is calibrated to produce spontaneous and driven firing rates roughly consistent with experiments, and our two main findings, for which we offer dynamical explanation on the level of neuronal interactions, are the following. First, we have documented consistent transient overshoots in signal perception following signal switches due to emergent interactions of the E- and I-populations. Second, for continuously moving signals, we have found that accuracy is considerably lower at reversals of orientation than when continuing in the same direction (as when the signal is a rotating bar). To measure performance, we use two metrics, called fidelity and reliability, to compare signals reconstructed by the system to the ones presented and assess trial-to-trial variability. We propose that the same population mechanisms responsible for orientation selectivity also impose constraints on dynamic signal tracking that manifest in perception failures consistent with psychophysical observations.


BMC Neuroscience | 2015

Extreme sensitivity of reservoir computing to small network disruptions

Philippe Vincent-Lamarre; Guillaume Lajoie; Jean-Philippe Thivierge

Recent computational models based on reservoir computing (RC) are gaining attention as plausible theories of cortical information processing. In these models, the activity of a recurrently connected population of neurons is sent to one or many read-out units through a linear transformation. These models can operate in a chaotic regime which has been proposed as a possible mechanism underlying sustained irregular activity observed in cortical areas [1,2]. Furthermore, models based on RC replicate the neural dynamics involved in decision making [3], interval timing [2], and motor control [1]. However, one biological constraint that has been overlooked in these models is their resistance to small connectivity perturbations such as failures in synaptic transmission, a phenomenon that occurs frequently in healthy circuits without causing any drastic functional changes. Here, we show that different implementations of RC display very little resistance to small synaptic disruptions and discuss the implications of such fragility for RC mechanisms that may be present in neural coding. With the FORCE [1] procedure, networks lost their ability to replicate a jagged sinusoidal signal after a single neuron was removed from the reservoir (Figure 1A). Networks with innate training [2] showed a similar effect on a timing task (Figure 1B). The lag in the timing and the noise in the output both increased monotonically as further neurons were removed (Figure 1C,D); networks reached random performance after ~1.5% of neurons were eliminated. After the suppression of a single neuron, the spectrum of the weight matrix was greatly disturbed and repeated trials displayed unreliable trajectories, as assessed with principal components analysis. When individual synapses were removed instead of neurons, networks reached random performance after ~0.5% of synapses from the reservoir were eliminated.
While living neuronal circuits can withstand small synaptic disruptions without compromising task performance, our results suggest that such disruptions have a catastrophic impact on the behaviour of RC models. Retraining the read-out unit is of limited help, as it yields a completely new solution rather than a fine-grained restructuring of the original one. These results cast doubt on the validity of a large class of models that claim to capture the neuronal mechanisms of cognitive and behavioral tasks.

Figure 1: Performance of damaged reservoirs of 1,000 neurons with FORCE and innate learning algorithms. A. Target signal (green, perfectly replicated with the originally trained network) and the trace of the same network after the removal of one neuron in its reservoir. ...
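The fragility result can be reproduced in miniature with a rate-based echo state reservoir. This is a minimal sketch, not the authors' model: it uses offline ridge regression in place of the FORCE algorithm, a much smaller network, and illustrative constants throughout. "Removing" a neuron means zeroing its recurrent connections after the readout has been trained on the intact network.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T = 200, 2000

# Random recurrent reservoir, rescaled to an illustrative spectral radius
W = rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))
W *= 1.2 / np.max(np.abs(np.linalg.eigvals(W)))

def run(W, T):
    """Collect reservoir states under a weak periodic drive."""
    x = np.zeros(N)
    X = np.empty((T, N))
    for t in range(T):
        x = np.tanh(W @ x + 0.5 * np.sin(2 * np.pi * t / 50))
        X[t] = x
    return X

target = np.sin(2 * np.pi * np.arange(T) / 50)
X = run(W, T)

# Ridge-regression readout (offline stand-in for FORCE training)
w_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(N), X.T @ target)
err_intact = np.sqrt(np.mean((X @ w_out - target) ** 2))

# "Remove" one neuron by zeroing its incoming and outgoing recurrent weights,
# then reuse the readout trained on the intact network.
W_damaged = W.copy()
W_damaged[0, :] = 0.0
W_damaged[:, 0] = 0.0
X_damaged = run(W_damaged, T)
err_damaged = np.sqrt(np.mean((X_damaged @ w_out - target) ** 2))
```

Because the readout was optimized for the intact trajectory, any local lesion that perturbs the collective dynamics degrades the output, which is the qualitative effect shown in Figure 1 of the abstract, albeit at a toy scale.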


BMC Neuroscience | 2014

Structured chaos shapes joint spike-response noise entropy in temporally driven balanced networks

Guillaume Lajoie; Jean-Philippe Thivierge; Eric Shea-Brown

How variable and noisy is the neural code arising from the joint activity of recurrently connected cells? Isolated neurons are known to respond to fluctuating input currents with reliable spike patterns [1,2], but variability in stimulus-evoked spike trains is increasingly pronounced in deeper, more recurrently connected brain areas such as cortex [3]. What the network-level sources of this variability are, and how it might constrain spiking features relevant for coding, remain open questions. We focus on spiking model networks with sparse, random connectivity and balanced excitation and inhibition that reproduce the irregular firing that typifies cortical activity. In such models, activity is known to be chaotic, with extremely strong sensitivity of spike outputs to tiny changes in a network's initial conditions [4-6]. Nevertheless, when subject to temporally fluctuating driving inputs, networks can have chaotic attractors of limited dimension and geometric properties leading to reduced spiking variability at the single-cell level [7]. As recent studies suggest that the impact of noise on network coding cannot be understood by single cell properties alone [8,9], we study mechanisms underlying the joint activity of entire networks. We derive a bound for the entropy of joint spike pattern distributions in large spiking model networks in response to a fluctuating temporal signal. The analysis is based on results from random dynamical systems theory and complemented by detailed numerical simulations. We find that despite very weak conditional correlations between neurons, the resulting joint variability of network responses is surprisingly lower than what would be expected by considering only limited statistical neural interactions. Moreover, joint spiking variability is strongly constrained by the temporal features of input stimuli.


Frontiers in Computational Neuroscience | 2014

Structured chaos shapes spike-response noise entropy in balanced neural networks

Guillaume Lajoie; Jean-Philippe Thivierge; Eric Shea-Brown


Archive | 2012

Chaos and reliability in balanced spiking networks

Guillaume Lajoie; Kevin K. Lin; Eric Shea-Brown

Collaboration


Dive into Guillaume Lajoie's collaborations.

Top Co-Authors


Nedialko I. Krouchev

Montreal Neurological Institute and Hospital
