Publications


Featured research published by Eric Shea-Brown.


Nature | 2007

Correlation between neural spike trains increases with firing rate

Jaime de la Rocha; Brent Doiron; Eric Shea-Brown; Krešimir Josić; Alex D. Reyes

Populations of neurons in the retina, olfactory system, visual and somatosensory thalamus, and several cortical regions show temporal correlation between the discharge times of their action potentials (spike trains). Correlated firing has been linked to stimulus encoding, attention, stimulus discrimination, and motor behaviour. Nevertheless, the mechanisms underlying correlated spiking are poorly understood, and its coding implications are still debated. It is not clear, for instance, whether correlations between the discharges of two neurons are determined solely by the correlation between their afferent currents, or whether they also depend on the mean and variance of the input. We addressed this question by computing the spike train correlation coefficient of unconnected pairs of in vitro cortical neurons receiving correlated inputs. Notably, even when the input correlation remained fixed, the spike train output correlation increased with the firing rate, but was largely independent of spike train variability. With a combination of analytical techniques and numerical simulations using ‘integrate-and-fire’ neuron models we show that this relationship between output correlation and firing rate is robust to input heterogeneities. Finally, this overlooked relationship is replicated by a standard threshold-linear model, demonstrating the universality of the result. This connection between the rate and correlation of spiking activity links two fundamental features of the neural code.
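
The core effect (fixed input correlation, output spike-count correlation growing with firing rate) can be reproduced in a few lines. A minimal sketch, not the authors' code: two unconnected leaky integrate-and-fire neurons share a fraction c of a white-noise input, and all parameter values are illustrative.

```python
import numpy as np

def lif_pair_corr(mu, c=0.3, sigma=0.5, n_trials=800, T=2000.0, dt=0.1,
                  tau=20.0, v_th=1.0, seed=0):
    """Two unconnected LIF neurons share a fraction c of their white-noise
    input; return (mean firing rate in spikes/ms, spike-count correlation)."""
    rng = np.random.default_rng(seed)
    n_steps = int(T / dt)
    v = np.zeros((n_trials, 2))
    counts = np.zeros((n_trials, 2))
    for _ in range(n_steps):
        shared = rng.standard_normal((n_trials, 1))   # common input
        indep = rng.standard_normal((n_trials, 2))    # private input
        noise = sigma * (np.sqrt(c) * shared + np.sqrt(1.0 - c) * indep)
        v += (dt / tau) * (mu - v) + np.sqrt(dt / tau) * noise
        spiked = v >= v_th
        counts += spiked
        v[spiked] = 0.0                               # reset after a spike
    rate = counts.mean() / T
    rho = np.corrcoef(counts[:, 0], counts[:, 1])[0, 1]
    return rate, rho

# same input correlation c, different mean drive => different firing rates
rate_lo, rho_lo = lif_pair_corr(mu=0.8)   # weaker drive, lower rate
rate_hi, rho_hi = lif_pair_corr(mu=1.3)   # stronger drive, higher rate
```

With the drive increased and everything else held fixed, rho_hi typically exceeds rho_lo, mirroring the rate dependence of output correlation reported in the paper.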


Journal of Neural Engineering | 2007

Toward closed-loop optimization of deep brain stimulation for Parkinson's disease: concepts and lessons from a computational model

Xiao-Jiang Feng; Brian Greenwald; Herschel Rabitz; Eric Shea-Brown; Robert L. Kosut

Deep brain stimulation (DBS) of the subthalamic nucleus with periodic, high-frequency pulse trains is an increasingly standard therapy for advanced Parkinson's disease. Here, we propose that a closed-loop global optimization algorithm may identify novel DBS waveforms that could be more effective than their high-frequency counterparts. We use results from a computational model of the Parkinsonian basal ganglia to illustrate general issues relevant to eventual clinical or experimental tests of such an algorithm. Specifically, while the relationship between DBS characteristics and performance is highly complex, global search methods appear able to identify novel and effective waveforms with convergence rates that are acceptably fast to merit further investigation in laboratory or clinical settings.
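
The closed-loop idea can be caricatured in a few lines: treat the model's response to a stimulation waveform as a black-box cost (a pathology measure plus an energy penalty) and let a global search propose waveform parameters. The cost below is a made-up surrogate, not the paper's basal ganglia model, and plain random search stands in for whatever global optimizer closes the loop; the "pathology falls off above ~100 Hz" structure is purely illustrative.

```python
import numpy as np

def surrogate_cost(freq, amp):
    """Hypothetical stand-in for 'pathology + stimulation energy'.
    Pathology drops for frequencies above ~100 Hz; energy grows with
    amplitude and frequency. Purely illustrative, not the paper's model."""
    pathology = (1.0 + 0.5 * np.exp(-amp)) / (1.0 + np.exp((freq - 100.0) / 15.0))
    energy = 1e-3 * amp * freq
    return pathology + energy

def random_search(n_iter=500, seed=1):
    """Simplest possible global search: sample waveform parameters at
    random and keep the best, mimicking a closed-loop propose/evaluate loop."""
    rng = np.random.default_rng(seed)
    best_x, best_cost = None, np.inf
    for _ in range(n_iter):
        freq = rng.uniform(10.0, 200.0)   # pulse frequency (Hz)
        amp = rng.uniform(0.1, 5.0)       # pulse amplitude (arbitrary units)
        cost = surrogate_cost(freq, amp)
        if cost < best_cost:
            best_x, best_cost = (freq, amp), cost
    return best_x, best_cost

(best_freq, best_amp), best_cost = random_search()
```

The search concentrates on the low-cost region of the toy landscape; in the paper, the same propose/evaluate loop runs against the basal ganglia model rather than a closed-form cost.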


Physical Review Letters | 2008

Correlation and Synchrony Transfer in Integrate-and-Fire Neurons: Basic Properties and Consequences for Coding

Eric Shea-Brown; Krešimir Josić; Jaime de la Rocha; Brent Doiron

One of the fundamental characteristics of a nonlinear system is how it transfers correlations in its inputs to correlations in its outputs. This is particularly important in the nervous system, where correlations between spiking neurons are prominent. Using linear response and asymptotic methods for pairs of unconnected integrate-and-fire (IF) neurons receiving white noise inputs, we show that this correlation transfer depends on the output spike firing rate in a strong, stereotyped manner, and is, surprisingly, almost independent of the interspike variance. For cells receiving heterogeneous inputs, we further show that correlation increases with the geometric mean spiking rate in the same stereotyped manner, greatly extending the generality of this relationship. We present an immediate consequence of this relationship for population coding via tuning curves.


Journal of Computational and Nonlinear Dynamics | 2006

Optimal Inputs for Phase Models of Spiking Neurons

Jeff Moehlis; Eric Shea-Brown; Herschel Rabitz

Variational methods are used to determine the optimal currents that elicit spikes in various phase reductions of neural oscillator models. We show that, for a given reduced neuron model and target spike time, there is a unique current that minimizes a square-integral measure of its amplitude. For intrinsically oscillatory models, we further demonstrate that the form and scaling of this current is determined by the model’s phase response curve. These results reflect the role of intrinsic neural dynamics in determining the time course of synaptic inputs to which a neuron is optimally tuned to respond, and are illustrated using phase reductions of neural models valid near typical bifurcations to periodic firing, as well as the Hodgkin-Huxley equations. DOI: 10.1115/1.2338654

Phase-reduced models of neurons have traditionally been used to investigate either the patterns of synchrony that result from the type and architecture of coupling [1–8] or the response of large groups of oscillators to external stimuli [9–11]. In all of these cases, the inputs to the model cells were fixed by definition of the model at the outset, and the dynamics of phase models of networks or populations were analyzed in detail. The present paper takes a complementary, control-theoretic approach that is related to probabilistic “spike-triggered” methods [12]: we fix at the outset a feature of the dynamical trajectories of interest – spiking at a precise time t1 – and study the neural inputs that lead to this outcome. By computing the optimal such input, according to a measure of the input strength required to elicit the spike, we identify the signal to which the neuron is optimally “tuned” to respond.

We view the present work as part of the first attempts [13,14] to understand the dynamical response of neurons using control theory. As we expect that insights from this general perspective will be combined with the “forward” dynamics results that Phil Holmes and many others have derived to ultimately enhance our understanding of neural processing, we hope that it will serve as a fitting tribute to his work.


PLOS Computational Biology | 2012

Impact of Network Structure and Cellular Response on Spike Time Correlations

James Trousdale; Yu Hu; Eric Shea-Brown; Krešimir Josić

Novel experimental techniques reveal the simultaneous activity of larger and larger numbers of neurons. As a result there is increasing interest in the structure of cooperative – or correlated – activity in neural populations, and in the possible impact of such correlations on the neural code. A fundamental theoretical challenge is to understand how the architecture of network connectivity along with the dynamical properties of single cells shape the magnitude and timescale of correlations. We provide a general approach to this problem by extending prior techniques based on linear response theory. We consider networks of general integrate-and-fire cells with arbitrary architecture, and provide explicit expressions for the approximate cross-correlation between constituent cells. These correlations depend strongly on the operating point (input mean and variance) of the neurons, even when connectivity is fixed. Moreover, the approximations admit an expansion in powers of the matrices that describe the network architecture. This expansion can be readily interpreted in terms of paths between different cells. We apply our results to large excitatory-inhibitory networks, and demonstrate first how precise balance – or lack thereof – between the strengths and timescales of excitatory and inhibitory synapses is reflected in the overall correlation structure of the network. We then derive explicit expressions for the average correlation structure in randomly connected networks. These expressions help to identify the important factors that shape coordinated neural activity in such networks.
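
The paper's expansion in powers of the connectivity matrix can be illustrated at zero frequency: in a linearized network the covariance is C = (I - K)^{-1} C0 (I - K)^{-T}, and truncating the Neumann series (I - K)^{-1} ≈ I + K + K^2 + ... keeps only interaction paths up to a given length. A sketch with illustrative numbers; K here is a generic random gain matrix, not a fitted interaction kernel.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 50
# random sparse gain matrix; entries small enough that the series converges
K = (rng.random((n, n)) < 0.1) * 0.05
np.fill_diagonal(K, 0.0)

C0 = np.eye(n)                      # uncorrelated intrinsic variability
P = np.linalg.inv(np.eye(n) - K)
C_full = P @ C0 @ P.T               # linear-response covariance

def C_paths(order):
    """Keep only interaction paths of length <= order between cells."""
    S = sum(np.linalg.matrix_power(K, k) for k in range(order + 1))
    return S @ C0 @ S.T

err = [np.linalg.norm(C_paths(m) - C_full) for m in range(5)]
```

The truncation error shrinks as longer paths are included, which is the sense in which network-wide correlations decompose into contributions from direct connections, length-two paths, and so on.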


Physical Review E | 2011

Stochastic differential equation models for ion channel noise in Hodgkin-Huxley neurons.

Joshua H. Goldwyn; Nikita S. Imennov; Michael Famulare; Eric Shea-Brown

The random transitions of ion channels between conducting and nonconducting states generate a source of internal fluctuations in a neuron, known as channel noise. The standard method for modeling the states of ion channels nonlinearly couples continuous-time Markov chains to a differential equation for voltage. Beginning with the work of R. F. Fox and Y.-N. Lu [Phys. Rev. E 49, 3421 (1994)], there have been attempts to generate simpler models that use stochastic differential equations (SDEs) to approximate the stochastic spiking activity produced by Markov chain models. Recent numerical investigations, however, have raised doubts that SDE models can capture the stochastic dynamics of Markov chain models. We analyze three SDE models that have been proposed as approximations to the Markov chain model: one that describes the states of the ion channels and two that describe the states of the ion channel subunits. We show that the former channel-based approach can capture the distribution of channel noise and its effects on spiking in a Hodgkin-Huxley neuron model to a degree not previously demonstrated, but the latter two subunit-based approaches cannot. Our analysis provides intuitive and mathematical explanations for why this is the case. The temporal correlation in the channel noise is determined by the combinatorics of bundling subunits into channels, but the subunit-based approaches do not correctly account for this structure. Our study confirms and elucidates the findings of previous numerical investigations of subunit-based SDE models. Moreover, it presents evidence that Markov chain models of the nonlinear, stochastic dynamics of neural membranes can be accurately approximated by SDEs. This finding opens a door to future modeling work using SDE techniques to further illuminate the effects of ion channel fluctuations on electrically active cells.
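
The channel-based idea is easiest to see for a single two-state channel type: simulate the exact Markov chain for N channels alongside the corresponding SDE (drift from the kinetics, diffusion shrinking as 1/sqrt(N)) and compare the open-fraction statistics. A toy sketch with made-up rates, not the Hodgkin-Huxley kinetics analyzed in the paper:

```python
import numpy as np

def simulate(N=1000, alpha=0.5, beta=1.5, dt=0.01, n_steps=100_000, seed=3):
    """Open fraction of N independent two-state channels: exact Markov
    chain (binomial transitions per step) vs its SDE approximation."""
    rng = np.random.default_rng(seed)
    p_open = alpha / (alpha + beta)     # stationary open probability
    n_open = int(N * p_open)            # Markov state: number of open channels
    x = p_open                          # SDE state: open fraction
    mc = np.empty(n_steps)
    sde = np.empty(n_steps)
    for t in range(n_steps):
        # Markov chain: each closed channel opens w.p. alpha*dt, and vice versa
        n_open += rng.binomial(N - n_open, alpha * dt) - rng.binomial(n_open, beta * dt)
        # SDE: matching drift, and diffusion that scales as 1/sqrt(N)
        drift = alpha * (1.0 - x) - beta * x
        diffusion = np.sqrt((alpha * (1.0 - x) + beta * x) / N)
        x += drift * dt + diffusion * np.sqrt(dt) * rng.standard_normal()
        x = min(max(x, 0.0), 1.0)       # keep the fraction in [0, 1]
        mc[t] = n_open / N
        sde[t] = x
    return mc, sde

mc, sde = simulate()
```

Both traces fluctuate around p = alpha/(alpha + beta) with variance near the theoretical p(1 - p)/N; this is the agreement a channel-based SDE is meant to capture, while the subunit-based failures the paper analyzes only appear for multi-subunit channels.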


Neural Computation | 2009

Stimulus-dependent correlations and population codes

Krešimir Josić; Eric Shea-Brown; Brent Doiron; Jaime de la Rocha

The magnitude of correlations between stimulus-driven responses of pairs of neurons can itself be stimulus dependent. We examine how this dependence affects the information carried by neural populations about the stimuli that drive them. Stimulus-dependent changes in correlations can both carry information directly and modulate the information separately carried by the firing rates and variances. We use Fisher information to quantify these effects and show that, although stimulus-dependent correlations often carry little information directly, their modulatory effects on the overall information can be large. In particular, if the stimulus dependence is such that correlations increase with stimulus-induced firing rates, this can significantly enhance the information of the population when the structure of correlations is determined solely by the stimulus. However, in the presence of additional strong spatial decay of correlations, such stimulus dependence may have a negative impact. Opposite relationships hold when correlations decrease with firing rates.
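
The workhorse quantity here is the linear Fisher information J(s) = f'(s)^T Sigma(s)^{-1} f'(s). A minimal sketch with an illustrative homogeneous population shows one side of the story: when all neurons share the same tuning slope, raising a uniform positive noise correlation lowers J, because the shared fluctuation cannot be averaged away. The paper's point is that stimulus dependence of Sigma(s) can then add to or subtract from this baseline.

```python
import numpy as np

def linear_fisher(fprime, Sigma):
    """Linear Fisher information J = f'(s)^T Sigma^{-1} f'(s)."""
    return float(fprime @ np.linalg.solve(Sigma, fprime))

N = 20
fprime = np.ones(N)   # identical tuning slopes (illustrative)
var = 1.0

Js = []
for rho in (0.0, 0.1, 0.3):
    # uniform correlation rho between every pair of neurons
    Sigma = var * ((1.0 - rho) * np.eye(N) + rho * np.ones((N, N)))
    Js.append(linear_fisher(fprime, Sigma))
# closed form for this special case: J = N / (var * (1 - rho + N * rho))
```

Here J drops from N at rho = 0 toward roughly 1/(var * rho) for large N, the classic information-limiting effect of uniform positive correlations with identical tuning.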


Physical Review E | 2010

Time scales of spike-train correlation for neural oscillators with common drive

Andrea K. Barreiro; Eric Shea-Brown; Evan L. Thilo

We examine the effect of the phase-resetting curve on the transfer of correlated input signals into correlated output spikes in a class of neural models receiving noisy superthreshold stimulation. We use linear-response theory to approximate the spike correlation coefficient in terms of moments of the associated exit time problem and contrast the results for type I vs type II models and across the different time scales over which spike correlations can be assessed. We find that, on long time scales, type I oscillators transfer correlations much more efficiently than type II oscillators. On short time scales this trend reverses, with the relative efficiency switching at a time scale that depends on the mean and standard deviation of input currents. This switch occurs over time scales that could be exploited by downstream circuits.


PLOS Computational Biology | 2014

The sign rule and beyond: boundary effects, flexibility, and noise correlations in neural population codes.

Yu Hu; Joel Zylberberg; Eric Shea-Brown

Over repeated presentations of the same stimulus, sensory neurons show variable responses. This “noise” is typically correlated between pairs of cells, and a question with a rich history in neuroscience is how these noise correlations impact the population’s ability to encode the stimulus. Here, we consider a very general setting for population coding, investigating how information varies as a function of noise correlations, with all other aspects of the problem – neural tuning curves, etc. – held fixed. This work yields unifying insights into the role of noise correlations. These are summarized in the form of theorems, and illustrated with numerical examples involving neurons with diverse tuning curves. Our main contributions are as follows. (1) We generalize previous results to prove a sign rule (SR): if noise correlations between pairs of neurons have opposite signs vs. their signal correlations, then coding performance will improve compared to the independent case. This holds for three different metrics of coding performance, and for arbitrary tuning curves and levels of heterogeneity. This generality is true for our other results as well. (2) As also pointed out in the literature, the SR does not provide a necessary condition for good coding. We show that a diverse set of correlation structures can improve coding. Many of these violate the SR, as do experimentally observed correlations. There is structure to this diversity: we prove that the optimal correlation structures must lie on boundaries of the possible set of noise correlations. (3) We provide a novel set of necessary and sufficient conditions, under which the coding performance (in the presence of noise) will be as good as it would be if there were no noise present at all.
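
The sign rule itself is easy to check numerically with a toy Gaussian population and the linear Fisher information J = f'^T Sigma^{-1} f': give half the neurons positive and half negative tuning slopes (so cross-group signal correlations are negative) and add positive noise correlation only between those oppositely tuned pairs. All parameter values here are illustrative, not from the paper.

```python
import numpy as np

def linear_fisher(fprime, Sigma):
    """Linear Fisher information J = f'(s)^T Sigma^{-1} f'(s)."""
    return float(fprime @ np.linalg.solve(Sigma, fprime))

N = 20
half = N // 2
# opposite tuning slopes => negative signal correlation across the two groups
fprime = np.concatenate([np.ones(half), -np.ones(half)])

J_indep = linear_fisher(fprime, np.eye(N))   # uncorrelated baseline

# noise correlation c only between oppositely tuned pairs: the sign of the
# noise correlation (positive) is opposite to the signal correlation (negative)
c = 0.05
cross = np.zeros((N, N))
cross[:half, half:] = 1.0
cross[half:, :half] = 1.0
Sigma_sr = np.eye(N) + c * cross

J_sr = linear_fisher(fprime, Sigma_sr)
```

As the sign rule predicts, J_sr exceeds J_indep (here by the factor 1/(1 - c*N/2) = 2), because the correlated noise lies along directions a linear readout can cancel.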


Journal of Nonlinear Science | 2009

Reliability of Coupled Oscillators

Kevin K. Lin; Eric Shea-Brown; Lai-Sang Young

We study the reliability of phase oscillator networks in response to fluctuating inputs. Reliability means that an input elicits essentially identical responses upon repeated presentations, regardless of the network’s initial condition. Single oscillators are well known to be reliable. We show in this paper that unreliable behavior can occur in a network as small as a coupled oscillator pair in which the signal is received by the first oscillator and relayed to the second with feedback. A geometric explanation based on shear-induced chaos at the onset of phase-locking is proposed. We treat larger networks as decomposed into modules connected by acyclic graphs, and give a mathematical analysis of the acyclic parts. Moreover, for networks in this class, we show how the source of unreliability can be localized, and address questions concerning downstream propagation of unreliability once it is produced.
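
The baseline fact the abstract starts from, that a single driven oscillator is reliable, is easy to see numerically: drive many copies of one phase oscillator, started at different phases, with the same frozen noisy input, and watch the phases collapse onto a single trajectory. A sketch with an illustrative sinusoidal phase sensitivity; the paper's analysis concerns networks, where this reliability can fail.

```python
import numpy as np

rng = np.random.default_rng(5)
dt, n_steps, omega = 0.01, 20_000, 1.0
frozen_input = rng.standard_normal(n_steps)   # one fixed noisy signal

# many initial conditions for the SAME oscillator receiving the SAME input
theta = np.linspace(0.0, 2.0 * np.pi, 20, endpoint=False)

spread = np.empty(n_steps)
for t in range(n_steps):
    # Euler-Maruyama for dtheta = omega dt + sin(theta) dW (sinusoidal PRC)
    theta = theta + omega * dt + np.sin(theta) * np.sqrt(dt) * frozen_input[t]
    # circular spread: 0 when all phases coincide, near 1 when spread out
    spread[t] = 1.0 - np.abs(np.exp(1j * theta).mean())
```

The spread decays toward zero: the same input pulls the oscillator onto the same response regardless of initial condition, which is the (negative Lyapunov exponent) sense of reliability; the paper shows how feedback between coupled oscillators can reverse this.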

Collaboration


Dive into Eric Shea-Brown's collaboration.

Top Co-Authors

Fred Rieke, University of Washington
Yu Hu, University of Washington
Andrea K. Barreiro, Southern Methodist University
Nicholas Cain, University of Washington
Brent Doiron, Courant Institute of Mathematical Sciences