Network


Latest external collaborations at the country level. Dive into the details by clicking on the dots.

Hotspot


Dive into the research topics where Cheng Ly is active.

Publication


Featured research published by Cheng Ly.


Neural Computation | 2007

Critical Analysis of Dimension Reduction by a Moment Closure Method in a Population Density Approach to Neural Network Modeling

Cheng Ly; Daniel Tranchina

Computational techniques within the population density function (PDF) framework have provided time-saving alternatives to classical Monte Carlo simulations of neural network activity. Efficiency of the PDF method is lost as the underlying neuron model is made more realistic and the number of state variables increases. In a detailed theoretical and computational study, we elucidate strengths and weaknesses of dimension reduction by a particular moment closure method (Cai, Tao, Shelley, & McLaughlin, 2004; Cai, Tao, Rangan, & McLaughlin, 2006) as applied to integrate-and-fire neurons that receive excitatory synaptic input only. When the unitary postsynaptic conductance event has a single-exponential time course, the evolution equation for the PDF is a partial differential-integral equation in two state variables, voltage and excitatory conductance. In the moment closure method, one approximates the conditional kth centered moment of excitatory conductance given voltage by the corresponding unconditioned moment. The result is a system of k coupled partial differential equations with one state variable, voltage, and k coupled ordinary differential equations. Moment closure at k = 2 works well, and at k = 3 works even better, in the regime of high dynamically varying synaptic input rates. Both closures break down at lower synaptic input rates. Phase-plane analysis of the k = 2 problem with typical parameters proves, and reveals why, no steady-state solutions exist below a synaptic input rate that gives a firing rate of 59 s⁻¹ in the full 2D problem. Closure at k = 3 fails for similar reasons. Low firing-rate solutions can be obtained only with parameters for the amplitude or kinetics (or both) of the unitary postsynaptic conductance event that are on the edge of the physiological range. We conclude that this dimension-reduction method gives ill-posed problems for a wide range of physiological parameters, and we suggest future directions.
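For contrast with the PDF approach, the Monte Carlo simulation it is meant to replace can be sketched directly: an integrate-and-fire neuron whose excitatory conductance decays with a single-exponential time course and jumps at Poisson-arriving synaptic events. All parameter values below are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Integrate-and-fire neuron with single-exponential excitatory conductance:
# the 2-D (v, g) model discussed above. All parameters are illustrative.
tau_m, tau_e = 0.02, 0.005          # membrane and synaptic time constants (s)
v_th, v_reset, E_e = 1.0, 0.0, 4.0  # threshold, reset, excitatory reversal
g_jump = 1.2                        # unitary conductance jump
in_rate = 3000.0                    # synaptic input rate (events/s)
dt, T = 1e-4, 5.0

v, g, spikes = 0.0, 0.0, 0
for _ in range(int(T / dt)):
    g += g_jump * rng.poisson(in_rate * dt)  # Poisson synaptic arrivals
    v += dt * (-v / tau_m + g * (E_e - v))   # leak plus conductance drive
    g -= dt * g / tau_e                      # single-exponential decay
    if v >= v_th:                            # threshold crossing: spike, reset
        v, spikes = v_reset, spikes + 1

print(f"mean firing rate ~ {spikes / T:.1f} spikes/s")
```

The PDF method replaces many such trials with a single evolution of the probability density over (v, g), which is where the dimensionality cost discussed in the abstract arises.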


Network: Computation In Neural Systems | 2006

Population density methods for stochastic neurons with realistic synaptic kinetics: Firing rate dynamics and fast computational methods

Felix Apfaltrer; Cheng Ly; Daniel Tranchina

An outstanding problem in computational neuroscience is how to use population density function (PDF) methods to model neural networks with realistic synaptic kinetics in a computationally efficient manner. We explore an application of two-dimensional (2-D) PDF methods to simulating electrical activity in networks of excitatory integrate-and-fire neurons. We formulate a pair of coupled partial differential-integral equations describing the evolution of PDFs for neurons in non-refractory and refractory pools. The population firing rate is given by the total flux of probability across the threshold voltage. We use an operator-splitting method to reduce computation time. We report on the speed and accuracy of PDF results and compare them to those from direct Monte Carlo simulations. We compute temporal frequency response functions for the transduction from the rate of postsynaptic input to population firing rate, and examine their dependence on background synaptic input rate. The behaviors in the 1-D and 2-D cases—corresponding to instantaneous and non-instantaneous synaptic kinetics, respectively—differ markedly from those for a somewhat different transduction: from injected current input to population firing rate output. We extend our method by adding inhibitory input, consider a 3-D to 2-D dimension reduction method, demonstrate its limitations, and suggest directions for future study.
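The core PDF computation, advancing a voltage density and reading the firing rate off as the probability flux across threshold, can be sketched in one dimension. This is a diffusion-approximation toy model, not the paper's 2-D partial differential-integral solver, and all parameters are illustrative:

```python
import numpy as np

# 1-D population density sketch for a leaky integrate-and-fire population:
# evolve the voltage density rho(v, t) with upwind advection and explicit
# diffusion. The population firing rate is the probability flux across
# threshold, and escaping mass is re-injected at the reset potential.
tau, mu, sigma = 0.02, 1.2, 0.3         # leak time (s), mean drive, noise
v_lb, v_th, v_reset = -1.0, 1.0, 0.0
M, dt, steps = 200, 1e-5, 50_000
v = np.linspace(v_lb, v_th, M + 1)      # cell edges
dv = v[1] - v[0]
vc = 0.5 * (v[:-1] + v[1:])             # cell centers
i_reset = int(np.argmin(np.abs(vc - v_reset)))

rho = np.exp(-(vc**2) / 0.02)           # initial density near rest
rho /= rho.sum() * dv
D = sigma**2 / (2 * tau)                # diffusion coefficient
drift = (mu - v[1:-1]) / tau            # advection speed at interior edges
drift_th = max((mu - v_th) / tau, 0.0)  # outward drift at threshold

rate = 0.0
for _ in range(steps):
    # upwind advective flux + central diffusive flux at interior cell edges
    up = np.where(drift > 0, rho[:-1], rho[1:])
    flux = drift * up - D * (rho[1:] - rho[:-1]) / dv
    # absorbing threshold: the escaping flux is the population firing rate
    flux_th = drift_th * rho[-1] + D * rho[-1] / dv
    flux_all = np.concatenate(([0.0], flux, [flux_th]))
    rho = rho - (dt / dv) * (flux_all[1:] - flux_all[:-1])
    rho[i_reset] += dt * flux_th / dv   # re-inject escaped mass at reset
    rate = flux_th

print(f"steady-state population firing rate ~ {rate:.1f} spikes/s")
```

Mass is conserved exactly by construction (everything that crosses threshold reappears at reset), which is a useful sanity check on any PDF solver.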


Journal of Computational Neuroscience | 2009

Synchronization dynamics of two coupled neural oscillators receiving shared and unshared noisy stimuli

Cheng Ly; G. Bard Ermentrout

The response of neurons to external stimuli greatly depends on the intrinsic dynamics of the network. Here, the intrinsic dynamics are modeled as coupling and the external input is modeled as shared and unshared noise. We assume the neurons are repetitively firing action potentials (i.e., neural oscillators), are weakly and identically coupled, and that the external noise is weak. Shared noise can induce bistability between the synchronous and anti-phase states even though the anti-phase state is the only stable state in the absence of noise. We study the Fokker-Planck equation of the system and perform an asymptotic reduction, obtaining an approximate density ρ0. The ρ0 solution is more computationally efficient than both the Monte Carlo simulations and the 2D Fokker-Planck solver, and agrees remarkably well with the full system with weak noise and weak coupling. With moderate noise and coupling, ρ0 is still qualitatively correct despite the weak-noise and weak-coupling assumptions of the asymptotic reduction. Our phase model accurately predicts the behavior of a realistic synaptically coupled Morris-Lecar system.
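The underlying Langevin system can be sketched with an Euler-Maruyama simulation of two coupled phase oscillators receiving shared and unshared noise. The coupling function H and phase sensitivity Z below are illustrative stand-ins (with H(x) = -sin(x) the anti-phase state is the only stable state without noise, matching the setting above); all parameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two weakly coupled phase oscillators with shared and unshared white noise:
#   d(theta_i) = [omega + eps*H(theta_j - theta_i)] dt
#                + Z(theta_i) * (sigma_s dW_shared + sigma_u dW_i)
omega, eps = 1.0, 0.05
sigma_s, sigma_u = 0.2, 0.05          # shared vs. unshared noise strengths
H = lambda x: -np.sin(x)              # coupling: anti-phase stable, noise-free
Z = lambda th: 1.0 + np.cos(th)       # phase sensitivity (type-I-like PRC)
dt, steps = 1e-3, 100_000

th = np.array([0.0, np.pi])           # start in anti-phase
phis = np.empty(steps)
for k in range(steps):
    dWs = np.sqrt(dt) * rng.standard_normal()    # shared increment
    dWu = np.sqrt(dt) * rng.standard_normal(2)   # unshared increments
    drift = omega + eps * H(th[::-1] - th)
    th = th + drift * dt + Z(th) * (sigma_s * dWs + sigma_u * dWu)
    phis[k] = (th[1] - th[0]) % (2 * np.pi)

# with strong enough shared noise, probability mass shifts toward synchrony
near_sync = np.mean((phis < 0.5) | (phis > 2 * np.pi - 0.5))
print(f"fraction of time near synchrony: {near_sync:.2f}")
```

The histogram of the phase difference produced this way is exactly the quantity the reduced density ρ0 is designed to approximate without Monte Carlo sampling.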


Frontiers in Computational Neuroscience | 2012

Cellular and circuit mechanisms maintain low spike co-variability and enhance population coding in somatosensory cortex.

Cheng Ly; Jason W. Middleton; Brent Doiron

The responses of cortical neurons are highly variable across repeated presentations of a stimulus. Understanding this variability is critical for theories of both sensory and motor processing, since response variance affects the accuracy of neural codes. Despite this influence, the cellular and circuit mechanisms that shape the trial-to-trial variability of population responses remain poorly understood. We used a combination of experimental and computational techniques to uncover the mechanisms underlying the response variability of populations of pyramidal (E) cells in layer 2/3 of rat whisker barrel cortex. Spike trains recorded from pairs of E-cells during either spontaneous activity or whisker-deflection responses show similarly low levels of spiking co-variability, despite large differences in network activation between the two states. We developed network models that show how spike threshold non-linearities dilute E-cell spiking co-variability during spontaneous activity and low-velocity whisker deflections. In contrast, during high-velocity whisker deflections, cancellation mechanisms mediated by feedforward inhibition maintain low E-cell pairwise co-variability. Thus, the combination of these two mechanisms ensures low E-cell population variability over a wide range of whisker deflection velocities. Finally, we show how this active decorrelation of population variability leads to a drastic increase in the population information about whisker velocity. The prevalence of spiking non-linearities and feedforward inhibition in the nervous system suggests that the mechanisms for low network variability presented in our study may generalize throughout the brain.


Neural Computation | 2009

Spike train statistics and dynamics with synaptic input from any renewal process: A population density approach

Cheng Ly; Daniel Tranchina

In the probability density function (PDF) approach to neural network modeling, a common simplifying assumption is that the arrival times of elementary postsynaptic events are governed by a Poisson process. This assumption ignores temporal correlations in the input that sometimes have important physiological consequences. We extend PDF methods to models with synaptic event times governed by any modulated renewal process. We focus on the integrate-and-fire neuron with instantaneous synaptic kinetics and a random elementary excitatory postsynaptic potential (EPSP), A. Between presynaptic events, the membrane voltage, v, decays exponentially toward rest, while s, the time since the last synaptic input event, evolves with unit velocity. When a synaptic event arrives, v jumps by A, and s is reset to zero. If v crosses the threshold voltage, an action potential occurs, and v is reset to v_reset. The probability per unit time of a synaptic event at time t, given the elapsed time s since the last event, h(s, t), depends on the specifics of the renewal process. We study how the regularity of the train of synaptic input events affects the output spike rate, the PDF and coefficient of variation (CV) of the interspike interval, and the autocorrelation function of the output spike train. In the limit of a deterministic, clocklike train of input events, the PDF of the interspike interval converges to a sum of delta functions, with coefficients determined by the PDF for A. The limiting autocorrelation function of the output spike train is a sum of delta functions whose coefficients fall under a damped oscillatory envelope. When the EPSP CV is equal to 0.45, an intersynaptic-event-interval CV of approximately 0.35 is functionally equivalent to a deterministic, periodic train of synaptic input events (CV = 0) with respect to spike statistics. We discuss the relevance to neural network simulations.
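The event-driven dynamics described above (exponential decay between events, a jump by a random A, reset on threshold crossing) simulate directly, using a gamma renewal process as one concrete non-Poisson input. The two CVs (0.45 for the EPSP, 0.35 for the interval) come from the abstract; all other parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

# Integrate-and-fire neuron with instantaneous synaptic kinetics driven by a
# stationary gamma renewal process (a simple non-Poisson renewal input).
tau_m = 0.02                      # membrane time constant (s)
v_th, v_reset = 1.0, 0.0
mean_A, cv_A = 0.12, 0.45         # random EPSP amplitude A and its CV
in_rate, cv_T = 500.0, 0.35       # synaptic event rate and interval CV
n_events = 200_000

# gamma distribution: shape k = 1 / CV^2, scale = mean / k
kT, kA = 1.0 / cv_T**2, 1.0 / cv_A**2
iei = rng.gamma(kT, (1.0 / in_rate) / kT, n_events)  # inter-event intervals
amps = rng.gamma(kA, mean_A / kA, n_events)          # EPSP jump sizes

v, t, t_last, isis = 0.0, 0.0, 0.0, []
for dt_ev, A in zip(iei, amps):
    t += dt_ev
    v = v * np.exp(-dt_ev / tau_m) + A   # exact decay between events, then jump
    if v >= v_th:                        # threshold crossing: record ISI, reset
        isis.append(t - t_last)
        t_last, v = t, v_reset

isis = np.asarray(isis)
cv_out = isis.std() / isis.mean()
print(f"output spikes: {isis.size}, ISI CV: {cv_out:.2f}")
```

Sweeping cv_T from 0 (clocklike) upward and watching the output ISI statistics is a Monte Carlo version of the regularity study the paper carries out with PDF methods.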


PLOS Computational Biology | 2009

Divisive Gain Modulation with Dynamic Stimuli in Integrate-and-Fire Neurons

Cheng Ly; Brent Doiron

The modulation of the sensitivity, or gain, of neural responses to input is an important component of neural computation. It has been shown that divisive gain modulation of neural responses can result from a stochastic shunting from balanced (mixed excitation and inhibition) background activity. This gain control scheme was developed and explored with static inputs, where the membrane and spike train statistics were stationary in time. However, input statistics, such as the firing rates of pre-synaptic neurons, are often dynamic, varying on timescales comparable to typical membrane time constants. Using a population density approach for integrate-and-fire neurons with dynamic and temporally rich inputs, we find that the same fluctuation-induced divisive gain modulation is operative for dynamic inputs driving nonequilibrium responses. Moreover, the degree of divisive scaling of the dynamic response is quantitatively the same as that of the steady-state responses—thus, gain modulation via balanced conductance fluctuations generalizes in a straightforward way to a dynamic setting.
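The static version of this gain-control scheme can be sketched in a few lines. This is not the paper's population density calculation: the balanced background is caricatured as an added shunting conductance g0 plus stronger voltage noise in a leaky integrate-and-fire neuron, and all parameters are hypothetical. Comparing f-I slopes at low versus high background illustrates the divisive scaling.

```python
import numpy as np

rng = np.random.default_rng(3)

def firing_rate(mu, g0, sigma, T=10.0, dt=1e-4, tau=0.02, v_th=1.0):
    """Rate of an LIF: tau dv/dt = -(1 + g0) v + mu + noise, reset at v_th."""
    noise = sigma * np.sqrt(dt / tau) * rng.standard_normal(int(T / dt))
    v, n = 0.0, 0
    for xi in noise:
        v += (dt / tau) * (-(1.0 + g0) * v + mu) + xi
        if v >= v_th:
            v, n = 0.0, n + 1
    return n / T

drives = [1.2, 1.6, 2.0]
# low background: no shunt, weak fluctuations; high: shunt + fluctuations
low = [firing_rate(mu, g0=0.0, sigma=0.2) for mu in drives]
high = [firing_rate(mu, g0=1.0, sigma=0.4) for mu in drives]
gain_low = (low[-1] - low[0]) / (drives[-1] - drives[0])
gain_high = (high[-1] - high[0]) / (drives[-1] - drives[0])
print(f"f-I slope: low background {gain_low:.1f}, high background {gain_high:.1f}")
```

With these settings the high-background slope is typically reduced relative to the low-background slope, the divisive (slope-scaling) rather than subtractive (curve-shifting) signature.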


Journal of Computational Neuroscience | 2011

Phase-resetting curve determines how BK currents affect neuronal firing

Cheng Ly; Tamar Melman; Alison L. Barth; G. Bard Ermentrout

BK channels are large-conductance potassium channels gated by calcium and voltage. Paradoxically, blocking these channels has been shown experimentally to increase or decrease the firing rate of neurons, depending on the neural subtype and brain region. The mechanism for how this current can alter the firing rates of different neurons remains poorly understood. Using phase-resetting curve (PRC) theory, we determine when BK channels increase or decrease the firing rates in neural models. The addition of BK currents always decreases the firing rate when the PRC has only a positive region (type I). When the PRC has a negative region (type II), BK currents can increase the firing rate. The influence of BK channels on firing rate in the presence of other conductances, such as I_m and I_h, as well as with different amplitudes of depolarizing input, was also investigated. These results provide a formal explanation for the apparently contradictory effects of BK channel antagonists on firing rates.
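The PRC argument can be illustrated numerically: to first order, the frequency change from a small added current I(θ) is proportional to the integral of PRC(θ) · I(θ) over one cycle. The PRC shapes and the outward BK-like current pulse below are hypothetical waveforms chosen only to show the sign difference.

```python
import numpy as np

# An outward (negative) BK-like current always slows a type-I (purely
# positive) PRC, but can speed up a type-II PRC with a negative lobe.
theta = np.linspace(0, 2 * np.pi, 1000, endpoint=False)
dth = theta[1] - theta[0]

prc_type1 = 1.0 - np.cos(theta)              # >= 0 everywhere (type I)
prc_type2 = -np.sin(theta)                   # negative lobe on (0, pi) (type II)
i_bk = -np.exp(-((theta - 0.8) ** 2) / 0.1)  # brief outward current after spike

dfreq1 = np.sum(prc_type1 * i_bk) * dth      # first-order frequency change
dfreq2 = np.sum(prc_type2 * i_bk) * dth
print(f"type I: {dfreq1:+.3f} (slows), type II: {dfreq2:+.3f} (speeds up)")
```

Because the BK pulse lands where the type-II PRC is negative, the product of two negative quantities is positive and the oscillator speeds up, which is the mechanism behind the paradoxical experimental results.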


Siam Journal on Applied Mathematics | 2015

One-Dimensional Population Density Approaches to Recurrently Coupled Networks of Neurons with Noise

Wilten Nicola; Cheng Ly; Sue Ann Campbell

Mean-field systems have been previously derived for networks of coupled, two-dimensional, integrate-and-fire neurons such as the Izhikevich, adapting exponential, and quartic integrate-and-fire models, among others. Unfortunately, the mean-field systems have a degree of frequency error, and the networks analyzed often do not include noise when there is adaptation. Here, we derive a one-dimensional partial differential equation (PDE) approximation for the marginal voltage density under a first-order moment closure for coupled networks of integrate-and-fire neurons with white noise inputs. The PDE has substantially less frequency error than the mean-field system and provides a great deal more information, at the cost of analytical tractability. The convergence properties of the mean-field system in the low noise limit are elucidated. A novel method for the analysis of the stability of the asynchronous tonic firing solution is also presented and implemented. Unlike in previous attempts at stability analysis with these network types, information about the marginal densities of the adaptation variables is used. This method can in principle be applied to other systems with nonlinear PDEs.


Siam Journal on Applied Dynamical Systems | 2010

Analysis of Recurrent Networks of Pulse-Coupled Noisy Neural Oscillators

Cheng Ly; G. Bard Ermentrout

Synchronization of neural oscillators has been well studied by both theorists and experimentalists. However, realistic details are often disregarded for tractability. Here, we consider a recurrent network of pulse-coupled neural oscillators since synaptic communication is often mediated by spikes. Neurons receive many stochastic inputs that have effects depending on the state of the neuron; thus, we incorporate phase-dependent (multiplicative) noise. Previous analysis of neural oscillators with additive noise (not necessarily weak) cannot be directly applied here because this results in analytically intractable equations that possibly have singularities. However, assuming weak coupling and weak noise, we accurately and analytically characterize various phenomena by a linear stability analysis around an asymptotic steady-state density approximation. Depending on the phase resetting curve, the system can undergo a supercritical Andronov-Hopf bifurcation as the recurrent coupling strength of the oscillator is increased, leading to synchronous and oscillatory population activity. The analysis is extended to include recurrent input through synapses with kinetics—it generally stabilizes the incoherent state. Moreover, input via synapses can uncover a bifurcation that does not exist with instantaneous recurrent input. The analysis is further generalized to recurrent input that is a smooth function of the population firing rate. The results are applied to a closed-loop system of neural oscillators that receive feedback mediated by a noisy population of excitable integrate-and-fire neurons. Our results extend the power of perturbation methods for dealing with equations that, a priori, appear intractable.
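A toy network in the spirit of this setup (not the paper's exact model, and with illustrative parameters) pulse-couples N noisy phase oscillators: each oscillator crossing 2π emits a pulse that shifts every phase through the PRC, and the noise enters multiplicatively through the same phase sensitivity.

```python
import numpy as np

rng = np.random.default_rng(4)

# N recurrently pulse-coupled phase oscillators with multiplicative noise.
N, omega, g, sigma = 200, 1.0, 0.3, 0.1
prc = lambda th: 1.0 - np.cos(th)           # type-I phase resetting curve
dt, steps = 1e-3, 20_000

th = rng.uniform(0, 2 * np.pi, N)           # random initial phases
R = np.empty(steps)
for k in range(steps):
    fired = th >= 2 * np.pi
    th[fired] -= 2 * np.pi                  # wrap oscillators that spiked
    kick = (g / N) * fired.sum()            # instantaneous recurrent pulse
    noise = sigma * np.sqrt(dt) * rng.standard_normal(N)
    # both the pulse and the noise act through the phase sensitivity (PRC)
    th = th + omega * dt + prc(th) * (kick + noise)
    R[k] = np.abs(np.mean(np.exp(1j * th))) # Kuramoto order parameter

R_bar = R[steps // 2:].mean()               # time-averaged synchrony
print(f"mean order parameter R ~ {R_bar:.2f}")
```

Whether the order parameter stays near the incoherent value or grows as the coupling g increases is exactly the Andronov-Hopf transition that the paper's linear stability analysis characterizes analytically.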


Siam Journal on Applied Dynamical Systems | 2014

Dynamics of Coupled Noisy Neural Oscillators with Heterogeneous Phase Resetting Curves

Cheng Ly

Pulse-coupled phase oscillators have been utilized in a variety of contexts. Motivated by neuroscience, we study a network of pulse-coupled phase oscillators receiving independent and correlated noise. An additional physiological attribute, heterogeneity, is incorporated in the phase-resetting curve (PRC), which is a vital entity for modeling the biophysical dynamics of oscillators. An accurate probability density or mean-field description is high-dimensional, requiring reduction methods for tractability. We present a reduction method to capture the pairwise synchrony via the probability density of the phase differences, and we explore the robustness of the method. We find the reduced method can capture some of the synchronous dynamics in these networks. The variance of the noisy period (or spike times) in this network is also considered. In particular, we find that phase oscillators with predominantly positive PRCs (type I) have larger variance with inhibitory pulse-coupling than PRCs with a larger neg...

Collaboration


Dive into Cheng Ly's collaborations.

Top Co-Authors

Andrea K. Barreiro, Southern Methodist University
Brent Doiron, University of Pittsburgh
Alison L. Barth, Carnegie Mellon University
Felix Apfaltrer, Courant Institute of Mathematical Sciences
Gary Marsat, West Virginia University
Seth H. Weinberg, Virginia Commonwealth University