Milad Lankarany
Concordia University
Publication
Featured research published by Milad Lankarany.
Frontiers in Cellular Neuroscience | 2015
Stéphanie Ratté; Milad Lankarany; Young-Ah Rho; Adam Patterson; Steven A. Prescott
Neurons rely on action potentials, or spikes, to encode information. But spikes can encode different stimulus features in different neurons. We show here through simulations and experiments how neurons encode the integral or derivative of their input based on the distinct tuning properties conferred upon them by subthreshold currents. Slow-activating subthreshold inward (depolarizing) current mediates positive feedback control of subthreshold voltage, sustaining depolarization and allowing the neuron to spike on the basis of its integrated stimulus waveform. Slow-activating subthreshold outward (hyperpolarizing) current mediates negative feedback control of subthreshold voltage, truncating depolarization and forcing the neuron to spike on the basis of its differentiated stimulus waveform. Depending on its direction, slow-activating subthreshold current cooperates or competes with fast-activating inward current during spike initiation. This explanation predicts that sensitivity to the rate of change of stimulus intensity differs qualitatively between integrators and differentiators. This was confirmed experimentally in spinal sensory neurons that naturally behave as specialized integrators or differentiators. Predicted sensitivity to different stimulus features was confirmed by covariance analysis. Integration and differentiation, which are themselves inverse operations, are thus shown to be implemented by the slow feedback mediated by oppositely directed subthreshold currents expressed in different neurons.
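The opposing roles of the slow subthreshold currents described above can be illustrated with a toy two-variable model (the `simulate` helper and all parameters are illustrative assumptions, not the spinal-neuron models used in the paper): slow positive feedback sustains the response to a step input (integrator-like), while slow negative feedback truncates it (differentiator-like).

```python
import numpy as np

def simulate(slow_sign, I, dt=0.1, tau_v=10.0, tau_w=100.0, k=0.5):
    """Toy subthreshold model: v is a voltage-like variable, w a slow feedback current.
    slow_sign=+1 -> slow inward current (positive feedback, integrator-like);
    slow_sign=-1 -> slow outward current (negative feedback, differentiator-like)."""
    v = np.zeros(len(I))
    w = 0.0
    for t in range(1, len(I)):
        w += dt / tau_w * (slow_sign * k * v[t - 1] - w)  # slow feedback
        v[t] = v[t - 1] + dt / tau_v * (-v[t - 1] + I[t] + w)
    return v

# Step input: positive feedback amplifies and sustains the depolarization,
# negative feedback produces a transient peak followed by a sag.
I = np.zeros(5000)
I[500:] = 1.0
v_pos = simulate(+1, I, tau_v=2.0, tau_w=50.0)
v_neg = simulate(-1, I, tau_v=2.0, tau_w=50.0)
```

With these parameters the positive-feedback neuron settles near twice the input amplitude, whereas the negative-feedback neuron peaks early and then decays toward a smaller steady state, mimicking the derivative-like response.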
Frontiers in Computational Neuroscience | 2013
Milad Lankarany; Wei-Ping Zhu; M.N.S. Swamy; Taro Toyoizumi
Time-varying excitatory and inhibitory synaptic inputs govern the activity of neurons and the processing of information in the brain. The importance of trial-to-trial fluctuations of synaptic inputs has recently been investigated in neuroscience. Such fluctuations are ignored in most conventional techniques because they are removed when trials are averaged during linear regression. Here, we propose a novel recursive algorithm based on Gaussian mixture Kalman filtering (GMKF) for estimating time-varying excitatory and inhibitory synaptic inputs from single trials of noisy membrane potential in current clamp recordings. The Kalman filtering is followed by an expectation maximization (EM) algorithm to infer the statistical parameters (time-varying mean and variance) of the synaptic inputs in a non-parametric manner. As the algorithm is repeated recursively, the inferred parameters of the mixtures are used to initiate the next iteration. Unlike other recent algorithms, ours does not assume an a priori distribution from which the synaptic inputs are generated. Instead, it recursively estimates such a distribution by fitting a Gaussian mixture model (GMM). The performance of the proposed algorithm is compared to that of a previously proposed particle-filter-based algorithm (Paninski et al., 2012) with several illustrative examples, assuming that the distribution of synaptic input is unknown. When noise is small, the performance of our algorithm is similar to that of the previous one; when noise is large, however, it significantly outperforms the previous proposal. These promising results suggest that our algorithm is a robust and efficient technique for estimating time-varying excitatory and inhibitory synaptic conductances from single trials of membrane potential recordings.
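At its core, the GMKF recursion builds on the standard Kalman predict/update cycle. A minimal scalar sketch, assuming a random-walk state model rather than the paper's conductance-based model, shows the cycle that the Gaussian-mixture extension generalizes:

```python
import numpy as np

def kalman_track(y, q=1e-3, r=0.09):
    """Scalar random-walk Kalman filter: the basic recursion that GMKF
    extends with a Gaussian-mixture prior over synaptic inputs.
    q: process-noise variance, r: observation-noise variance (assumed known)."""
    x, p = 0.0, 1.0
    out = np.empty(len(y))
    for t, obs in enumerate(y):
        p += q                    # predict: random-walk state, variance grows
        k = p / (p + r)           # Kalman gain
        x += k * (obs - x)        # update toward the observation
        p *= (1.0 - k)            # posterior variance shrinks
        out[t] = x
    return out

# Denoising a slowly varying "input" from a noisy single-trial trace.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 500)
true = np.sin(2 * np.pi * t)
y = true + 0.3 * rng.standard_normal(500)
est = kalman_track(y)
```

Filtering reduces the mean squared error relative to the raw observations; the GMKF replaces the single Gaussian posterior here with a mixture, updated via EM.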
Neurocomputing | 2014
Milad Lankarany; Wei-Ping Zhu; M.N.S. Swamy
Fitting biophysical models to real noisy data jointly with extracting fundamental biophysical parameters has recently stimulated tremendous interest in computational neuroscience. The Hodgkin–Huxley (HH) neuronal model is considered the most detailed biophysical model for representing the dynamical behavior of spiking neurons. The unscented Kalman filter (UKF) has already been applied to track the dynamics of the HH model. In this paper, we extend the existing Kalman filtering (KF) technique for the HH model to another version, namely, extended Kalman filtering (EKF). Two estimation strategies of the KF, dual and joint, are derived for simultaneously tracking the hidden dynamics and estimating the unknown parameters of a single neuron, leading to four KF algorithms: joint UKF (JUKF), dual UKF (DUKF), joint EKF (JEKF) and dual EKF (DEKF). Detailed derivations of these four methods and extensive simulation studies are provided. Our main contribution is the derivation of the EKF-based methods, JEKF and DEKF, for tracking the states and estimating the parameters of the HH neuronal model. Since these EKF-based methods are faster than the UKF-based versions, they are particularly suitable for the dynamic clamp technique, which connects artificial and biological neurons in order to assess the function of neuronal circuits.
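The joint-estimation idea can be sketched on a one-dimensional linear system instead of the full HH model: the unknown parameter is appended to the state vector and the EKF linearizes around the current estimate. Everything below (the model x[t+1] = a·x[t] + u[t], the noise levels, the function name) is an illustrative assumption, not the paper's derivation:

```python
import numpy as np

def joint_ekf(y, u, q=1e-5, r=0.01):
    """Joint EKF sketch: the state is augmented with an unknown parameter a
    of the dynamics x[t+1] = a*x[t] + u[t] (a stand-in for HH parameters),
    observed as y[t] = x[t] + noise."""
    z = np.array([0.0, 0.5])                 # augmented state [x, a], a guessed
    P = np.eye(2)
    H = np.array([[1.0, 0.0]])               # only x is observed
    Q = np.diag([q, q])
    for t in range(len(y)):
        x, a = z
        z = np.array([a * x + u[t], a])      # predict (a is constant)
        F = np.array([[a, x], [0.0, 1.0]])   # Jacobian of the dynamics
        P = F @ P @ F.T + Q
        S = (H @ P @ H.T + r).item()
        K = P @ H.T / S                      # Kalman gain, shape (2, 1)
        z = z + K.ravel() * (y[t] - z[0])
        P = P - K @ H @ P
    return z

# Recover a=0.9 from a driven, noisily observed trajectory.
rng = np.random.default_rng(1)
u = rng.standard_normal(500)
x, y = 0.0, np.empty(500)
for t in range(500):
    x = 0.9 * x + u[t]
    y[t] = x + 0.1 * rng.standard_normal()
z_final = joint_ekf(y, u)
```

The dual variants run two coupled filters (one for states, one for parameters) instead of one augmented filter, trading accuracy for modularity.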
international symposium on circuits and systems | 2013
Milad Lankarany; Wei-Ping Zhu; M.N.S. Swamy
Fitting biophysical models to real noisy data jointly with extracting fundamental biophysical parameters has recently stimulated tremendous interest in computational neuroscience. The Hodgkin-Huxley (HH) neuronal model is considered the most detailed biophysical model for representing the dynamical behavior of spiking neurons. In this paper, we derive, for the first time, the dual extended Kalman filtering (DEKF) approach for the HH neuronal model to track the dynamics and estimate the parameters of a single neuron from noisy recordings of membrane voltage. As the unscented Kalman filter (UKF) has already been applied to the HH model, a quantitative comparison between the two methods is carried out in our simulations for different signal-to-observation-noise ratios. Our simulations demonstrate the high accuracy of DEKF in the prediction and estimation of the hidden states and unknown parameters of the HH neuronal model. The faster implementation of DEKF relative to UKF makes it particularly useful in the dynamic clamp technique.
Cerebral Cortex | 2016
Ayah Khubieh; Stéphanie Ratté; Milad Lankarany; Steven A. Prescott
The cortex encodes a broad range of inputs. This breadth of operation requires sensitivity to weak inputs yet non-saturating responses to strong inputs. If individual pyramidal neurons were to have a narrow dynamic range, as previously claimed, then staggered all-or-none recruitment of those neurons would be necessary for the population to achieve a broad dynamic range. Contrary to this explanation, we show here through dynamic clamp experiments in vitro and computer simulations that pyramidal neurons have a broad dynamic range under the noisy conditions that exist in the intact brain due to background synaptic input. Feedforward inhibition capitalizes on those noise effects to control neuronal gain and thereby regulates the population dynamic range. Importantly, noise allows neurons to be recruited gradually and occludes the staggered recruitment previously attributed to heterogeneous excitation. Feedforward inhibition protects spike timing against the disruptive effects of noise, meaning noise can enable the gain control required for rate coding without compromising the precise spike timing required for temporal coding.
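The gain-smoothing effect of background noise can be reproduced with a minimal leaky integrate-and-fire simulation (an illustrative stand-in for the dynamic clamp experiments; all parameters are assumptions): a subthreshold mean input evokes no spikes without noise but a graded response with it.

```python
import numpy as np

def lif_spike_count(i_mean, noise_sd, rng, n_steps=20000, dt=0.1, tau=10.0, vth=1.0):
    """Leaky integrate-and-fire neuron; background synaptic input is mimicked
    by additive Gaussian current noise. Returns the number of spikes."""
    v, n_spikes = 0.0, 0
    for _ in range(n_steps):
        v += dt / tau * (-v + i_mean) \
             + np.sqrt(dt) / tau * noise_sd * rng.standard_normal()
        if v >= vth:              # threshold crossing -> spike and reset
            n_spikes += 1
            v = 0.0
    return n_spikes

# A mean drive of 0.8 is subthreshold (steady state < vth = 1):
rng = np.random.default_rng(0)
quiet = lif_spike_count(0.8, 0.0, rng)   # deterministic: never fires
noisy = lif_spike_count(0.8, 1.0, rng)   # noise recruits the neuron gradually
```

Sweeping `i_mean` with and without noise traces out the hard-threshold versus smoothed f-I curves that underlie the broadened dynamic range.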
Frontiers in Computational Neuroscience | 2016
Milad Lankarany; Jaime E. Heiss; Ilan Lampl; Taro Toyoizumi
Advanced statistical methods have enabled trial-by-trial inference of the underlying excitatory and inhibitory synaptic conductances (SCs) from membrane-potential recordings. Simultaneous inference of both excitatory and inhibitory SCs sheds light on the neural circuits underlying neural activity and advances our understanding of neural information processing. Conventional Bayesian methods can infer excitatory and inhibitory SCs from a single trial of observed membrane potential. However, when multiple recorded trials are available, single-trial estimation is typically suboptimal because it neglects the statistics of the synaptic inputs (SIs) that are common across trials. Here, we establish a new expectation maximization (EM) algorithm that improves such single-trial Bayesian methods by exploiting multiple recorded trials to extract the common SI statistics. The proposed EM algorithm embeds parallel Kalman filters or particle filters for the recorded trials and integrates their outputs to iteratively update the common SI statistics. These statistics are then used to infer the excitatory and inhibitory SCs of individual trials. We demonstrate the superior performance of multiple-trial Kalman filtering (MtKF) and particle filtering (MtPF) relative to that of the corresponding single-trial methods. While the relative estimation error of excitatory and inhibitory SCs is known to depend on the level of current injection into a cell, our numerical simulations using MtKF show that both excitatory and inhibitory SCs are reliably inferred using an optimal level of current injection. Finally, we validate the robustness and applicability of our technique through simulation studies, and we apply MtKF to in vivo data recorded from the rat barrel cortex.
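The multi-trial idea can be sketched with parallel scalar Kalman filters whose predictions are shrunk toward a shared across-trial mean that is refreshed each iteration (a crude stand-in for the paper's EM updates of the common SI statistics; the 0.5 shrinkage weight and all other parameters are assumptions):

```python
import numpy as np

def kf(y, prior=None, q=1e-3, r=0.1):
    """Scalar Kalman filter; if a shared prior trace is given, the prediction
    is pulled halfway toward it (stand-in for common across-trial statistics)."""
    x, p = 0.0, 1.0
    out = np.empty(len(y))
    for t in range(len(y)):
        if prior is not None:
            x = 0.5 * x + 0.5 * prior[t]   # shrink toward shared mean
        p += q
        k = p / (p + r)
        x += k * (y[t] - x)
        p *= (1.0 - k)
        out[t] = x
    return out

def multi_trial_kf(Y, n_iter=3, **kw):
    """EM-style loop: filter each trial in parallel, then refresh the shared
    mean (the 'common statistics') from the filtered outputs."""
    mean = Y.mean(axis=0)                  # initialize shared statistics
    for _ in range(n_iter):
        X = np.array([kf(y, prior=mean, **kw) for y in Y])
        mean = X.mean(axis=0)              # M-step: update common mean
    return mean

# Ten noisy trials of the same underlying slow input.
rng = np.random.default_rng(3)
t = np.linspace(0.0, 1.0, 400)
true = np.sin(2 * np.pi * t)
Y = true + 0.4 * rng.standard_normal((10, 400))
pooled = multi_trial_kf(Y)
single = kf(Y[0])
```

Pooling across trials suppresses trial-specific noise that no single-trial filter can average away.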
international conference of the ieee engineering in medicine and biology society | 2013
Milad Lankarany; Wei-Ping Zhu; M.N.S. Swamy; Taro Toyoizumi
A neuron transforms information via a complex interaction between its previous states, its intrinsic properties, and the synaptic input it receives from other neurons. Inferring the synaptic input of a neuron solely from its membrane potential (output), which contains both sub-threshold activity and action potentials, can effectively elucidate its information-processing mechanism. The term blind deconvolution of the Hodgkin-Huxley (HH) neuronal model is coined, for the first time in this paper, to address the problem of reconstructing the hidden dynamics and synaptic input of a single neuron modeled by the HH model, as well as estimating its intrinsic parameters, from a single trace of noisy membrane potential. The blind deconvolution is accomplished via a recursive algorithm whose iterations consist of an extended Kalman filter followed by the expectation maximization (EM) algorithm. The accuracy and robustness of the proposed algorithm are demonstrated by our simulations. These capabilities make the proposed algorithm particularly useful for understanding the neural coding mechanism of a neuron.
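The state-augmentation idea behind such input reconstruction can be sketched on a linear system (a stand-in for the HH model, so a plain Kalman filter replaces the EKF; the dynamics, noise levels, and function name are illustrative assumptions): the unknown input is modeled as a random-walk state and estimated alongside the voltage-like state.

```python
import numpy as np

def deconvolve(y, a=0.9, q_u=0.05, r=0.0025):
    """Estimate an unknown input u[t] driving x[t+1] = a*x[t] + u[t] from
    noisy observations y[t] = x[t] + noise, by augmenting the state with u
    (modeled as a random walk) and running a Kalman filter."""
    F = np.array([[a, 1.0], [0.0, 1.0]])     # [x, u] dynamics; u random walk
    H = np.array([[1.0, 0.0]])               # only x is observed
    Q = np.diag([1e-6, q_u])
    z, P = np.zeros(2), np.eye(2)
    u_hat = np.empty(len(y))
    for t in range(len(y)):
        z = F @ z
        P = F @ P @ F.T + Q
        S = (H @ P @ H.T + r).item()
        K = P @ H.T / S
        z = z + K.ravel() * (y[t] - z[0])
        P = P - K @ H @ P
        u_hat[t] = z[1]                      # current input estimate
    return u_hat

# Step input: u switches from 0 to 1 midway through the recording.
rng = np.random.default_rng(4)
u = np.concatenate([np.zeros(200), np.ones(300)])
x, y = 0.0, np.empty(500)
for t in range(500):
    x = 0.9 * x + u[t]
    y[t] = x + 0.05 * rng.standard_normal()
u_hat = deconvolve(y)
```

The paper's recursive algorithm wraps this filtering step in an EM loop that also re-estimates the model parameters, which the linear sketch takes as known.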
information sciences, signal processing and their applications | 2010
Milad Lankarany; Mohammad Hasan Savoji
In this paper, we model the voiced speech signal as an AR process with an AR filter whose coefficients are obtained using a new iterative model-based algorithm. In the proposed iterative algorithm, the Liljencrants-Fant (LF) model of the glottal flow is fitted, at each iteration, to the glottal derivative waveform extracted by closed-phase inverse filtering. Taking this signal as the desired output of an adaptive filter excited by speech, the inverse of the AR filter is calculated using a normalized LMS algorithm. The mean square error between the resulting residual and the LF model is consequently minimized. The next iteration begins by fitting a new LF model to the residual signal obtained by filtering the speech with the updated filter. A new estimate of the glottal flow derivative waveform is therefore obtained at each iteration. The algorithm stops when no considerable changes occur in the glottal flow derivative between two consecutive iterations. Finally, glottal flow estimates of the real voiced speech sounds /a/ and /e/ are given as examples.
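The adaptive-filtering step can be sketched with a generic normalized LMS system-identification example (the FIR setting, tap count, and step size are illustrative; the paper applies NLMS to speech and the LF residual):

```python
import numpy as np

def nlms(x, d, n_taps=4, mu=0.5, eps=1e-8):
    """Normalized LMS: adapt FIR weights w so that the filter output w*x
    tracks the desired signal d, as in the paper's inverse-filtering step."""
    w = np.zeros(n_taps)
    for t in range(n_taps - 1, len(x)):
        u = x[t - n_taps + 1:t + 1][::-1]     # [x[t], x[t-1], ...]
        e = d[t] - w @ u                      # a-priori error
        w += mu * e * u / (eps + u @ u)       # step normalized by input power
    return w

# Identify a known 4-tap FIR system from input/output data.
rng = np.random.default_rng(2)
x = rng.standard_normal(2000)
h = np.array([0.5, -0.3, 0.2, 0.1])
d = np.array([h @ x[t - 3:t + 1][::-1] if t >= 3 else 0.0
              for t in range(len(x))])
w = nlms(x, d)
```

Normalizing by the instantaneous input power `u @ u` makes convergence insensitive to the signal level, which matters for nonstationary signals such as speech.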
BMC Neuroscience | 2015
Milad Lankarany; Steven A. Prescott
The prodigious capacity of our brain to process information relies on efficient neural coding strategies. In engineered systems, bandwidth is often increased through multiplexing, in which multiple signals are simultaneously, yet independently, transmitted through a single communication channel. We have previously proposed that neural systems might implement the same sort of solution [1]. Here, we tested whether and how multiplexed coding could be achieved through combined rate and temporal coding. We hypothesized that a set of neurons could independently encode two signals by using asynchronous spike rate to encode one signal and synchronous spike timing to encode the other. To test this hypothesis, we built a feed-forward neural network comprising Morris-Lecar (ML) model neurons. All neurons received a common input constructed from two distinct signals, slow and fast, plus uncorrelated fast noise. According to our hypothesis, the slow and fast signals are independently encoded by different types of spikes; in other words, differentially correlated output spikes, namely asynchronous (Async) and synchronous (Sync) spikes, enable encoding of the slow and fast signals, respectively. To assess the feasibility of this multiplexed coding scheme, recorded spikes were classified into two independent classes based on the peristimulus time histogram (PSTH) calculated from the entire set of neurons: spikes whose instantaneous rates exceeded a threshold were designated "Sync" and all others were designated "Async". The spike-triggered average (STA) was calculated for the slow and fast signals using the Sync and Async spikes, resulting in four different STAs. The Async-slow and Sync-fast STAs were clearly structured whereas the other two were not (Figure 1A). ...
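The Sync/Async classification rule can be sketched on synthetic spike trains (the smoothing kernel, threshold rule, and all parameters are illustrative assumptions): spikes are labeled synchronous whenever the smoothed population PSTH crosses a threshold.

```python
import numpy as np

def split_sync_async(spike_trains, sigma=2, z=2.0):
    """Classify spikes as Sync when the Gaussian-smoothed population rate
    (PSTH) exceeds mean + z*std, and Async otherwise.
    spike_trains: (n_neurons, n_bins) binary array."""
    psth = spike_trains.sum(axis=0).astype(float)
    taps = np.arange(-3 * sigma, 3 * sigma + 1)
    kernel = np.exp(-0.5 * (taps / sigma) ** 2)
    rate = np.convolve(psth, kernel / kernel.sum(), mode="same")
    sync_mask = rate > rate.mean() + z * rate.std()
    sync = spike_trains * sync_mask          # spikes in high-rate bins
    async_ = spike_trains * ~sync_mask       # all remaining spikes
    return sync, async_

# 20 neurons, sparse background firing, plus two population-wide sync events.
rng = np.random.default_rng(5)
trains = (rng.random((20, 1000)) < 0.02).astype(int)
trains[:, 100] = 1
trains[:, 500] = 1
sync, async_ = split_sync_async(trains)
```

Computing STAs separately from the `sync` and `async_` spikes (against the fast and slow signals, respectively) reproduces the four-STA analysis described above.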
BMC Neuroscience | 2013
Milad Lankarany; Wei-Ping Zhu; M.N.S. Swamy; Taro Toyoizumi
The interaction of excitatory and inhibitory synaptic inputs shapes receptive fields and can elucidate the synaptic mechanisms underlying the functional activities of neurons. Estimating trial-to-trial excitatory and inhibitory synaptic conductances from noisy observations of membrane potential or input current can reveal the drivers of neurons and plays an important role in our understanding of information processing in neuronal circuits. Although recent studies have introduced statistical methods that estimate trial-to-trial variation of synaptic conductance [1,2], most previous works use the well-known least squares (LS) method to estimate the excitatory and inhibitory synaptic conductances from the trial-mean of recorded traces of membrane potential or input current [3-5]. We first show analytically that the LS method is not only unable to capture trial-to-trial variation of synaptic conductance but also provides biased estimates of the synaptic conductances and the excitatory/inhibitory covariance when the fluctuations of synaptic conductance and membrane potential are correlated. Next, we propose a novel method based on Gaussian mixture Kalman filtering (GMKF) that not only overcomes these limitations of the LS method but also enables trial-to-trial estimation of the excitatory and inhibitory synaptic conductances. We show that our proposal requires fewer assumptions than the recent proposals [1,2] that also provide trial-to-trial estimation of synaptic conductance; in particular, the proposed technique outperforms [1] by being able to estimate an unknown synaptic input distribution using a Gaussian mixture model (GMM). We believe that these findings have a significant influence on our understanding of the balance of excitatory and inhibitory synaptic input and the underlying cortical circuitry.
Figure 1. Estimation of excitatory and inhibitory synaptic conductances using the LS voltage-clamp method (left) and GMKF (right). The LS method cannot estimate trial-to-trial synaptic conductances (10 trials, each lasting 1 s, were used to provide data for both methods). True values ...
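The voltage-clamp decomposition that LS-style methods rely on can be sketched per time bin: currents recorded at two holding potentials determine g_e and g_i by solving a 2x2 linear system (the reversal potentials and holding voltages below are illustrative assumptions):

```python
import numpy as np

def ls_conductances(i1, i2, v1=-70.0, v2=0.0, Ee=0.0, Ei=-80.0):
    """Decompose currents recorded at two holding potentials v1, v2 into
    excitatory and inhibitory conductance traces by solving, per time bin,
    I = g_e*(V - Ee) + g_i*(V - Ei)."""
    A = np.array([[v1 - Ee, v1 - Ei],
                  [v2 - Ee, v2 - Ei]])
    g = np.linalg.solve(A, np.vstack([i1, i2]))   # solve for all bins at once
    return g[0], g[1]                             # g_e(t), g_i(t)

# Synthetic conductance traces and the currents they would produce.
t = np.linspace(0.0, 1.0, 200)
ge = 0.3 + 0.2 * np.sin(2 * np.pi * t)
gi = 0.5 + 0.3 * np.cos(2 * np.pi * t)
v1, v2, Ee, Ei = -70.0, 0.0, 0.0, -80.0
i1 = ge * (v1 - Ee) + gi * (v1 - Ei)
i2 = ge * (v2 - Ee) + gi * (v2 - Ei)
ge_hat, gi_hat = ls_conductances(i1, i2)
```

This recovers only the trial-mean conductances given averaged currents, which is exactly the limitation the GMKF approach above is designed to overcome.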