Featured Research

Neurons And Cognition

CalciumGAN: A Generative Adversarial Network Model for Synthesising Realistic Calcium Imaging Data of Neuronal Populations

Calcium imaging has become a powerful and popular technique to monitor the activity of large populations of neurons in vivo. However, owing to ethical considerations and despite recent technical developments, recordings are still constrained to a limited number of trials and animals. This limits the amount of data available from individual experiments and hinders the development of analysis techniques and models for more realistic sizes of neuronal populations. The ability to artificially synthesize realistic neuronal calcium signals could greatly alleviate this problem by scaling up the number of trials. Here we propose a Generative Adversarial Network (GAN) model to generate realistic calcium signals as seen in neuronal somata with calcium imaging. To this end, we adapt the WaveGAN architecture and train it with the Wasserstein distance. We test the model on artificial data with known ground-truth and show that the distribution of the generated signals closely resembles the underlying data distribution. Then, we train the model on real calcium signals recorded from the primary visual cortex of behaving mice and confirm that the deconvolved spike trains match the statistics of the recorded data. Together, these results demonstrate that our model can successfully generate realistic calcium imaging data, thereby providing the means to augment existing datasets of neuronal activity for enhanced data exploration and modeling.
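The evaluation criterion here, that the distribution of generated signals should match the real data distribution, can be illustrated with the empirical 1-D Wasserstein distance. This is a minimal sketch, not the authors' code: the gamma-distributed "real" amplitudes are an invented stand-in for calcium transient statistics, and `wasserstein_1d` is our own helper name.

```python
import numpy as np

def wasserstein_1d(a, b):
    """Empirical Wasserstein-1 distance between two equal-size 1-D samples.

    For sorted equal-size samples, W1 reduces to the mean absolute
    difference between corresponding order statistics.
    """
    a, b = np.sort(np.asarray(a)), np.sort(np.asarray(b))
    assert a.shape == b.shape
    return float(np.mean(np.abs(a - b)))

rng = np.random.default_rng(0)
real = rng.gamma(shape=2.0, scale=1.0, size=10_000)       # stand-in for real dF/F amplitudes
fake_good = rng.gamma(shape=2.0, scale=1.0, size=10_000)  # "generator" matching the distribution
fake_bad = rng.normal(5.0, 1.0, size=10_000)              # "generator" that misses it

print(wasserstein_1d(real, fake_good))  # small: distributions match
print(wasserstein_1d(real, fake_bad))   # large: distributions differ
```

A well-trained generator should drive this distance toward its value between two independent real samples.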

Neurons And Cognition

Can Single Neurons Solve MNIST? The Computational Power of Biological Dendritic Trees

Physiological experiments have highlighted how the dendrites of biological neurons can nonlinearly process distributed synaptic inputs. This is in stark contrast to units in artificial neural networks that are generally linear apart from an output nonlinearity. If dendritic trees can be nonlinear, biological neurons may have far more computational power than their artificial counterparts. Here we use a simple model where the dendrite is implemented as a sequence of thresholded linear units. We find that such dendrites can readily solve machine learning problems, such as MNIST or CIFAR-10, and that they benefit from having the same input onto several branches of the dendritic tree. This dendrite model is a special case of a sparse network. This work suggests that popular neuron models may severely underestimate the computational power enabled by the biological fact of nonlinear dendrites and multiple synapses per pair of neurons. The next generation of artificial neural networks may significantly benefit from these biologically inspired dendritic architectures.
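The core idea, a single neuron whose branches each apply a thresholded linear unit to the same input before the soma sums them, can be sketched as follows. This is an illustrative toy, not the paper's architecture; the branch count, weight scaling, and `dendritic_unit` helper are our assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def dendritic_unit(x, branch_w, soma_w, theta=0.0):
    """A single "neuron" with nonlinear dendritic branches.

    Each branch applies a thresholded-linear (ReLU-like) unit to its own
    weighted view of the shared input; the soma linearly sums the branches.
    """
    branches = np.maximum(branch_w @ x - theta, 0.0)  # per-branch nonlinearity
    return float(soma_w @ branches)

n_in, n_branch = 784, 16                              # MNIST-sized input, 16 branches
branch_w = rng.normal(size=(n_branch, n_in)) / np.sqrt(n_in)
soma_w = rng.normal(size=n_branch)
x = rng.normal(size=n_in)                             # stand-in for a flattened image
y = dendritic_unit(x, branch_w, soma_w)
```

Because every branch sees the same input (the "multiple synapses per pair" motif), the unit is effectively a small two-layer network with shared input, which is what lets one model neuron tackle tasks like MNIST.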

Neurons And Cognition

Causality in cognitive neuroscience: concepts, challenges, and distributional robustness

While probabilistic models describe the dependence structure between observed variables, causal models go one step further: they predict, for example, how cognitive functions are affected by external interventions that perturb neuronal activity. In this review and perspective article, we introduce the concept of causality in the context of cognitive neuroscience and review existing methods for inferring causal relationships from data. Causal inference is an ambitious task that is particularly challenging in cognitive neuroscience. We discuss two difficulties in more detail: the scarcity of interventional data and the challenge of finding the right variables. We argue for distributional robustness as a guiding principle to tackle these problems. Robustness (or invariance) is a fundamental principle underlying causal methodology. A causal model of a target variable generalises across environments or subjects as long as these environments leave the causal mechanisms intact. Consequently, if a candidate model does not generalise, then either it does not consist of the target variable's causes or the underlying variables do not represent the correct granularity of the problem. In this sense, assessing generalisability may be useful when defining relevant variables and can be used to partially compensate for the lack of interventional data.
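The invariance argument can be made concrete with a toy structural model: when two environments intervene only on the cause X, the regression of the effect Y on X is stable across environments, while the anticausal regression of X on Y is not. This simulation is our own illustrative construction, not an example from the article.

```python
import numpy as np

rng = np.random.default_rng(2)

def slope(x, y):
    """OLS slope of y regressed on x."""
    c = np.cov(x, y)
    return float(c[0, 1] / c[0, 0])

def simulate(var_x, n=100_000):
    """Toy SCM: X -> Y with Y = 2*X + noise; the environment sets Var(X)."""
    x = rng.normal(0, np.sqrt(var_x), n)
    y = 2 * x + rng.normal(0, 1, n)
    return x, y

# Two environments that intervene only on X (the cause of Y).
(x1, y1), (x2, y2) = simulate(1.0), simulate(4.0)

causal = [slope(x1, y1), slope(x2, y2)]      # Y ~ X: stable near 2 in both
anticausal = [slope(y1, x1), slope(y2, x2)]  # X ~ Y: shifts with the environment
```

A candidate model whose coefficients fail to generalise across such environments is flagged as either non-causal or built on the wrong variables, which is the diagnostic use of robustness the article argues for.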

Neurons And Cognition

Cerebral cortical communication overshadows computational energy-use, but these combine to predict synapse number

Darwinian evolution tends to produce energy-efficient outcomes. On the other hand, energy limits computation, be it neural and probabilistic or digital and logical. Taking a particular energy-efficient viewpoint, we define neural computation and make use of an energy-constrained, computational function. This function can be optimized over a variable that is proportional to the number of synapses per neuron. This function also implies a specific distinction between ATP-consuming processes, especially computation per se vs the communication processes including action potentials and transmitter release. Thus to apply this mathematical function requires an energy audit with a partitioning of energy consumption that differs from earlier work. The audit points out that, rather than the oft-quoted 20 watts of glucose available to the brain (Sokoloff 1960; Sawada 2013), the fraction partitioned to cortical computation is only 0.1 watts of ATP. On the other hand, at 3.5 watts, long-distance communication costs are 35-fold greater. Other novel quantifications include (i) a finding that the biological vs ideal values of neural computational efficiency differ by a factor of 10^8 and (ii) two predictions of N, the number of synaptic transmissions needed to fire a neuron (2500 vs 2000).
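The headline numbers of the audit can be checked with simple bookkeeping. The figures below are taken directly from the abstract; the partitioning itself is the paper's and is not re-derived here.

```python
# Energy-audit arithmetic using the abstract's quoted figures.
total_glucose_w = 20.0    # oft-quoted whole-brain glucose budget (W)
computation_w = 0.1       # ATP fraction attributed to cortical computation per se (W)
communication_w = 3.5     # long-distance communication: spikes, transmitter release (W)

ratio = communication_w / computation_w          # the quoted 35-fold gap
computation_share = computation_w / total_glucose_w  # 0.5% of the popular 20 W figure
print(ratio, computation_share)
```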

Neurons And Cognition

Chaos may enhance expressivity in cerebellar granular layer

Recent evidence suggests that Golgi cells in the cerebellar granular layer are densely connected to each other with massive gap junctions. Here, we propose that the massive gap junctions between the Golgi cells contribute to the representational complexity of the granular layer of the cerebellum by inducing chaotic dynamics. We construct a model of the cerebellar granular layer with diffusion coupling through gap junctions between the Golgi cells, and evaluate the representational capability of the network with the reservoir computing framework. First, we show that the chaotic dynamics induced by diffusion coupling result in complex output patterns containing a wide range of frequency components. Second, the long non-recursive time series of the reservoir represents the passage of time following an external input. These properties of the reservoir enable mapping different spatial inputs into different temporal patterns.
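A reservoir with diffusive (gap-junction-like) coupling can be sketched as a standard echo-state update plus a discrete Laplacian term. This is a generic stand-in, not the paper's granular-layer model: the ring topology, unit count, and coupling strength are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

n = 50                                        # number of Golgi-cell-like units
W = rng.normal(0, 1 / np.sqrt(n), (n, n))     # random recurrent weights
w_in = rng.normal(0, 1, n)                    # input weights
# Diffusion coupling on a ring: discrete Laplacian (nearest-neighbour).
L = -2 * np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1)
L[0, -1] = L[-1, 0] = 1.0

def step(x, u, d=0.5):
    """One reservoir update with diffusive coupling of strength d."""
    return np.tanh(W @ x + w_in * u + d * (L @ x))

x = np.zeros(n)
states = []
for t in range(200):
    u = 1.0 if t == 0 else 0.0                # brief pulse; the ensuing transient
    x = step(x, u)                            # encodes time since the input
    states.append(x.copy())
states = np.array(states)
```

In the reservoir computing framework, a linear readout trained on `states` would then map the single spatial input pulse onto different temporal patterns.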

Neurons And Cognition

Chaos stabilizes synchronization in systems of coupled inner-ear hair cells

Hair cells of the auditory and vestibular systems display astonishing sensitivity, frequency selectivity, and temporal resolution in their responses to external signals. These specialized cells utilize an internal active amplifier to achieve highly sensitive mechanical detection. One of the manifestations of this active process is the occurrence of spontaneous limit-cycle motion of the hair cell bundle. As hair bundles under in vivo conditions are typically coupled to each other by overlying structures, we explore the role of this coupling in the dynamics of the system, using a combination of theoretical and experimental approaches. Our numerical model suggests that the presence of chaotic dynamics in the response of individual bundles enhances their ability to synchronize when coupled, resulting in significant improvement in the system's ability to detect weak signals. This synchronization persists even for a large frequency dispersion and a large number of oscillators comprising the system. Further, the amplitude and coherence of the active motion is not reduced upon increasing the number of oscillators. Using artificial membranes, we impose mechanical coupling on groups of live and functional hair bundles, selected from in vitro preparations of the sensory epithelium, allowing us to explore the role of coupling experimentally. Consistent with the numerical simulations of the chaotic system, synchronization occurs even for large frequency dispersion and a large number of hair cells. Further, the amplitude and coherence of the spontaneous oscillations are independent of the number of hair cells in the network. We therefore propose that hair cells utilize their chaotic dynamics to stabilize the synchronized state and avoid the amplitude death regime, resulting in collective coherent motion that could play a role in generating spontaneous otoacoustic emissions and an enhanced ability to detect weak signals.
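Synchronization of coupled oscillators with frequency dispersion can be illustrated with a Kuramoto-type phase model. This is a deliberately generic stand-in for the hair-bundle dynamics (no chaos, no amplitude variable), used only to show how the order parameter quantifies synchrony under dispersion.

```python
import numpy as np

rng = np.random.default_rng(4)

def order_parameter(theta):
    """Kuramoto order parameter |<e^{i theta}>|: 1 means full phase synchrony."""
    return float(np.abs(np.mean(np.exp(1j * theta))))

def simulate(k, n=64, steps=4000, dt=0.01):
    """Phase oscillators with dispersed frequencies and mean-field coupling k."""
    omega = rng.normal(1.0, 0.3, n)              # frequency dispersion
    theta = rng.uniform(0, 2 * np.pi, n)
    for _ in range(steps):
        # d(theta_i)/dt = omega_i + k * mean_j sin(theta_j - theta_i)
        mean_field = np.mean(np.sin(theta[None, :] - theta[:, None]), axis=1)
        theta = theta + dt * (omega + k * mean_field)
    return order_parameter(theta)

r_uncoupled = simulate(k=0.0)   # phases drift apart: low order parameter
r_coupled = simulate(k=2.0)     # strong coupling: near-synchronized phases
```

The abstract's stronger claim is that chaotic bundle dynamics further stabilize this synchronized state against dispersion and growing oscillator number, which a phase model alone cannot capture.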

Neurons And Cognition

Characterising Alzheimer's Disease with EEG-based Energy Landscape Analysis

Alzheimer's disease (AD) is one of the most common neurodegenerative diseases, with around 50 million patients worldwide. Accessible and non-invasive methods of diagnosing and characterising AD are therefore urgently required. Electroencephalography (EEG) fulfils these criteria and is often used when studying AD. Several features derived from EEG, e.g. signal complexity and synchronisation, have been shown to predict AD with high accuracy. However, the dynamics of how the brain transitions between stable states have not been properly studied in the case of AD and EEG data. Energy landscape analysis is a method that can be used to quantify these dynamics. This work presents the first application of this method to both AD and EEG. An energy landscape assigns an energy value to each possible state, i.e. each pattern of activations across brain regions. The energy is inversely related to the probability of occurrence. By studying the features of energy landscapes of 20 AD patients and 20 healthy age-matched counterparts, significant differences were found. The dynamics of AD patients' brain networks were shown to be more constrained, with more local minima, less variation in basin size, and smaller basins. We show that energy landscapes can predict AD with high accuracy, performing significantly better than baseline models.
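The basic construction of an empirical energy landscape, assigning each binarized activation pattern an energy that decreases with its probability (commonly E = -log p), can be sketched as below. The toy 3-region "EEG" data and the helper name are our assumptions; the paper works with real recordings and a fitted pairwise model.

```python
import numpy as np

rng = np.random.default_rng(5)

def empirical_energies(binary_states):
    """Energy landscape from data: E(state) = -log p(state).

    `binary_states` is (n_samples, n_regions) with 0/1 activations.
    """
    states, counts = np.unique(binary_states, axis=0, return_counts=True)
    p = counts / counts.sum()
    return {tuple(s): -np.log(pi) for s, pi in zip(states, p)}

# Toy stand-in data: 3 regions with different activation probabilities.
data = (rng.random((5_000, 3)) < np.array([0.8, 0.2, 0.5])).astype(int)
energies = empirical_energies(data)
lowest = min(energies, key=energies.get)   # most probable state = lowest energy
```

Features such as the number of local minima and the sizes of their basins, the quantities the abstract compares between AD patients and controls, are then read off this landscape.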

Neurons And Cognition

Characterising the Non-Equilibrium Dynamics of a Neural Cell

We examine the dynamical evolution of the state of a neurone, with particular attention to the non-equilibrium nature of the forces influencing its movement in state space. We combine non-equilibrium statistical mechanics and dynamical systems theory to characterise the nature of the neural resting state, and its relationship to firing. The stereotypical shape of the action potential arises from this model, as well as bursting dynamics, and the non-equilibrium phase transition from resting to spiking. Geometric properties of the system are discussed, such as the birth and shape of the neural limit cycle, which provide a complementary understanding of these dynamics. This provides a multiscale model of the neural cell, from molecules to spikes, and explains various phenomena in a unified manner. Some more general notions for damped oscillators, birth-death processes, and stationary non-equilibrium systems are included.
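The transition from a resting fixed point to a spiking limit cycle as drive increases can be seen in any minimal excitable-neuron model; the FitzHugh-Nagumo equations below are a standard stand-in, not the authors' statistical-mechanical model, and the parameter values are the textbook defaults.

```python
import numpy as np

def fitzhugh_nagumo(i_ext, steps=20_000, dt=0.01, a=0.7, b=0.8, tau=12.5):
    """Euler integration of the FitzHugh-Nagumo model:
    dv/dt = v - v^3/3 - w + I,  dw/dt = (v + a - b*w) / tau."""
    v, w = -1.0, -0.5
    vs = []
    for _ in range(steps):
        v += dt * (v - v**3 / 3 - w + i_ext)
        w += dt * ((v + a - b * w) / tau)
        vs.append(v)
    return np.array(vs)

resting = fitzhugh_nagumo(i_ext=0.0)   # settles onto a stable fixed point
spiking = fitzhugh_nagumo(i_ext=0.5)   # drive past onset: a limit cycle is born
```

The birth of the limit cycle at a critical drive is the deterministic skeleton of the resting-to-spiking phase transition the abstract describes.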

Neurons And Cognition

Characterizing spreading dynamics of subsampled systems with non-stationary external input

Many systems with propagation dynamics, such as spike propagation in neural networks and spreading of infectious diseases, can be approximated by autoregressive models. The estimation of model parameters can be complicated by the experimental limitation that one observes only a fraction of the system (subsampling) and potentially time-dependent parameters, leading to incorrect estimates. We show analytically how to overcome the subsampling bias when estimating the propagation rate for systems with certain non-stationary external input. This approach is readily applicable to trial-based experimental setups and seasonal fluctuations, as demonstrated on spike recordings from monkey prefrontal cortex and spreading of norovirus and measles.
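The subsampling bias, and one way around it, can be demonstrated on a stationary AR(1) process. Here subsampling is modeled simply as independent observation noise; the lag-1 autocorrelation then underestimates the propagation rate m by a constant factor, while a ratio of lagged autocorrelations cancels that factor. The paper's estimator additionally handles non-stationary external input, which this sketch omits.

```python
import numpy as np

rng = np.random.default_rng(6)

m_true, n = 0.9, 200_000
noise = rng.normal(size=n)
a = np.zeros(n)
for t in range(1, n):                        # AR(1) propagation dynamics
    a[t] = m_true * a[t - 1] + noise[t]

obs = a + rng.normal(0, 3.0, n)              # subsampling as partial, noisy observation

def corr(x, lag):
    """Lag-k autocorrelation r_k = Cov(x_t, x_{t+k}) / Var(x_t)."""
    c = np.cov(x[:-lag], x[lag:])
    return c[0, 1] / c[0, 0]

m_naive = corr(obs, 1)                       # biased: r_1 = c * m with c < 1
m_corrected = corr(obs, 2) / corr(obs, 1)    # the factor c cancels: r_2 / r_1 = m
```

Since r_k = c * m^k for every lag k under this observation model, any fit of an exponential across lags recovers m despite observing only a fraction of the system.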

Neurons And Cognition

Classification of Schizophrenia from Functional MRI Using Large-scale Extended Granger Causality

The literature indicates that schizophrenia is associated with alterations in brain network connectivity. We investigate whether large-scale Extended Granger Causality (lsXGC) can capture such alterations using resting-state fMRI data. Our method utilizes dimension reduction combined with the augmentation of source time-series in a predictive time-series model for estimating directed causal relationships among fMRI time-series. The lsXGC is a multivariate approach since it identifies the relationship of the underlying dynamic system in the presence of all other time-series. Here lsXGC serves as a biomarker for classifying schizophrenia patients from typical controls using a subset of 62 subjects from the Centers of Biomedical Research Excellence (COBRE) data repository. We use brain connections estimated by lsXGC as features for classification. After feature extraction, we perform feature selection by Kendall's tau rank correlation coefficient followed by classification using a support vector machine. As a reference method, we compare our results with cross-correlation, typically used in the literature as a standard measure of functional connectivity. We cross-validate 100 different training/test (90%/10%) data splits to obtain mean accuracy and a mean Area Under the receiver operating characteristic Curve (AUC) across all tested numbers of features for lsXGC. Our results demonstrate a mean accuracy range of [0.767, 0.940] and a mean AUC range of [0.861, 0.983] for lsXGC. The results of lsXGC are significantly better than those obtained with cross-correlation, namely mean accuracy of [0.721, 0.751] and mean AUC of [0.744, 0.860]. Our results suggest the applicability of lsXGC as a potential biomarker for schizophrenia.
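The feature-selection step, ranking connectivity features by Kendall's tau against the class labels, can be sketched in plain numpy. The synthetic subject-by-feature matrix below is an invented stand-in for lsXGC connectivity estimates; only the ranking logic is illustrated, not the SVM or the cross-validation protocol.

```python
import numpy as np

rng = np.random.default_rng(7)

def kendall_tau(x, y):
    """Kendall's tau-a from concordant/discordant pairs (O(n^2); fine for small n)."""
    n = len(x)
    s = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            s += np.sign(x[i] - x[j]) * np.sign(y[i] - y[j])
    return s / (n * (n - 1) / 2)

# Toy stand-in for connectivity features: 40 subjects x 20 features,
# where only the first 3 features carry label information.
n_sub, n_feat = 40, 20
labels = np.repeat([0, 1], n_sub // 2)
X = rng.normal(size=(n_sub, n_feat))
X[:, :3] += labels[:, None] * 1.5

taus = np.array([abs(kendall_tau(X[:, f], labels)) for f in range(n_feat)])
top = np.argsort(taus)[::-1][:3]        # features ranked by |tau|, top 3 kept
```

The selected features would then be passed to a classifier (a support vector machine in the paper) inside each cross-validation split.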

