Featured Researches

Neurons And Cognition

Efficient Coding in the Economics of Human Brain Connectomics

In systems neuroscience, most models posit that brain regions communicate information under constraints of efficiency. Yet how metabolic efficiency and information-transfer efficiency trade off across structural networks is not understood. In a large cohort of youth, we find metabolic costs associated with the structural path strengths that support information diffusion. Metabolic demand is balanced against the coupling between diffusion-supporting structures and network modularity. To understand efficient network communication, we develop a theory specifying the minimum rate at which brain regions should transmit messages to achieve an expected fidelity, and we test five predictions from the theory. We introduce compression efficiency, which quantifies the differing trade-offs between lossy compression and communication fidelity in structural networks. Compression efficiency evolves with development, heightens when metabolic gradients guide diffusion, constrains network complexity, explains how rich-club hubs integrate information, and correlates with cortical areal scaling, myelination, and speed-accuracy trade-offs. Our findings elucidate how network structures and metabolic resources support efficient neural communication.
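As a rough illustration of the rate-fidelity trade-off the abstract invokes, the sketch below computes the classical Gaussian rate-distortion function — a textbook information-theoretic quantity, not the paper's compression-efficiency measure, used here only to show how a minimum transmission rate follows from an expected fidelity:

```python
import math

def gaussian_rate(distortion, variance=1.0):
    """Shannon rate-distortion function for a Gaussian source: the minimum
    bits per sample needed to reconstruct a signal of the given variance
    within mean-squared error `distortion`."""
    if distortion >= variance:
        return 0.0  # transmitting nothing (just the mean) already suffices
    return 0.5 * math.log2(variance / distortion)

# Tighter fidelity targets (smaller allowed distortion) demand higher rates:
rates = [gaussian_rate(d) for d in (1.0, 0.5, 0.25)]
```

The monotone increase of rate with fidelity is the lossy-compression trade-off that compression efficiency, in the paper's framework, quantifies over structural networks.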


Electrical activity of fungi: Spikes detection and complexity analysis

Oyster fungi Pleurotus djamor generate action-potential-like spikes of electrical potential. Trains of spikes might manifest the propagation of growing mycelium in a substrate, the transport of nutrients and metabolites, and communication processes in the mycelium network. The spiking activity of mycelium networks is highly variable compared with neural activity and therefore cannot be analysed with standard tools from neuroscience. We propose original techniques for detecting and classifying the spiking activity of fungi. Using these techniques, we analyse the information-theoretic complexity of the fungal electrical activity. The results can pave the way for future research on sensorial fusion and decision making in fungi.
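A minimal sketch of threshold-based spike detection with a refractory window — a generic technique, assumed here for illustration; the paper's detection method may differ:

```python
def detect_spikes(trace, threshold, refractory=5):
    """Flag indices where the signal crosses `threshold` upward,
    ignoring crossings within `refractory` samples of the last spike."""
    spikes, last = [], -refractory
    for i in range(1, len(trace)):
        if trace[i - 1] < threshold <= trace[i] and i - last >= refractory:
            spikes.append(i)
            last = i
    return spikes

# Dummy trace with two slow electrical-potential bumps:
signal = [0, 0, 1, 3, 1, 0, 0, 0, 2, 4, 2, 0]
print(detect_spikes(signal, threshold=2))  # → [3, 8]
```

With the spike train in hand, information-theoretic complexity measures (e.g. entropy of inter-spike intervals) can be computed over the detected events.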


Emergence of irregular activity in networks of strongly coupled conductance-based neurons

Cortical neurons are characterized by irregular firing and a broad distribution of rates. The balanced state model explains these observations with a cancellation of mean excitatory and inhibitory currents, which makes fluctuations drive firing. In networks of neurons with current-based synapses, the balanced state emerges dynamically if coupling is strong, i.e. if the mean number of synapses per neuron K is large and synaptic efficacy is of order 1/√K. When synapses are conductance-based, current fluctuations are suppressed when coupling is strong, questioning the applicability of the balanced state idea to biological neural networks. We analyze networks of strongly coupled conductance-based neurons and show that asynchronous irregular activity and broad distributions of rates emerge if synapses are of order 1/log(K). In such networks, unlike in the standard balanced state model, current fluctuations are small and firing is maintained by a drift-diffusion balance. This balance emerges dynamically, without fine tuning, if inputs are smaller than a critical value, which depends on synaptic time constants and coupling strength, and is significantly more robust to connection heterogeneities than the classical balanced state model. Our analysis makes experimentally testable predictions of how the network response properties should evolve as input increases.
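A back-of-the-envelope sketch of the classical 1/√K scaling the abstract contrasts with, assuming independent Poisson inputs with current-based efficacies (an illustrative simplification, not the paper's conductance-based analysis):

```python
import math

def input_stats(K, J, rate=1.0):
    """Mean and standard deviation of the summed input from K independent
    Poisson synapses of efficacy J firing at `rate` events per window."""
    mean = K * J * rate
    sd = J * math.sqrt(K * rate)
    return mean, sd

for K in (100, 10_000):
    m, s = input_stats(K, J=1 / math.sqrt(K))
    # With J ~ 1/sqrt(K), fluctuations stay O(1) while the mean grows
    # as sqrt(K) -- so excitation must cancel inhibition for firing to
    # be fluctuation-driven, which is the balanced-state mechanism.
```

Under the paper's 1/log(K) conductance-based scaling, this picture changes: fluctuations shrink and a drift-diffusion balance, rather than mean cancellation, sustains irregular firing.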


End-to-End Automatic Sleep Stage Classification Using Spectral-Temporal Sleep Features

Sleep disorders are neurological conditions that can greatly affect the quality of daily life, and manually classifying sleep stages to detect them is very burdensome, so automatic sleep stage classification techniques are needed. However, previous automatic sleep scoring methods using raw signals still achieve low classification performance. In this study, we propose an end-to-end automatic sleep staging framework based on optimal spectral-temporal sleep features, evaluated on the Sleep-EDF dataset. The input data were modified using a bandpass filter and then fed to a convolutional neural network model. For five-stage sleep classification, the classification performance was 85.6% and 91.1% using the raw input data and the proposed input, respectively. This is also the highest performance reported among studies using the same dataset. The proposed framework achieves high performance by using optimal features associated with each sleep stage, which may help to find new features for automatic sleep staging methods.
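To illustrate the band-limiting preprocessing step in spirit, here is a crude band-pass built from two moving averages — a stand-in assumption for demonstration, not the paper's actual filter design:

```python
def moving_average(x, w):
    """Causal moving average with window w (a crude low-pass filter)."""
    out, acc = [], 0.0
    for i, v in enumerate(x):
        acc += v
        if i >= w:
            acc -= x[i - w]
        out.append(acc / min(i + 1, w))
    return out

def crude_bandpass(x, fast=5, slow=50):
    """Difference of a fast and a slow moving average: the slow average
    tracks baseline drift and is subtracted out, while the fast average
    suppresses sample-to-sample noise -- a rough band-pass."""
    return [f - s for f, s in zip(moving_average(x, fast),
                                  moving_average(x, slow))]
```

In the framework described above, the band-limited signal (rather than the raw recording) becomes the input to the convolutional network.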


End-to-End Models for the Analysis of System 1 and System 2 Interactions based on Eye-Tracking Data

While theories postulating a dual cognitive system take hold, quantitative confirmation is still needed to understand and identify interactions between the two systems or conflict events. Eye movements are among the most direct markers of an individual's attentive load and may serve as an important proxy for information. In this work we propose a computational method, within a modified visual version of the well-known Stroop test, for identifying different tasks and potential conflict events between the two systems through the collection and processing of eye-movement data. A statistical analysis shows that the selected variables can characterize the variation of attentive load across different scenarios. Moreover, we show that machine learning techniques make it possible to distinguish between different tasks with good classification accuracy and to investigate the gaze dynamics in more depth.
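A minimal sketch of the pipeline shape — gaze features fed to a simple classifier. The feature set, centroid values, and nearest-centroid rule are illustrative assumptions, not the paper's actual variables or model:

```python
def gaze_features(fixations):
    """Summary features from a list of fixation durations (ms):
    count, mean duration, and duration variance."""
    n = len(fixations)
    mean = sum(fixations) / n
    var = sum((d - mean) ** 2 for d in fixations) / n
    return (n, mean, var)

def nearest_centroid(sample, centroids):
    """Assign a feature vector to the class with the closest centroid."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist(sample, centroids[label]))

# Hypothetical per-condition centroids for a Stroop-like task:
centroids = {"congruent": (8, 220.0, 900.0), "conflict": (14, 310.0, 2500.0)}
print(nearest_centroid(gaze_features([300, 320, 280] * 4), centroids))  # → congruent
```

In practice one would train a proper classifier on many labeled trials; the point here is only that scalar gaze summaries can separate task conditions.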


Energy optimality predicts curvilinear locomotion

Everyday human locomotion requires changing directions and turning. However, while straight-line walking behavior is approximately explained by energy minimization, we do not yet have a unified theoretical account of non-straight-line (i.e., curvilinear) locomotion, despite its ecological importance. Here, we show that many non-straight-line walking phenomena are predicted by including an energy cost for turning. We quantified the cost of turning in humans, showing that the metabolic rate of walking increases with decreasing radius for fixed speed. We then used this metabolic cost to mathematically predict energy-optimal movement patterns for five tasks of varying complexity: walking in circles, turning in place, walking through an angled corridor, walking freely from point to point while having to turn, and walking through doors while maneuvering around obstacles. In these tasks, humans moved at speeds and paths approximately predicted by energy optima. Thus, we provide a unified theoretical account that predicts diverse curvilinear locomotor phenomena.
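The optimization logic can be sketched with a toy cost model: a metabolic rate that grows with speed and with curvature, minimized per unit distance. The functional form and coefficients below are illustrative assumptions, not the authors' fitted model:

```python
def metabolic_rate(v, r, a=2.0, b=1.5, c=4.0):
    """Toy metabolic power model: a resting term, a speed-squared term,
    and a turning term growing with v**2 / r (tighter turns cost more).
    Coefficients are made up for illustration."""
    return a + b * v**2 + c * (v**2 / r)

def optimal_speed(r):
    """Speed minimizing energy per unit distance, rate(v)/v, by grid search."""
    speeds = [0.1 + 0.01 * i for i in range(300)]
    return min(speeds, key=lambda v: metabolic_rate(v, r) / v)

# Tighter turns (smaller radius r) lower the energy-optimal walking speed,
# matching the qualitative slow-down humans show when turning.
```

Analytically, cost per distance is a/v + (b + c/r)·v, minimized at v* = sqrt(a / (b + c/r)), so the optimum falls as r shrinks.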


Entanglement in Cognition violating Bell Inequalities Beyond Cirel'son's Bound

We present the results of two tests in which a sample of human participants was asked to make judgements about the conceptual combinations {\it The Animal Acts} and {\it The Animal eats the Food}. Both tests significantly violate the Clauser-Horne-Shimony-Holt version of Bell inequalities (`CHSH inequality'), thus exhibiting manifestly non-classical behaviour due to the meaning connection between the individual concepts that are combined. We then apply a quantum-theoretic framework which we developed for any Bell-type situation and represent the empirical data in complex Hilbert space. We show that the observed violations of the CHSH inequality can be explained as a consequence of a strong form of `quantum entanglement' between the component conceptual entities, in which both the state and the measurements are entangled. We finally observe that a quantum model in Hilbert space can be elaborated in these Bell-type situations even when the CHSH violation exceeds the known `Cirel'son bound', contrary to widespread belief. These findings confirm and strengthen the results we recently obtained in a variety of cognitive tests and document and image retrieval operations on the same conceptual combinations.
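For readers unfamiliar with the quantity at issue, the CHSH value is computed from four measured correlations; classically |S| ≤ 2, while quantum mechanics allows up to Cirel'son's bound 2√2 (the example correlations below are the standard singlet-state values, not the paper's data):

```python
import math

def chsh(E):
    """CHSH quantity S from the four correlations E[(a, b)] between
    measurement settings a, a' on one side and b, b' on the other."""
    return E[("a", "b")] + E[("a", "bp")] + E[("ap", "b")] - E[("ap", "bp")]

# Correlations of a singlet state at the optimal angles saturate the bound:
q = 1 / math.sqrt(2)
S = chsh({("a", "b"): q, ("a", "bp"): q, ("ap", "b"): q, ("ap", "bp"): -q})
# S = 4q = 2*sqrt(2), i.e. Cirel'son's bound
```

The paper's claim is that conceptual-combination data can produce |S| beyond even this bound, which their entangled-measurements Hilbert-space model still accommodates.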


Entropic Decision Making

Using results from neurobiology on perceptual and value-based decision making, the problem of decision making between lotteries is reformulated in an abstract space where uncertain prospects are mapped to corresponding active neuronal representations. This mapping allows us to maximize non-extensive entropy in the new space under constraints, instead of maximizing a utility function. To achieve good agreement with behavioral data, the constraints must include at least the weighted average of the stimulus and its variance, both of which are supported by the adaptability of neuronal responses to an external stimulus. By analogy with thermodynamic and information engines, we discuss the dynamics of choice between two lotteries processed simultaneously in the brain via rate equations that describe the transfer of attention between lotteries and among the prospects within each lottery. This model offers new insights into risk aversion and into behavioral anomalies not accounted for by Prospect Theory.
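The constrained entropy maximization can be illustrated in its simplest (Shannon) form: fixing the weighted average of the stimulus yields a Gibbs/softmax distribution over prospects. The paper uses non-extensive (Tsallis) entropy; this is the q → 1 limit of that family, shown only to make the maximization step concrete:

```python
import math

def maxent_weights(stimuli, beta):
    """Maximum-(Shannon-)entropy distribution over stimulus values subject
    to a fixed weighted average -- the Gibbs/softmax form. `beta` is the
    Lagrange multiplier enforcing the mean constraint."""
    z = [math.exp(beta * s) for s in stimuli]
    total = sum(z)
    return [w / total for w in z]

p = maxent_weights([1.0, 2.0, 3.0], beta=0.5)
# Weights sum to 1 and tilt toward larger stimulus values as beta grows.
```

Adding the variance constraint the authors require would introduce a second multiplier and a quadratic term in the exponent, in the same spirit.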


Event-Based Backpropagation can compute Exact Gradients for Spiking Neural Networks

Spiking neural networks combine analog computation with event-based communication using discrete spikes. While the impressive advances of deep learning are enabled by training non-spiking artificial neural networks using the backpropagation algorithm, applying this algorithm to spiking networks was previously hindered by the existence of discrete spike events and discontinuities. For the first time, this work derives the backpropagation algorithm for a continuous-time spiking neural network and a general loss function by applying the adjoint method together with the proper partial derivative jumps, allowing for backpropagation through discrete spike events without approximations. This algorithm, EventProp, backpropagates errors at spike times in order to compute the exact gradient in an event-based, temporally and spatially sparse fashion. We use gradients computed via EventProp to train networks on the Yin-Yang and MNIST datasets using either a spike time or voltage based loss function and report competitive performance. Our work supports the rigorous study of gradient-based learning algorithms in spiking neural networks and provides insights toward their implementation in novel brain-inspired hardware.
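The key idea — differentiating exactly through a discrete spike time — can be shown on a toy one-synapse membrane via the implicit function theorem. This model and its parameters are assumptions for illustration, not the paper's EventProp derivation:

```python
import math

# Toy membrane driven by a step input of weight w: V(t) = w * (1 - exp(-t/tau)).
# A spike fires when V reaches threshold theta. The spike time t* satisfies
# V(t*) = theta, so its exact sensitivity to w follows from the implicit
# function theorem:  dt*/dw = -(dV/dw) / (dV/dt), evaluated at t*.

tau, theta, w = 10.0, 1.0, 2.0

t_spike = -tau * math.log(1 - theta / w)       # solves V(t*) = theta exactly

dV_dw = theta / w                              # = 1 - exp(-t*/tau) at t*
dV_dt = (w / tau) * (1 - theta / w)            # = (w/tau) * exp(-t*/tau) at t*
grad_implicit = -dV_dw / dV_dt                 # exact spike-time gradient

# Finite-difference check of the same derivative:
eps = 1e-6
t_plus = -tau * math.log(1 - theta / (w + eps))
grad_fd = (t_plus - t_spike) / eps
```

The two gradients agree, with no smoothing of the spike discontinuity — the property that lets EventProp backpropagate errors exactly at spike times.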


Event-based update of synapses in voltage-based learning rules

Due to the point-like nature of neuronal spiking, efficient neural network simulators often employ event-based simulation schemes for synapses. Yet many types of synaptic plasticity rely on the membrane potential of the postsynaptic cell as a third factor in addition to pre- and postsynaptic spike times. Synapses therefore require continuous information to update their strength, which a priori necessitates a continuous update in a time-driven manner. The latter hinders scaling of simulations to realistic cortical network sizes and time scales relevant for learning. Here, we derive two efficient algorithms for archiving postsynaptic membrane potentials, both compatible with modern simulation engines based on event-based synapse updates. We theoretically contrast the two algorithms with a time-driven synapse update scheme to analyze their advantages in terms of memory and computation. We further present a reference implementation in the spiking neural network simulator NEST for two prototypical voltage-based plasticity rules: the Clopath rule and the Urbanczik-Senn rule. For both rules, the two event-based algorithms significantly outperform the time-driven scheme. Depending on the amount of data to be stored for plasticity, which differs strongly between the rules, a further performance increase can be achieved by compressing or sampling the membrane-potential information. Our results on the computational efficiency of archiving provide guidelines for designing learning rules that remain practically usable in large-scale networks.
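A minimal sketch of the archiving idea, assuming a simple ring buffer rather than NEST's actual data structures or API: the postsynaptic cell records its voltage each step, and a synapse catches up on the trace it missed only when a presynaptic spike event arrives.

```python
from collections import deque

class VoltageArchive:
    """Ring buffer of (time, voltage) samples for a postsynaptic cell,
    letting event-driven synapses retrieve the membrane trace between
    their updates. An illustrative sketch, not NEST's implementation."""
    def __init__(self, capacity=1024):
        self.buf = deque(maxlen=capacity)  # old samples drop automatically

    def record(self, t, v):
        self.buf.append((t, v))

    def since(self, t_last):
        """Voltage samples recorded after the synapse's last update time."""
        return [(t, v) for t, v in self.buf if t > t_last]

arch = VoltageArchive()
for step in range(5):
    arch.record(step * 0.1, -65.0 + step)   # dummy membrane trace
print(arch.since(0.25))                     # samples recorded after t = 0.25
```

The compression and sampling variants discussed above would reduce what `record` stores (e.g. coarser samples or summary statistics) at the cost of approximating the retrieved trace.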
