Fernando Montani
National University of La Plata
Publications
Featured research published by Fernando Montani.
Philosophical Transactions of the Royal Society A | 2009
Fernando Montani; Robin A. A. Ince; Riccardo Senatore; Ehsan Arabzadeh; Mathew E. Diamond; Stefano Panzeri
Understanding the operations of neural networks in the brain requires knowing whether interactions among neurons can be described by a pairwise interaction model, or whether a higher-order interaction model is needed. In this article we consider the rate of synchronous discharge of a local population of neurons, a macroscopic index of network activation that can be measured experimentally. We analyse a model, based on the maximum entropy principle of physics, that evaluates whether the probability of synchronous discharge can be described by interactions up to any given order. When compared with real neural population activity recorded from the rat somatosensory cortex, the model shows that interactions of at least order three or four are necessary to explain the data. We use Shannon information to compute the impact of higher-order correlations on the amount of somatosensory information transmitted by the rate of synchronous discharge, and we find that correlations of progressively higher order decrease the information available through the neural population. These results are compatible with the hypothesis that higher-order interactions play a role in shaping the dynamics of neural networks, and that they should be taken into account when computing the representational capacity of neural populations.
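A minimal numerical sketch of the comparison described above, on hypothetical surrogate data rather than the rat recordings, and using only the first-order (independent-neuron) prediction in place of the full maximum-entropy machinery: a shared excitation event induces correlations, so the empirical distribution of the synchronous-discharge count departs from the independent Poisson-binomial prediction at high counts. All parameters are illustrative.

```python
import random
from collections import Counter

random.seed(0)

# Toy binary spike raster: n_cells neurons, n_bins time bins, with
# artificially correlated firing through a shared "up" state -- hypothetical
# data, not the somatosensory recordings analysed in the paper.
n_cells, n_bins = 8, 5000
rates = [0.1 + 0.02 * i for i in range(n_cells)]
raster = []
for _ in range(n_bins):
    shared = random.random() < 0.2          # common excitation event
    raster.append([int(random.random() < (0.5 if shared else r)) for r in rates])

# Empirical distribution of the population count (synchronous discharge).
counts = Counter(sum(row) for row in raster)
p_emp = [counts.get(k, 0) / n_bins for k in range(n_cells + 1)]

# First-order (independent-neuron) prediction: the Poisson-binomial
# distribution, built by convolving each neuron's Bernoulli marginal.
p_ind = [1.0]
for cell in range(n_cells):
    p1 = sum(row[cell] for row in raster) / n_bins
    p_ind = [(p_ind[k] if k < len(p_ind) else 0.0) * (1 - p1)
             + (p_ind[k - 1] * p1 if k > 0 else 0.0)
             for k in range(len(p_ind) + 1)]

# Correlations show up as excess probability of high synchronous counts.
for k in range(n_cells + 1):
    print(f"count={k}  empirical={p_emp[k]:.4f}  independent={p_ind[k]:.4f}")
```

A higher-order maximum-entropy fit would add pairwise (and higher) constraints on top of these marginals; the independent model is just the order-one end of that hierarchy.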
Neural Networks | 2010
Robin A. A. Ince; Riccardo Senatore; Ehsan Arabzadeh; Fernando Montani; Mathew E. Diamond; Stefano Panzeri
Population coding is the quantitative study of which algorithms or representations are used by the brain to combine and evaluate the messages carried by different neurons. Here, we review an information-theoretic approach to population coding. We first discuss how to compute the information carried by simultaneously recorded neural populations and, in particular, how to reduce the limited-sampling bias that affects the calculation of information from a limited amount of experimental data. We then discuss how to quantify the contribution of individual members of the population, or of the interactions between them, to the overall information encoded by the considered group of neurons. We focus in particular on evaluating the contribution of interactions up to any given order to the total information. We illustrate this formalism with applications to simulated data with realistic neuronal statistics and to real simultaneous recordings of multiple spike trains.
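The limited-sampling bias mentioned above can be seen directly with a plug-in information estimate. A minimal sketch on toy data (the stimuli, responses and trial counts are all hypothetical): the plug-in estimator remains positive even for shuffled, information-free data, and the excess shrinks as trials accumulate.

```python
import random
from collections import Counter
from math import log2

random.seed(1)

def plugin_mi(pairs):
    """Plug-in (maximum-likelihood) estimate of I(S;R) in bits."""
    n = len(pairs)
    cxy = Counter(pairs)
    cs = Counter(s for s, _ in pairs)
    cr = Counter(r for _, r in pairs)
    return sum(c / n * log2((c / n) / ((cs[s] / n) * (cr[r] / n)))
               for (s, r), c in cxy.items())

# Hypothetical toy experiment: two stimuli, responses are 3-neuron binary
# words whose firing probability depends weakly on the stimulus.
def trial(stim):
    p = 0.3 if stim == 0 else 0.5
    return tuple(int(random.random() < p) for _ in range(3))

# The plug-in estimator is biased upward at small sample sizes: even with
# stimulus labels shuffled (true information = 0) it stays positive, and
# the bias shrinks as the number of trials grows.
for n_trials in (50, 500, 5000):
    data = [(s, trial(s)) for s in (random.randint(0, 1) for _ in range(n_trials))]
    shuf = [(random.randint(0, 1), r) for _, r in data]
    print(n_trials, round(plugin_mi(data), 4), round(plugin_mi(shuf), 4))
```

Bias-correction procedures (shuffling, extrapolation over sample size) work by estimating and subtracting exactly this sampling-induced excess.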
Physica A-statistical Mechanics and Its Applications | 2014
Fernando Montani; Emilia B. Deleglise; Osvaldo A. Rosso
When inhibitory neurons constitute about 40% of the population, they could have an important antinociceptive role, as they would readily regulate the activity level of other neurons. We consider a simple network of cortical spiking neurons with axonal conduction delays and spike-timing-dependent plasticity, representative of a cortical column or hypercolumn with a large proportion of inhibitory neurons. Each neuron fires following Hodgkin–Huxley-like dynamics and is randomly interconnected with other neurons. The network dynamics is investigated by estimating the Bandt–Pompe probability distribution function associated with the interspike intervals, for different degrees of interconnectivity across neurons. More specifically, we take into account the fine temporal “structures” of the complex neuronal signals not just through the probability distributions associated with the interspike intervals, but through much more subtle measures that account for their causal information: the Shannon permutation entropy, Fisher permutation information and permutation statistical complexity. This allows us to investigate how the information of the system saturates to a finite value as the degree of interconnectivity across neurons grows, and to infer the emergent dynamical properties of the system.
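The Bandt–Pompe construction named above can be sketched compactly. This is a generic implementation on toy signals, not the paper's spiking-network simulation; the embedding dimension d = 4 and delay tau = 1 are illustrative choices.

```python
import random
from itertools import permutations
from math import log2, sin

random.seed(2)

def bandt_pompe(series, d=4, tau=1):
    """Ordinal-pattern (Bandt-Pompe) probability distribution for embedding
    dimension d and time delay tau."""
    counts = {p: 0 for p in permutations(range(d))}
    n = 0
    for i in range(len(series) - (d - 1) * tau):
        window = series[i:i + d * tau:tau]
        # the pattern is the permutation that sorts the window's values
        pattern = tuple(sorted(range(d), key=window.__getitem__))
        counts[pattern] += 1
        n += 1
    return [c / n for c in counts.values()]

def permutation_entropy(series, d=4, tau=1):
    """Normalised Shannon permutation entropy H in [0, 1]."""
    probs = bandt_pompe(series, d, tau)
    h = -sum(p * log2(p) for p in probs if p > 0)
    return h / log2(len(probs))          # len(probs) = d!

# Toy signals (not the paper's interspike-interval series): a regular
# oscillation concentrates on few ordinal patterns (low H), while an
# uncorrelated random sequence spreads over all d! patterns (H near 1).
regular = [sin(0.3 * t) for t in range(3000)]
noisy = [random.random() for _ in range(3000)]
print(round(permutation_entropy(regular), 3), round(permutation_entropy(noisy), 3))
```

The same ordinal distribution feeds the Fisher permutation information and permutation statistical complexity used in the paper.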
Entropy | 2014
Fernando Montani; Osvaldo A. Rosso
Electroencephalography (EEG) reflects the electrical activity of the brain, which can be considered chaotic and ruled by nonlinear dynamics. Chickens exhibit a protracted period of maturation, and this temporal separation of the synapse formation and maturation phases is analogous to human neural development, though the changes in chickens occur in weeks compared to years in humans. The development of synaptic networks in the chicken brain can be regarded as occurring in two broadly defined phases. We specifically describe the chicken brain development phases in the causality entropy-complexity plane H×C, showing that the complexity of the electrical activity can be characterized by estimating the intrinsic correlational structure of the EEG signal. This allows us to identify the dynamics of the developing chicken brain within the zone of chaotic dissipative behavior in the plane H×C.
Keywords: EEG signals; statistical complexity measure; chicken brain; neural maturation
PACS classifications: 02.50.-r; 05.45.Tp; 87.19.La
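The entropy-complexity plane H×C rests on two quantities per probability distribution: the normalised Shannon entropy H and the MPR statistical complexity C, the product of H with the normalised Jensen-Shannon disequilibrium to the uniform distribution. A sketch with illustrative inputs (toy distributions standing in for the ordinal-pattern histograms extracted from the EEG):

```python
from math import log2

def shannon(p):
    return -sum(x * log2(x) for x in p if x > 0)

def mpr_complexity(p):
    """Normalised Shannon entropy H and MPR statistical complexity
    C = H * Q0 * J, where J is the Jensen-Shannon divergence between p and
    the uniform distribution and Q0 normalises the disequilibrium to [0, 1]."""
    n = len(p)
    h = shannon(p) / log2(n)
    m = [(x + 1.0 / n) / 2 for x in p]          # midpoint distribution
    j = shannon(m) - shannon(p) / 2 - log2(n) / 2
    q0 = -2.0 / (((n + 1) / n) * log2(n + 1) - 2 * log2(2 * n) + log2(n))
    return h, h * q0 * j

# Landmarks of the H x C plane: a delta distribution and the uniform
# distribution both sit at C = 0; intermediate distributions do not.
print(mpr_complexity([1.0] + [0.0] * 23))   # H = 0, C = 0
print(mpr_complexity([1 / 24] * 24))        # H = 1, C ~ 0
print(mpr_complexity([0.5] + [0.5 / 23] * 23))
```

For a fixed H, C is bounded above and below, which is what carves out the "zones" (such as chaotic dissipative behavior) that the abstract refers to.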
Physica A-statistical Mechanics and Its Applications | 2013
Fernando Montani; Elena Phoka; Mariela Portesi; Simon R. Schultz
Simultaneous recordings from multiple neural units allow us to investigate the activity of very large neural ensembles. To understand how large ensembles of neurons process sensory information, it is necessary to develop suitable statistical models to describe the response variability of the recorded spike trains. Using the information geometry framework, it is possible to estimate higher-order correlations by assigning one interaction parameter to each degree of correlation, leading to a (2^N − 1)-dimensional model for a population of N neurons. However, this model suffers from a combinatorial explosion, and the number of parameters to be estimated from the available sample size is the main source of intractability of this approach. To quantify the extent of higher-than-pairwise spike correlations in pools of multiunit activity, we use an information-geometric approach within the framework of the extended central limit theorem, considering all possible contributions from higher-order spike correlations. The identification of a deformation parameter allows us to provide a statistical characterisation of the amount of higher-order correlations in very large neural ensembles, significantly reducing the number of parameters, avoiding the sampling problem and inferring the underlying dynamical properties of the network within pools of multiunit neural activity.
Philosophical Transactions of the Royal Society A | 2015
Fernando Montani; Osvaldo A. Rosso; Fernanda S. Matias; Steven L. Bressler; Claudio R. Mirasso
The phenomenon of synchronization between two or more asymmetrically coupled areas of the brain is a relevant issue for understanding mechanisms and functions within the cerebral cortex. Anticipated synchronization (AS) refers to the situation in which the receiver system synchronizes to the future dynamics of the sender system, while the intuitively expected delayed synchronization (DS) represents exactly the opposite case. AS and DS are investigated in the context of the causal information formalism. More specifically, we use a multi-scale symbolic information-theory approach to discriminate the time delay displayed between two areas of the brain when they exchange information.
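The idea of reading off a sender-receiver delay from symbolic information measures can be illustrated with a deliberately simple surrogate, not the paper's multi-scale method: scanning a plug-in mutual information across lags recovers the shift at which the two symbol streams share information, with a positive shift corresponding to DS and a negative one to AS.

```python
import random
from collections import Counter
from math import log2

random.seed(3)

def lagged_mi(x, y, lag):
    """Plug-in mutual information (bits) between x(t) and y(t + lag)."""
    lo, hi = max(0, -lag), min(len(x), len(y) - lag)
    pairs = [(x[t], y[t + lag]) for t in range(lo, hi)]
    n = len(pairs)
    cxy = Counter(pairs)
    cx = Counter(a for a, _ in pairs)
    cy = Counter(b for _, b in pairs)
    return sum(c / n * log2((c / n) / ((cx[a] / n) * (cy[b] / n)))
               for (a, b), c in cxy.items())

# Binary surrogate of a sender/receiver pair (hypothetical, not cortical
# data): the receiver repeats the sender's symbols 5 steps later, with
# 10% of the symbols flipped by noise.
delay = 5
sender = [random.randint(0, 1) for _ in range(4000)]
receiver = [sender[t - delay] ^ (random.random() < 0.1) if t >= delay
            else random.randint(0, 1) for t in range(4000)]

# Scan the lag: an MI peak at a positive lag means the receiver trails the
# sender (delayed synchronization, DS); a peak at a negative lag would
# signal anticipated synchronization (AS).
best = max(range(-15, 16), key=lambda L: lagged_mi(sender, receiver, L))
print("estimated shift:", best)
```

The paper's symbolic approach replaces the raw binary symbols with ordinal patterns evaluated at multiple time scales, but the lag-scanning logic is the same.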
Philosophical Transactions of the Royal Society A | 2015
Fernando Montani; Roman Baravalle; Lisandro Montangie; Osvaldo A. Rosso
Neurons tend to fire a spike when they are near a bifurcation from the resting state to spiking activity. A delicate balance between noise, dynamic currents and initial conditions determines the phase diagram of neural activity. Many different ionic mechanisms can act as the source of spike generation. Moreover, the biophysics and dynamics behind it can usually be described through a phase diagram involving the membrane voltage versus the activation variable of the ionic channel. In this paper, we present a novel methodology to characterize the dynamics of this system that takes into account the fine temporal ‘structures’ of the complex neuronal signals. This allows us to accurately distinguish the most fundamental neurophysiological properties of neurons previously described by Izhikevich through the phase-space trajectory, using a time-causal space: statistical complexity versus Fisher information versus Shannon entropy.
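The neuronal behaviours catalogued by Izhikevich come from his well-known two-variable model, which is simple enough to integrate directly. The sketch below uses the published regular-spiking parameter set (a, b, c, d) = (0.02, 0.2, −65, 8); the paper's complexity/Fisher/entropy analysis of the resulting signals is not reproduced here.

```python
# Euler integration of the Izhikevich (2003) model:
#   v' = 0.04 v^2 + 5 v + 140 - u + I
#   u' = a (b v - u),  with reset v -> c, u -> u + d when v >= 30 mV
def izhikevich(I=10.0, a=0.02, b=0.2, c=-65.0, d=8.0, T=1000.0, dt=0.5):
    v, u = -65.0, b * -65.0          # start at the rest-like state
    spikes, trace = [], []
    t = 0.0
    while t < T:
        if v >= 30.0:                # spike cutoff: reset v, bump recovery u
            spikes.append(t)
            v, u = c, u + d
        v += dt * (0.04 * v * v + 5 * v + 140 - u + I)
        u += dt * a * (b * v - u)
        trace.append(v)
        t += dt
    return spikes, trace

# With I = 10 the quadratic nullclines no longer intersect, so there is no
# fixed point and the neuron fires tonically.
spikes, trace = izhikevich()
print(f"{len(spikes)} spikes in {1000.0:.0f} ms of simulated time")
```

Sweeping I across the bifurcation, and feeding the resulting voltage traces to the ordinal-pattern measures, is the kind of characterisation the abstract describes.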
Journal of Physics: Conference Series | 2009
Robin A. A. Ince; Fernando Montani; Ehsan Arabzadeh; Mathew E. Diamond; Stefano Panzeri
In order to understand how populations of neurons encode information about external correlates, it is important to develop minimal models of the probability of neural population responses that capture all the salient changes of neural responses with stimuli. In this context, it is particularly useful to determine whether interactions among neurons responding to stimuli can be described by a pairwise interaction model, or whether a higher-order interaction model is needed. To address this question, we compared real neural population activity obtained from the rat somatosensory cortex to maximum-entropy models that take into account only interactions up to any given order. By performing these comparisons, we found that interactions of order two were sufficient to explain a large fraction of the observed stimulus-response distributions, but not all of them. Triple-wise interactions were necessary to fully explain the data. We then used Shannon information to compute the impact of higher-order correlations on the amount of somatosensory information transmitted by the neural population. We found that correlations of order two gave a good approximation of the information carried by the neural population, within 4% of the true value. Third-order correlations gave an even better approximation, within 2% of the true value. Taken together, these results suggest that higher-order interactions exist and shape the dynamics of cortical networks, but play a quantitatively minor role in determining the information capacity of neural populations.
Chaos | 2018
Roman Baravalle; Osvaldo A. Rosso; Fernando Montani
Electroencephalography (EEG) signals depict the electrical activity that takes place at the surface of the brain and provide an important tool for understanding a variety of cognitive processes. The EEG is the product of synchronized brain activity, and variations in EEG oscillation patterns reflect underlying changes in neuronal synchrony. Our aim is to characterize the complexity of the EEG rhythmic oscillation bands when subjects perform visuomotor or imagined cognitive tasks (imagined movement), providing a causal mapping of the dynamical rhythmic activities of the brain as a measure of attentional investment. We estimate the intrinsic correlational structure of the signals within the causality entropy-complexity plane H×C, where the enhanced complexity in the gamma 1, gamma 2 and beta 1 bands allows us to distinguish motor-visual memory tasks from control conditions. We identify the dynamics of the gamma 1, gamma 2 and beta 1 rhythmic oscillations within the zone of chaotic dissipative behavior, whereas the beta 2 band shows a much higher level of entropy and a significantly lower level of complexity, corresponding to a non-invertible cubic map. Our findings underscore the importance of the gamma band for attention and perceptual feature binding during the visuomotor/imagery tasks.
Entropy | 2018
Roman Baravalle; Osvaldo A. Rosso; Fernando Montani
The electroencephalogram (EEG) is an electrophysiological monitoring method that allows us to glimpse the electrical activity of the brain. Neural oscillation patterns are perhaps the most salient feature of the EEG, as they are rhythmic activities of the brain generated by interactions across neurons. Large-scale oscillations can be measured by EEG as the different oscillation patterns reflected within the different frequency bands, and can provide new insights into brain function. In order to understand how information about the rhythmic activity of the brain during visuomotor/imagined cognitive tasks is encoded, we precisely quantify the different features of the oscillatory patterns in the Shannon–Fisher plane H×F. This allows us to distinguish the dynamics of the rhythmic activities of the brain, showing that the beta band facilitates information transmission during visuomotor/imagined tasks.
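The Fisher half of the H×F plane is commonly computed from a discrete probability distribution via squared differences of the square roots of neighbouring probabilities. A sketch with toy distributions standing in for the per-band histograms (the F0 normalisation shown is one common convention, not necessarily the paper's exact choice):

```python
from math import sqrt, log2

def fisher_information(p):
    """Discrete Fisher information measure used in the Shannon-Fisher plane:
    F0 * sum over i of (sqrt(p[i+1]) - sqrt(p[i]))^2. F0 = 1/2 keeps F in
    [0, 1]; F0 = 1 in the degenerate case of a delta at either edge."""
    f = sum((sqrt(p[i + 1]) - sqrt(p[i])) ** 2 for i in range(len(p) - 1))
    edge_delta = p[0] == 1.0 or p[-1] == 1.0
    return (1.0 if edge_delta else 0.5) * f

def shannon_norm(p):
    """Normalised Shannon entropy H in [0, 1]."""
    return -sum(x * log2(x) for x in p if x > 0) / log2(len(p))

# Landmark points in the H x F plane (toy inputs, not EEG band histograms):
for p in ([1.0, 0.0, 0.0, 0.0],          # delta: H = 0, F = 1
          [0.25] * 4,                    # uniform: H = 1, F = 0
          [0.5, 0.3, 0.15, 0.05]):       # intermediate
    print(round(shannon_norm(p), 3), round(fisher_information(p), 3))
```

Because F is built from local gradients of the distribution while H is global, the two coordinates separate dynamics that a single entropy value would conflate.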