Network


Latest external collaborations at the country level. Dive into the details by clicking on the dots.

Hotspot


Dive into the research topics where Robin A. A. Ince is active.

Publication


Featured research published by Robin A. A. Ince.


Current Biology | 2015

Frontal top-down signals increase coupling of auditory low-frequency oscillations to continuous speech in human listeners.

Hyojin Park; Robin A. A. Ince; Philippe G. Schyns; Gregor Thut; Joachim Gross

Humans show a remarkable ability to understand continuous speech even under adverse listening conditions. This ability critically relies on dynamically updated predictions of incoming sensory information, but exactly how top-down predictions improve speech processing is still unclear. Brain oscillations are a likely mechanism for these top-down predictions [1, 2]. Quasi-rhythmic components in speech are known to entrain low-frequency oscillations in auditory areas [3, 4], and this entrainment increases with intelligibility [5]. We hypothesize that top-down signals from frontal brain areas causally modulate the phase of brain oscillations in auditory cortex. We use magnetoencephalography (MEG) to monitor brain oscillations in 22 participants during continuous speech perception. We characterize prominent spectral components of speech-brain coupling in auditory cortex and use causal connectivity analysis (transfer entropy) to identify the top-down signals driving this coupling more strongly during intelligible speech than during unintelligible speech. We report three main findings. First, frontal and motor cortices significantly modulate the phase of speech-coupled low-frequency oscillations in auditory cortex, and this effect depends on intelligibility of speech. Second, top-down signals are significantly stronger for left auditory cortex than for right auditory cortex. Third, speech-auditory cortex coupling is enhanced as a function of stronger top-down signals. Together, our results suggest that low-frequency brain oscillations play a role in implementing predictive top-down control during continuous speech perception and that top-down control is largely directed at left auditory cortex. This suggests a close relationship between (left-lateralized) speech production areas and the implementation of top-down control in continuous speech perception.
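
The directed measure named above, transfer entropy, quantifies how much the past of one signal predicts the future of another beyond that signal's own past. Below is a minimal plug-in sketch on two discretised time series; the bin count, single-sample history, toy signals, and helper names are illustrative assumptions, not the authors' MEG pipeline.

import numpy as np

# Illustrative sketch only; parameters and helper names are assumptions,
# not the code used in the paper.
def transfer_entropy(source, target, n_bins=4):
    """Plug-in transfer entropy (bits) from `source` to `target`,
    using one sample of history for each signal."""
    def discretise(x):
        # rank-based binning so every bin holds roughly the same number of samples
        ranks = np.argsort(np.argsort(x))
        return ranks * n_bins // len(x)

    s, t = discretise(source), discretise(target)
    y_next, y_past, x_past = t[1:], t[:-1], s[:-1]

    def entropy(*variables):
        # joint plug-in entropy (bits) of the listed discrete variables
        _, counts = np.unique(np.stack(variables, axis=1), axis=0, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))

    # TE(X -> Y) = I(Y_next ; X_past | Y_past), written as a sum of joint entropies
    return (entropy(y_next, y_past) + entropy(y_past, x_past)
            - entropy(y_past) - entropy(y_next, y_past, x_past))

# Toy check: the target partially copies the source with a one-sample lag,
# so the source -> target estimate is clearly positive and the reverse is near zero.
rng = np.random.default_rng(0)
x = rng.standard_normal(5000)
y = np.roll(x, 1) + rng.standard_normal(5000)
print(transfer_entropy(x, y), transfer_entropy(y, x))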


Philosophical Transactions of the Royal Society A | 2009

The impact of high-order interactions on the rate of synchronous discharge and information transmission in somatosensory cortex

Fernando Montani; Robin A. A. Ince; Riccardo Senatore; Ehsan Arabzadeh; Mathew E. Diamond; Stefano Panzeri

Understanding the operations of neural networks in the brain requires an understanding of whether interactions among neurons can be described by a pairwise interaction model, or whether a higher order interaction model is needed. In this article we consider the rate of synchronous discharge of a local population of neurons, a macroscopic index of the activation of the neural network that can be measured experimentally. We analyse a model based on physics’ maximum entropy principle that evaluates whether the probability of synchronous discharge can be described by interactions up to any given order. When compared with real neural population activity obtained from the rat somatosensory cortex, the model shows that interactions of at least order three or four are necessary to explain the data. We use Shannon information to compute the impact of high-order correlations on the amount of somatosensory information transmitted by the rate of synchronous discharge, and we find that correlations of higher order progressively decrease the information available through the neural population. These results are compatible with the hypothesis that high-order interactions play a role in shaping the dynamics of neural networks, and that they should be taken into account when computing the representational capacity of neural populations.


PLOS Computational Biology | 2012

Analysis of slow (theta) oscillations as a potential temporal reference frame for information coding in sensory cortices.

Christoph Kayser; Robin A. A. Ince; Stefano Panzeri

While sensory neurons carry behaviorally relevant information in responses that often extend over hundreds of milliseconds, the key units of neural information likely consist of much shorter and temporally precise spike patterns. The mechanisms and temporal reference frames by which sensory networks partition responses into these shorter units of information remain unknown. One hypothesis holds that slow oscillations provide a network-intrinsic reference frame to temporally partition spike trains without exploiting the millisecond-precise alignment of spikes to sensory stimuli. We tested this hypothesis on neural responses recorded in visual and auditory cortices of macaque monkeys in response to natural stimuli. Comparing different schemes for response partitioning revealed that theta band oscillations provide a temporal reference that permits extracting significantly more information than can be obtained from spike counts, and sometimes almost as much information as obtained by partitioning spike trains using precisely stimulus-locked time bins. We further tested the robustness of these partitioning schemes to temporal uncertainty in the decoding process and to noise in the sensory input. This revealed that partitioning using an oscillatory reference provides greater robustness than partitioning using precisely stimulus-locked time bins. Overall, these results provide a computational proof of concept for the hypothesis that slow rhythmic network activity may serve as an internal reference frame for information coding in sensory cortices, and they foster the notion that slow oscillations serve as key elements for the computations underlying perception.
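
The oscillatory reference frame described above can be made concrete by band-passing a field signal into the theta range, taking its instantaneous phase, and labelling spikes by phase bin rather than by stimulus-locked time. The sketch below, using scipy, is a simplified illustration; the sampling rate, band edges, filter order, and function name are assumptions rather than the paper's exact settings.

import numpy as np
from scipy.signal import butter, filtfilt, hilbert

# Illustrative sketch only; all parameters are assumed, not taken from the paper.
def theta_phase_labels(lfp, spike_times, fs=1000.0, band=(4.0, 8.0), n_phase_bins=4):
    """Assign each spike a theta-phase bin derived from a field signal.

    lfp         : 1-D array sampled at `fs` Hz
    spike_times : spike times in seconds, on the same clock as `lfp`
    Returns one integer phase-bin label per spike.
    """
    # zero-phase band-pass into the theta range
    b, a = butter(3, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="bandpass")
    theta = filtfilt(b, a, lfp)

    # instantaneous phase from the analytic signal, mapped to [0, 2*pi)
    phase = np.angle(hilbert(theta)) % (2 * np.pi)

    # look up the phase at each spike time and bin it
    idx = np.clip((np.asarray(spike_times) * fs).astype(int), 0, len(lfp) - 1)
    return (phase[idx] / (2 * np.pi) * n_phase_bins).astype(int)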


Neural Networks | 2010

2010 Special Issue: Information-theoretic methods for studying population codes

Robin A. A. Ince; Riccardo Senatore; Ehsan Arabzadeh; Fernando Montani; Mathew E. Diamond; Stefano Panzeri

Population coding is the quantitative study of which algorithms or representations are used by the brain to combine and evaluate the messages carried by different neurons. Here, we review an information-theoretic approach to population coding. We first discuss how to compute the information carried by simultaneously recorded neural populations, and in particular how to reduce the limited sampling bias that affects the calculation of information from a limited amount of experimental data. We then discuss how to quantify the contribution of individual members of the population, or the interaction between them, to the overall information encoded by the considered group of neurons. We focus in particular on evaluating the contribution of interactions up to any given order to the total information. We illustrate this formalism with applications to simulated data with realistic neuronal statistics and to real simultaneous recordings of multiple spike trains.
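
The limited sampling bias mentioned above is the systematic overestimation of information that results from estimating probabilities with a finite number of trials. A minimal sketch of a plug-in mutual information estimate with a first-order (Miller-Madow style) correction applied to each entropy term follows; it assumes discrete response labels and is a generic illustration, not the specific correction procedures reviewed in the article.

import numpy as np

# Illustrative sketch only; not the bias-correction methods of the article.
def entropy_mm(labels):
    """Plug-in entropy (bits) with the Miller-Madow first-order bias correction."""
    _, counts = np.unique(labels, return_counts=True)
    n = counts.sum()
    p = counts / n
    # correction term: (occupied bins - 1) / (2 N ln 2), added in bits
    return -np.sum(p * np.log2(p)) + (len(counts) - 1) / (2 * n * np.log(2))

def mutual_information_mm(responses, stimuli):
    """I(S;R) = H(R) - H(R|S); responses and stimuli are per-trial discrete labels."""
    h_noise = 0.0
    for s in np.unique(stimuli):
        mask = stimuli == s
        h_noise += mask.mean() * entropy_mm(responses[mask])
    return entropy_mm(responses) - h_noise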


The Journal of Neuroscience | 2015

Irregular Speech Rate Dissociates Auditory Cortical Entrainment, Evoked Responses, and Frontal Alpha

Stephanie J. Kayser; Robin A. A. Ince; Joachim Gross; Christoph Kayser

The entrainment of slow rhythmic auditory cortical activity to the temporal regularities in speech is considered to be a central mechanism underlying auditory perception. Previous work has shown that entrainment is reduced when the quality of the acoustic input is degraded, but has also linked rhythmic activity at similar time scales to the encoding of temporal expectations. To understand these bottom-up and top-down contributions to rhythmic entrainment, we manipulated the temporal predictive structure of speech by parametrically altering the distribution of pauses between syllables or words, thereby rendering the local speech rate irregular while preserving intelligibility and the envelope fluctuations of the acoustic signal. Recording EEG activity in human participants, we found that this manipulation did not alter neural processes reflecting the encoding of individual sound transients, such as evoked potentials. However, the manipulation significantly reduced the fidelity of auditory delta (but not theta) band entrainment to the speech envelope. It also reduced left frontal alpha power, and this alpha reduction was predictive of the reduced delta entrainment across participants. Our results show that rhythmic auditory entrainment in the delta and theta bands reflects functionally distinct processes. Furthermore, they reveal that delta entrainment is under top-down control and likely reflects prefrontal processes that are sensitive to acoustical regularities rather than the bottom-up encoding of acoustic features. SIGNIFICANCE STATEMENT: The entrainment of rhythmic auditory cortical activity to the speech envelope is considered to be critical for hearing. Previous work has proposed divergent views in which entrainment reflects either early evoked responses related to sound encoding or high-level processes related to expectation or cognitive selection. Using a manipulation of speech rate, we dissociated auditory entrainment at different time scales. Specifically, our results suggest that delta entrainment is controlled by frontal alpha mechanisms and thus support the notion that rhythmic auditory cortical entrainment is shaped by top-down mechanisms.
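
Entrainment of the kind studied here is commonly quantified as the phase consistency between the band-limited EEG and the speech envelope. The sketch below computes a simple phase-locking value for one channel and one band; the filter design and the delta/theta band edges are illustrative assumptions, not necessarily the estimator used in the study.

import numpy as np
from scipy.signal import butter, filtfilt, hilbert

# Illustrative sketch only; parameters are assumed, not taken from the paper.
def band_phase_locking(eeg, envelope, fs, band):
    """Phase-locking value in [0, 1] between one EEG channel and the speech
    envelope, both 1-D arrays sampled at `fs` Hz, within the given band."""
    b, a = butter(3, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="bandpass")
    phi_eeg = np.angle(hilbert(filtfilt(b, a, eeg)))
    phi_env = np.angle(hilbert(filtfilt(b, a, envelope)))
    return np.abs(np.mean(np.exp(1j * (phi_eeg - phi_env))))

# Hypothetical usage, comparing delta and theta bands for one participant:
# delta_plv = band_phase_locking(eeg, envelope, fs=250.0, band=(1.0, 4.0))
# theta_plv = band_phase_locking(eeg, envelope, fs=250.0, band=(4.0, 8.0))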


Frontiers in Neuroinformatics | 2009

Python for information theoretic analysis of neural data.

Robin A. A. Ince; Rasmus S. Petersen; Daniel C. Swan; Stefano Panzeri

Information theory, the mathematical theory of communication in the presence of noise, is playing an increasingly important role in modern quantitative neuroscience. It makes it possible to treat neural systems as stochastic communication channels and gain valuable, quantitative insights into their sensory coding function. These techniques provide results on how neurons encode stimuli in a way which is independent of any specific assumptions on which part of the neuronal response is signal and which is noise, and they can be usefully applied even to highly non-linear systems where traditional techniques fail. In this article, we describe our work and experiences using Python for information theoretic analysis. We outline some of the algorithmic, statistical and numerical challenges in the computation of information theoretic quantities from neural data. In particular, we consider the problems arising from limited sampling bias and from calculation of maximum entropy distributions in the presence of constraints representing the effects of different orders of interaction in the system. We explain how and why using Python has allowed us to significantly improve the speed and domain of applicability of the information theoretic algorithms, allowing analysis of data sets characterized by larger numbers of variables. We also discuss how our use of Python is facilitating integration with collaborative databases and centralised computational resources.
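
A typical step in such analyses is mapping binary spike-train "words" (one bit per time bin) to integer response codes and computing their entropy with vectorised numpy, which is where much of the speed gain discussed above comes from. The sketch below is a generic illustration of that step, not the API of the toolbox described in the article.

import numpy as np

# Illustrative sketch only; function names are assumptions, not the toolbox API.
def words_to_codes(spike_words):
    """Map binary spike words (trials x time bins) to integer codes by
    treating each word as a binary number."""
    weights = 1 << np.arange(spike_words.shape[1])   # 1, 2, 4, ...
    return spike_words.astype(int) @ weights

def plugin_entropy_bits(codes):
    """Plug-in entropy (bits) of a 1-D array of non-negative integer codes."""
    counts = np.bincount(codes)
    p = counts[counts > 0] / codes.size
    return -np.sum(p * np.log2(p))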


The Journal of Neuroscience | 2013

Neural Codes Formed by Small and Temporally Precise Populations in Auditory Cortex

Robin A. A. Ince; Stefano Panzeri; Christoph Kayser

The encoding of sensory information by populations of cortical neurons forms the basis for perception but remains poorly understood. To understand the constraints of cortical population coding, we analyzed neural responses to natural sounds recorded in auditory cortex of primates (Macaca mulatta). We estimated stimulus information while varying the composition and size of the considered population. Consistent with previous reports, we found that when choosing subpopulations randomly from the recorded ensemble, the average population information increases steadily with population size. This scaling was explained by a model assuming that each neuron carried equal amounts of information, and that any overlap between the information carried by each neuron arises purely from random sampling within the stimulus space. However, when studying subpopulations selected to optimize information for each given population size, the scaling of information was strikingly different: a small fraction of temporally precise cells carried the vast majority of information. This scaling could be explained by an extended model, assuming that the amount of information carried by individual neurons was highly nonuniform, with few neurons carrying large amounts of information. Importantly, these optimal populations can be determined by a single biophysical marker—the neurons' encoding time scale—allowing their detection and readout within biologically realistic circuits. These results show that extrapolations of population information based on random ensembles may overestimate the population size required for stimulus encoding, and that sensory cortical circuits may process information using small but highly informative ensembles.


Philosophical Transactions of the Royal Society B | 2014

Reading spike timing without a clock: intrinsic decoding of spike trains.

Stefano Panzeri; Robin A. A. Ince; Mathew E. Diamond; Christoph Kayser

The precise timing of action potentials of sensory neurons relative to the time of stimulus presentation carries substantial sensory information that is lost or degraded when these responses are summed over longer time windows. However, it is unclear whether and how downstream networks can access information in precise time-varying neural responses. Here, we review approaches to test the hypothesis that the activity of neural populations provides the temporal reference frames needed to decode temporal spike patterns. These approaches are based on comparing the single-trial stimulus discriminability obtained from neural codes defined with respect to network-intrinsic reference frames to the discriminability obtained from codes defined relative to the experimenter's computer clock. Application of this formalism to auditory, visual and somatosensory data shows that information carried by millisecond-scale spike times can be decoded robustly even with little or no independent external knowledge of stimulus time. In cortex, key components of such intrinsic temporal reference frames include dedicated neural populations that signal stimulus onset with reliable and precise latencies, and low-frequency oscillations that can serve as a reference for partitioning extended neuronal responses into informative spike patterns.


Journal of Vision | 2014

Eye coding mechanisms in early human face event-related potentials.

Guillaume A. Rousselet; Robin A. A. Ince; Nicola J. van Rijsbergen; Philippe G. Schyns

In humans, the N170 event-related potential (ERP) is an integrated measure of cortical activity that varies in amplitude and latency across trials. Researchers often conjecture that N170 variations reflect cortical mechanisms of stimulus coding for recognition. Here, to settle the conjecture and understand cortical information processing mechanisms, we unraveled the coding function of N170 latency and amplitude variations in possibly the simplest socially important natural visual task: face detection. On each experimental trial, 16 observers saw face and noise pictures sparsely sampled with small Gaussian apertures. Reverse-correlation methods coupled with information theory revealed that the presence of the eye specifically covaries with behavioral and neural measurements: the left eye strongly modulates reaction times and lateral electrodes represent mainly the presence of the contralateral eye during the rising part of the N170, with maximum sensitivity before the N170 peak. Furthermore, single-trial N170 latencies code more about the presence of the contralateral eye than N170 amplitudes and early latencies are associated with faster reaction times. The absence of these effects in control images that did not contain a face refutes alternative accounts based on retinal biases or allocation of attention to the eye location on the face. We conclude that the rising part of the N170, roughly 120-170 ms post-stimulus, is a critical time-window in human face processing mechanisms, reflecting predominantly, in a face detection task, the encoding of a single feature: the contralateral eye.
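
The stimuli described above are images sparsely sampled with small Gaussian apertures (the "Bubbles" technique). A minimal sketch of generating such a sampling mask is shown below; the aperture count and width are illustrative choices, and face_image is a hypothetical input array.

import numpy as np

# Illustrative sketch only; aperture parameters are assumptions.
def bubble_mask(shape, n_bubbles=10, sigma=8.0, rng=None):
    """Sparse sampling mask made of small Gaussian apertures."""
    rng = np.random.default_rng() if rng is None else rng
    h, w = shape
    yy, xx = np.mgrid[0:h, 0:w]
    mask = np.zeros(shape)
    for cy, cx in zip(rng.integers(0, h, n_bubbles), rng.integers(0, w, n_bubbles)):
        mask += np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * sigma ** 2))
    return np.clip(mask, 0.0, 1.0)

# Hypothetical usage: revealed = face_image * bubble_mask(face_image.shape)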


Human Brain Mapping | 2017

A statistical framework for neuroimaging data analysis based on mutual information estimated via a gaussian copula

Robin A. A. Ince; Bruno L. Giordano; Christoph Kayser; Guillaume A. Rousselet; Joachim Gross; Philippe G. Schyns

We begin by reviewing the statistical framework of information theory as applicable to neuroimaging data analysis. A major factor hindering wider adoption of this framework in neuroimaging is the difficulty of estimating information theoretic quantities in practice. We present a novel estimation technique that combines the statistical theory of copulas with the closed form solution for the entropy of Gaussian variables. This results in a general, computationally efficient, flexible, and robust multivariate statistical framework that provides effect sizes on a common meaningful scale, allows for unified treatment of discrete, continuous, unidimensional and multidimensional variables, and enables direct comparisons of representations from behavioral and brain responses across any recording modality. We validate the use of this estimate as a statistical test within a neuroimaging context, considering both discrete stimulus classes and continuous stimulus features. We also present examples of analyses facilitated by these developments, including application of multivariate analyses to MEG planar magnetic field gradients, and pairwise temporal interactions in evoked EEG responses. We show the benefit of considering the instantaneous temporal derivative together with the raw values of M/EEG signals as a multivariate response, how we can separately quantify modulations of amplitude and direction for vector quantities, and how we can measure the emergence of novel information over time in evoked responses. Open‐source Matlab and Python code implementing the new methods accompanies this article. Hum Brain Mapp 38:1541–1573, 2017.
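
The estimator described above combines a rank (copula) transform with the closed-form entropy of Gaussian variables. The sketch below illustrates the idea for two one-dimensional variables; it is a simplified illustration rather than the open-source toolbox that accompanies the article, and it omits the bias corrections and multivariate cases a full implementation handles.

import numpy as np
from scipy.special import ndtri   # inverse of the standard normal CDF

# Illustrative sketch only; not the accompanying Matlab/Python toolbox.
def copula_normalise(x):
    """Rank-transform x to (0, 1), then map through the inverse normal CDF
    so that the marginal distribution is exactly standard normal."""
    ranks = np.argsort(np.argsort(x)) + 1.0
    return ndtri(ranks / (len(x) + 1.0))

def gaussian_copula_mi(x, y):
    """Mutual information (bits) between two 1-D continuous variables,
    estimated parametrically on their copula-normalised values."""
    gx, gy = copula_normalise(x), copula_normalise(y)
    r = np.corrcoef(gx, gy)[0, 1]
    # closed form for a bivariate Gaussian: I = -0.5 * log2(1 - r^2)
    return -0.5 * np.log2(1.0 - r ** 2)

# Toy check: a noisy linear relationship gives a clearly positive estimate.
rng = np.random.default_rng(1)
x = rng.standard_normal(2000)
y = x + rng.standard_normal(2000)
print(gaussian_copula_mi(x, y))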

Collaboration


Dive into Robin A. A. Ince's collaborations.

Top Co-Authors

Stefano Panzeri

Istituto Italiano di Tecnologia

Mathew E. Diamond

International School for Advanced Studies


Rasmus S. Petersen

International School for Advanced Studies
