
Publication


Featured research published by Luc H. Arnal.


Nature Neuroscience | 2011

Transitions in neural oscillations reflect prediction errors generated in audiovisual speech

Luc H. Arnal; Valentin Wyart; Anne-Lise Giraud

According to the predictive coding theory, top-down predictions are conveyed by backward connections and prediction errors are propagated forward across the cortical hierarchy. Using MEG in humans, we show that violating multisensory predictions causes a fundamental and qualitative change in both the frequency and spatial distribution of cortical activity. When visual speech input correctly predicted auditory speech signals, a slow delta regime (3–4 Hz) developed in higher-order speech areas. In contrast, when auditory signals invalidated predictions inferred from vision, a low-beta (14–15 Hz) / high-gamma (60–80 Hz) coupling regime appeared locally in a multisensory area (area STS). This frequency shift in oscillatory responses scaled with the degree of audio-visual congruence and was accompanied by increased gamma activity in lower sensory regions. These findings are consistent with the notion that bottom-up prediction errors are communicated in predominantly high (gamma) frequency ranges, whereas top-down predictions are mediated by slower (beta) frequencies.
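As a rough illustration of the kind of band-limited power contrast described above, the following sketch computes power in the three frequency regimes named in the abstract (delta, low-beta, high-gamma) from a single time series, using a band-pass filter and the Hilbert envelope. The signal, sampling rate, and filter settings are assumptions for illustration, not the study's MEG pipeline.

```python
# Minimal sketch, not the authors' pipeline: band-limited power via
# band-pass filtering and the squared Hilbert envelope. Data are simulated.
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

fs = 600.0                                   # assumed sampling rate (Hz)
rng = np.random.default_rng(0)
trial = rng.standard_normal(int(4 * fs))     # 4 s of fake single-sensor data

def band_power(x, lo, hi, fs):
    """Mean power of x within [lo, hi] Hz (squared Hilbert envelope)."""
    sos = butter(4, [lo, hi], btype="band", fs=fs, output="sos")
    env = np.abs(hilbert(sosfiltfilt(sos, x)))
    return float(np.mean(env ** 2))

delta = band_power(trial, 3, 4, fs)      # slow regime (valid predictions)
beta = band_power(trial, 14, 15, fs)     # low-beta regime (top-down)
gamma = band_power(trial, 60, 80, fs)    # high-gamma regime (prediction errors)
print(f"delta {delta:.3f} | beta {beta:.3f} | gamma {gamma:.3f}")
```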


The Journal of Neuroscience | 2009

Dual Neural Routing of Visual Facilitation in Speech Processing

Luc H. Arnal; Benjamin Morillon; Christian A. Kell; Anne-Lise Giraud

Viewing our interlocutor facilitates speech perception, unlike, for instance, when we telephone. Several neural routes and mechanisms could account for this phenomenon. Using magnetoencephalography, we show that when seeing the interlocutor, latencies of auditory responses (M100) are shorter the more predictable speech is from visual input, whether or not the auditory signal is congruent with it. Incongruence of auditory and visual input affected auditory responses ∼20 ms after latency shortening was detected, indicating that initial content-dependent auditory facilitation by vision is followed by a feedback signal that reflects the error between expected and received auditory input (prediction error). We then used functional magnetic resonance imaging and confirmed that distinct routes of visual information to auditory processing underlie these two functional mechanisms. Functional connectivity between visual motion and auditory areas depended on the degree of visual predictability, whereas connectivity between the superior temporal sulcus and both auditory and visual motion areas was driven by audiovisual (AV) incongruence. These results establish two distinct mechanisms by which the brain uses potentially predictive visual information to improve auditory perception: a fast, direct corticocortical pathway that conveys visual motion parameters to auditory cortex, and a slower, indirect feedback pathway that signals the error between visual prediction and auditory input.
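To make the latency effect concrete, here is a minimal sketch of how the relation between visual predictability and M100 latency could be tested. All variable names and numbers are hypothetical, simulated for illustration; only the analysis logic mirrors the relation described in the abstract.

```python
# Hypothetical sketch: does M100 latency shorten as visual predictability
# increases? Data are simulated, not the study's measurements.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
predictability = rng.uniform(0, 1, 30)       # assumed per-stimulus scores
m100_latency = (0.120 - 0.015 * predictability
                + rng.normal(0, 0.003, 30))  # latency in seconds

r, p = pearsonr(predictability, m100_latency)
print(f"r = {r:.2f}, p = {p:.3g}")  # negative r: more predictable -> earlier M100
```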


Frontiers in Human Neuroscience | 2012

Predicting “When” Using the Motor System’s Beta-Band Oscillations

Luc H. Arnal

Anticipating future sensory events is one keystone of adaptive behavior. This notion is at the origin of recent theories suggesting that perception and action control rely on internal models that are constantly tested and updated as a function of incoming sensory inputs. These hierarchical models (predictive coding and other generative models based on the notion of inference) suggest that neural responses reflect the difference between top-down expectations (or priors) and incoming feed-forward sensory inputs (Rao and Ballard, 1999; Friston, 2005). Priors are formed via the extraction of statistical regularities and can therefore relate to many different dimensions of a sensory event (e.g., spatial, spectral, temporal…; Arnal and Giraud, 2012). In the temporal domain, isochronous rhythm arguably constitutes the most basic regularity that can be used to anticipate the occurrence of an event. Consistent with such a notion of predictive timing, temporally anticipating a sound in an isochronous stream reduces the uncertainty about its occurrence (Rohenkohl et al., 2012) and therefore leads to a decrease in the amplitude of electrophysiological responses in the auditory cortex (Costa-Faidella et al., 2011).

The neural origin and computations supporting predictive timing remain unclear, but a new study by Fujioka et al. (2012) points to a potential function of sensorimotor beta-band oscillations in the control of temporal anticipation during beat perception. The authors recorded neuromagnetic activity in participants who passively listened to isochronous (2.5, 1.7, and 1.3 Hz) or anisochronous (randomly spaced) tone sequences presented in four different blocks. Importantly, subjects' attention was engaged in viewing a silent movie, and they were explicitly asked to ignore the sounds. Using a dipole source model of auditory responses, the authors first focused on the time-frequency profiles in response to these sequences in the auditory cortex. Consistent with previous findings, they observed transient increases of low-frequency (<30 Hz) power, time-locked to the stimulus. More interestingly, while beta-band activity (13–25 Hz) consistently decreased after stimulus onset in every condition, the following beta rebound (beta resynchronization) increased differentially as a function of beat patterns. By evaluating the rising slope of the beta rebound for each condition, they found that the beta rebound was fast and transient during aperiodic stimulation, whereas it increased progressively to reach a maximum at the occurrence of the subsequent sound in the isochronous conditions. They also observed that the magnitude of the post-stimulus beta decrease co-varied with the frequency of stimulation (i.e., the slower the rhythm, the larger the beta decrease). Based on these findings, the authors suggest that the beta rebound tracks the tempo of stimulation in the auditory cortex and can be used to maintain predictive timing.

Because beta-band activity is classically considered to be related to motor functions (Engel and Fries, 2010), the authors extended their investigation to whole-brain beta activity using a spatio-temporal principal component analysis. In addition to auditory regions, they identified a large network of sensorimotor regions implicated in the tracking of beat tempo. This suggests that the motor system is recruited during the passive perception of rhythms, even in the absence of any intention to move in synchrony with the beat.
One might interpret this result as reflecting the passive, rhythmic entrainment of auditory and motor systems. In that case, though, one would be unlikely to observe tempo-dependent beta rebounds following the post-stimulus beta suppression. The fact that the beta rebound progressively increases and is maximal at the onset of upcoming sounds more likely supports an active, predictive timing account. These results suggest that (i) the motor system is automatically recruited during passive listening to anticipate forthcoming sounds, (ii) predictive timing makes it possible to control neural activity in sensory regions, and (iii) beta-band oscillations play an instrumental role in predictive timing. By examining in more detail the interactions between auditory and motor systems, the authors determined that post-stimulus event-related beta-coherence varied in opposite directions in these two systems: while beta-coherence decreased after stimulus onset in auditory regions, it simultaneously increased in motor areas. This may suggest that beta oscillations are used to control predictive timing via sensorimotor loops between auditory and motor systems. This interpretation must be viewed with caution, given the absence of a clear causal relationship between the time courses of beta activity in these regions. Nevertheless, these results converge to support a functional role of beta activity in the predictive modulation of auditory activity by the motor system.
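To make the "rising slope of the beta rebound" measure concrete, here is a minimal sketch under assumed parameters (a simulated source signal and the 2.5 Hz isochronous condition). It illustrates the logic of the measure only; it is not Fujioka et al.'s actual analysis.

```python
# Assumed-data sketch: fit a line from the post-stimulus beta-power minimum
# to the next tone onset, yielding a per-interval rebound slope.
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

fs = 600.0                              # assumed sampling rate (Hz)
isi = 1 / 2.5                           # 2.5 Hz isochronous condition -> 0.4 s
rng = np.random.default_rng(1)
x = rng.standard_normal(int(60 * fs))   # fake 60 s auditory-source signal

sos = butter(4, [13, 25], btype="band", fs=fs, output="sos")
beta_env = np.abs(hilbert(sosfiltfilt(sos, x))) ** 2   # beta power envelope

onsets = np.arange(1.0, 59.0, isi)      # hypothetical tone-onset times (s)
slopes = []
for t in onsets:
    seg = beta_env[int(t * fs): int((t + isi) * fs)]   # one inter-tone interval
    k = int(np.argmin(seg))                            # beta suppression minimum
    if k < len(seg) - 2:
        tt = np.arange(len(seg) - k) / fs
        slopes.append(np.polyfit(tt, seg[k:], 1)[0])   # rebound slope (power/s)
print("mean rebound slope:", np.mean(slopes))
```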


Frontiers in Psychology | 2012

Asymmetric Function of Theta and Gamma Activity in Syllable Processing: An Intra-Cortical Study

Benjamin Morillon; Catherine Liégeois-Chauvel; Luc H. Arnal; Christian G. Bénar; Anne-Lise Giraud

Low-gamma (25–45 Hz) and theta (4–8 Hz) oscillations are proposed to underpin the integration of phonemic and syllabic information, respectively. How these two scales of analysis split functions across hemispheres is unclear. We analyzed cortical responses from an epileptic patient with a rare bilateral electrode implantation (stereotactic EEG) in primary (A1/BA41 and A2/BA42) and association auditory cortices (BA22). Using time-frequency analyses, we confirmed that 5–6 Hz theta activity dominates in the right, and low-gamma (25–45 Hz) activity in the left, primary auditory cortices (A1/A2), during both resting state and syllable processing. We further detected high-theta (7–8 Hz) resting activity in left primary, but also associative, auditory regions. In left BA22, its phase correlated with high-gamma induced power. Such a hierarchical relationship across theta and gamma frequency bands (theta/gamma phase-amplitude coupling) could index the process by which the neural code shifts from stimulus-feature encoding to phonological encoding, and is associated with the transition from evoked to induced power responses. These data suggest that theta and gamma activity in the right and left auditory cortices bear different functions. They support a scheme in which slow parsing of the acoustic information dominates in the right hemisphere at a syllabic (5–6 Hz) rate, while the left auditory cortex exhibits a more complex cascade of oscillations, reflecting the possible extraction of transient acoustic cues at a fast (~25–45 Hz) rate that are subsequently integrated at a slower, e.g., syllabic, one. Slow oscillations could functionally participate in speech processing by structuring gamma activity in left BA22, where abstract percepts emerge.
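As an illustration of the theta/gamma phase-amplitude coupling measure mentioned above, the following sketch computes a mean-vector-length modulation index in the style of Canolty et al. (2006) on simulated data. The sampling rate, frequency bands, and signal are assumptions, not the study's sEEG pipeline.

```python
# Minimal PAC sketch: high-theta phase modulating high-gamma amplitude,
# quantified as the amplitude-weighted mean phase vector length.
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

fs = 1000.0                              # assumed sEEG sampling rate (Hz)
rng = np.random.default_rng(2)
x = rng.standard_normal(int(30 * fs))    # fake 30 s recording

def band(x, lo, hi, fs):
    sos = butter(4, [lo, hi], btype="band", fs=fs, output="sos")
    return sosfiltfilt(sos, x)

theta_phase = np.angle(hilbert(band(x, 7, 8, fs)))     # high-theta phase
gamma_amp = np.abs(hilbert(band(x, 60, 120, fs)))      # high-gamma amplitude

# Modulation index: magnitude of the amplitude-weighted mean phase vector.
mi = np.abs(np.mean(gamma_amp * np.exp(1j * theta_phase)))
print(f"theta/gamma modulation index: {mi:.4f}")
```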


The Journal of Neuroscience | 2016

Temporal Prediction in lieu of Periodic Stimulation

Benjamin Morillon; Charles E. Schroeder; Valentin Wyart; Luc H. Arnal

Predicting not only what will happen, but also when it will happen is extremely helpful for optimizing perception and action. Temporal predictions driven by periodic stimulation increase perceptual sensitivity and reduce response latencies. At the neurophysiological level, a single mechanism has been proposed to mediate this twofold behavioral improvement: the rhythmic entrainment of slow cortical oscillations to the stimulation rate. However, temporal regularities can occur in aperiodic contexts, suggesting that temporal predictions per se may be dissociable from entrainment to periodic sensory streams. We investigated this possibility in two behavioral experiments, asking human participants to detect near-threshold auditory tones embedded in streams whose temporal and spectral properties were manipulated. While our findings confirm that periodic stimulation reduces response latencies, in agreement with the hypothesis of a stimulus-driven entrainment of neural excitability, they further reveal that this motor facilitation can be dissociated from the enhancement of auditory sensitivity. Perceptual sensitivity improvement is unaffected by the nature of temporal regularities (periodic vs aperiodic), but contingent on the co-occurrence of a fulfilled spectral prediction. Altogether, the dissociation between predictability and periodicity demonstrates that distinct mechanisms flexibly and synergistically operate to facilitate perception and action.

SIGNIFICANCE STATEMENT: Temporal predictions are increasingly recognized as fundamental instruments for optimizing performance, enabling mammals to exploit regularities in the world. However, the notion of temporal predictions is often confounded with the idea of entrainment to periodic sensory inputs. At the behavioral level, it is also unclear whether perceptual sensitivity and reaction time improvements benefit the same way from temporal predictions and periodic stimulation. In two behavioral experiments on human participants, we find that periodic stimulation facilitates response readiness, whereas temporal predictions improve the precision of auditory processing. This latter effect arises regardless of the nature of temporal regularities (periodic vs aperiodic), but depends on the co-occurrence of a fulfilled spectral prediction.
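The behavioral dissociation above hinges on measuring sensitivity and speed separately. Here is a minimal sketch of that computation on hypothetical counts and reaction times: d' from hit and false-alarm rates, and median RT per condition. None of the numbers come from the study.

```python
# Hypothetical sketch: perceptual sensitivity (d') vs response speed
# (median RT) in periodic and aperiodic streams. Data are invented.
import numpy as np
from scipy.stats import norm

def dprime(hits, misses, fas, crs):
    """Signal-detection sensitivity with a 0.5 edge correction."""
    hr = (hits + 0.5) / (hits + misses + 1.0)
    far = (fas + 0.5) / (fas + crs + 1.0)
    return norm.ppf(hr) - norm.ppf(far)

conditions = {
    "periodic":  dict(hits=78, misses=22, fas=10, crs=90,
                      rts=np.array([0.41, 0.39, 0.44, 0.40])),
    "aperiodic": dict(hits=77, misses=23, fas=11, crs=89,
                      rts=np.array([0.48, 0.50, 0.47, 0.49])),
}
for name, c in conditions.items():
    d = dprime(c["hits"], c["misses"], c["fas"], c["crs"])
    print(f"{name}: d' = {d:.2f}, median RT = {np.median(c['rts']):.3f} s")
```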


Handbook of Clinical Neurology | 2015

Temporal coding in the auditory cortex

Luc H. Arnal; David Poeppel; Anne-Lise Giraud

Speech is a complex acoustic signal showing a quasiperiodic structure at several timescales. Integrated neural signals recorded in the cortex also show periodicity at different timescales. In this chapter we outline the neural mechanisms that potentially allow the auditory cortex to segment and encode continuous speech. This chapter focuses on how the human auditory cortex uses the temporal structure of the acoustic signal to extract phonemes and syllables, the two major constituents of connected speech. We argue that the quasiperiodic structure of collective neural activity in auditory cortex represents the ideal mechanistic infrastructure to fractionate continuous speech into linguistic constituents of variable sizes.


The Journal of Neuroscience | 2017

θ-Band and β-Band Neural Activity Reflects Independent Syllable Tracking and Comprehension of Time-Compressed Speech

Maria Pefkou; Luc H. Arnal; Lorenzo Fontolan; Anne-Lise Giraud

Recent psychophysics data suggest that speech perception is not limited by the capacity of the auditory system to encode fast acoustic variations through neural γ activity, but rather by the time given to the brain to decode them. Whether the decoding process is bounded by the capacity of θ rhythm to follow syllabic rhythms in speech, or constrained by a more endogenous top-down mechanism, e.g., involving β activity, is unknown. We addressed the dynamics of auditory decoding in speech comprehension by challenging syllable tracking and speech decoding using comprehensible and incomprehensible time-compressed auditory sentences. We recorded EEG in human participants and found that neural activity in both θ and γ ranges was sensitive to syllabic rate. Phase patterns of slow neural activity consistently followed the syllabic rate (4–14 Hz), even when this rate went beyond the classical θ range (4–8 Hz). The power of θ activity increased linearly with syllabic rate but showed no sensitivity to comprehension. Conversely, the power of β (14–21 Hz) activity was insensitive to the syllabic rate, yet reflected comprehension on a single-trial basis. We found different long-range dynamics for θ and β activity, with β activity building up over time as more contextual information becomes available. This is consistent with the roles of θ and β activity in stimulus-driven versus endogenous mechanisms. These data show that speech comprehension is constrained by concurrent stimulus-driven θ and low-γ activity, and by endogenous β activity, but not primarily by the capacity of θ activity to track the syllabic rhythm.

SIGNIFICANCE STATEMENT: Speech comprehension partly depends on the ability of the auditory cortex to track syllable boundaries with θ-range neural oscillations. The reason comprehension drops when speech is accelerated could hence be that θ oscillations can no longer follow the syllabic rate. Here, we presented subjects with comprehensible and incomprehensible accelerated speech, and show that neural phase patterns in the θ band consistently reflect the syllabic rate, even when speech becomes too fast to be intelligible. The drop in comprehension, however, is signaled by a significant decrease in the power of low-β oscillations (14–21 Hz). These data suggest that speech comprehension is not limited by the capacity of θ oscillations to adapt to syllabic rate, but by an endogenous decoding process.
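To illustrate the "phase patterns following the syllabic rate" measure, here is a minimal sketch computing inter-trial phase coherence (ITC) in a narrow band centered on an assumed syllabic rate. The data, rate, and filter settings are illustrative, not the study's EEG pipeline.

```python
# Simulated-trials sketch: ITC at an assumed syllabic rate, from per-trial
# Hilbert phase in a narrow band around that rate.
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

fs, syll_rate = 250.0, 9.0               # assumed EEG and syllable rates (Hz)
rng = np.random.default_rng(3)
trials = rng.standard_normal((40, int(3 * fs)))  # 40 fake 3-s trials

sos = butter(4, [syll_rate - 1, syll_rate + 1],
             btype="band", fs=fs, output="sos")
phases = np.angle(hilbert(sosfiltfilt(sos, trials, axis=-1), axis=-1))

# ITC: length of the mean unit phase vector across trials, per time point.
itc = np.abs(np.mean(np.exp(1j * phases), axis=0))
print(f"peak ITC across time: {itc.max():.3f}")
```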


Physiology & Behavior | 2018

Explaining individual variation in paternal brain responses to infant cries

Ting Li; Marilyn Horta; Jennifer S. Mascaro; Kelly R. Bijanki; Luc H. Arnal; Melissa C. Adams; Ronald G. Barr; James K. Rilling

Crying is the principal means by which newborn infants shape parental behavior to meet their needs. While this mechanism can be highly effective, infant crying can also be an aversive stimulus that leads to parental frustration and even abuse. Fathers have recently become more involved in direct caregiving activities in modern, developed nations, and fathers are more likely than mothers to physically abuse infants. In this study, we attempt to explain variation in the neural response to infant crying among human fathers, with the hope of identifying factors that are associated with a more or less sensitive response. We imaged brain function in 39 first-time fathers of newborn infants as they listened to both their own and a standardized unknown infant cry stimulus, as well as auditory control stimuli, and evaluated whether these neural responses were correlated with measured characteristics of fathers and infants that were hypothesized to modulate these responses. Fathers also provided subjective ratings of each cry stimulus on multiple dimensions. Fathers showed widespread activation to both own and unknown infant cries in neural systems involved in empathy and approach motivation. There was no significant difference in the neural response to the own vs. unknown infant cry, and many fathers were unable to distinguish between the two cries. Comparison of these results with previous studies in mothers revealed a high degree of similarity between first-time fathers and first-time mothers in the pattern of neural activation to newborn infant cries. Further comparisons suggested that younger infant age was associated with stronger paternal neural responses, perhaps due to hormonal or novelty effects. In our sample, older fathers found infant cries less aversive and had an attenuated response to infant crying in both the dorsal anterior cingulate cortex (dACC) and the anterior insula, suggesting that compared with younger fathers, older fathers may be better able to avoid the distress associated with empathic over-arousal in response to infant cries. A principal components analysis revealed that fathers with more negative emotional reactions to the unknown infant cry showed decreased activation in the thalamus and caudate nucleus, regions expected to promote positive parental behaviors, as well as increased activation in the hypothalamus and dorsal ACC, again suggesting that empathic over-arousal might result in negative emotional reactions to infant crying. In sum, our findings suggest that infant age, paternal age and paternal emotional reactions to infant crying all modulate the neural response of fathers to infant crying. By identifying neural correlates of variation in paternal subjective reactions to infant crying, these findings help lay the groundwork for evaluating the effectiveness of interventions designed to increase paternal sensitivity and compassion.
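As a sketch of the principal components analysis step described above, the following code extracts a first component from a (fabricated) fathers-by-rating-scales matrix via SVD. The matrix dimensions and interpretation are assumptions for illustration only, not the study's data or pipeline.

```python
# Illustrative PCA: a "negative reactivity" component from fathers'
# multi-dimensional cry ratings. The ratings matrix is fabricated.
import numpy as np

rng = np.random.default_rng(5)
ratings = rng.normal(size=(39, 6))   # 39 fathers x 6 hypothetical rating scales
Z = (ratings - ratings.mean(0)) / ratings.std(0)   # standardize columns

# PCA via SVD of the standardized ratings; component scores could then be
# regressed against voxelwise brain responses.
U, S, Vt = np.linalg.svd(Z, full_matrices=False)
pc1_scores = Z @ Vt[0]                              # per-father PC1 score
explained = S[0] ** 2 / np.sum(S ** 2)
print(f"PC1 explains {explained:.0%} of rating variance")
print("first five PC1 scores:", np.round(pc1_scores[:5], 2))
```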


Neurobiology of Language | 2016

A Neurophysiological Perspective on Speech Processing in “The Neurobiology of Language”

Luc H. Arnal; David Poeppel; Anne-Lise Giraud

Speech is the dominant means of communication in humans, and it is perhaps the most complex stimulus that the human brain is exposed to every day. Although the processing of complex spectrotemporal acoustic signals by the auditory system is fairly well understood, the specifics of speech decoding remain quite enigmatic. This chapter builds on the idea that the human auditory cortex is ideally suited to extract the temporal structure of the main acoustic units of speech (i.e., phonemes and syllables). We bring together different models grounded on the assumption that the quasiperiodic temporal structure of collective neural activity in auditory cortex represents the ideal mechanistic infrastructure to solve the speech demultiplexing problem (i.e., the fractionation of connected speech into linguistic constituents of variable size).


Trends in Cognitive Sciences | 2012

Cortical oscillations and sensory predictions

Luc H. Arnal; Anne-Lise Giraud

Collaboration


Top co-authors of Luc H. Arnal:

Benjamin Morillon

Columbia University Medical Center

Christian A. Kell

Goethe University Frankfurt

Valentin Wyart

École Normale Supérieure
