Publication


Featured research published by Anne Kösem.


PLOS ONE | 2012

Temporal structure in audiovisual sensory selection.

Anne Kösem; Virginie van Wassenhove

In natural environments, sensory information is embedded in temporally contiguous streams of events. This is typically the case when seeing and listening to a speaker or when engaged in scene analysis. In such contexts, two mechanisms are needed to single out and build a reliable representation of an event (or object): the temporal parsing of information and the selection of relevant information in the stream. It has previously been shown that rhythmic events naturally build temporal expectations that improve sensory processing at predictable points in time. Here, we asked to what extent temporal regularities can improve the detection and identification of events across sensory modalities. To do so, we used a dynamic visual conjunction search task accompanied by auditory cues that were either synchronized or desynchronized with the color change of the target (a horizontal or vertical bar). Sounds synchronized with the visual target improved search efficiency for temporal rates below 1.4 Hz but did not affect efficiency above that stimulation rate. Desynchronized auditory cues consistently impaired visual search below 3.3 Hz. Our results are interpreted in the context of the Dynamic Attending Theory: specifically, we suggest that a cognitive operation structures events in time irrespective of the sensory modality of input. Our results further support and specify recent neurophysiological findings by showing strong temporal selectivity for audiovisual integration in the auditory-driven improvement of visual search efficiency.


Language, Cognition and Neuroscience | 2017

Distinct contributions of low- and high-frequency neural oscillations to speech comprehension

Anne Kösem; Virginie van Wassenhove

In the last decade, the involvement of neural oscillatory mechanisms in speech comprehension has been increasingly investigated. Current evidence suggests that low-frequency and high-frequency neural entrainment to the acoustic dynamics of speech are linked to its analysis. One crucial question is whether acoustical processing primarily modulates neural entrainment, or whether entrainment instead reflects linguistic processing. Here, we review studies investigating the effect of linguistic manipulations on neural oscillatory activity. In light of the current findings, we argue that theta (3–8 Hz) entrainment may primarily reflect the analysis of the acoustic features of speech. In contrast, recent evidence suggests that delta (1–3 Hz) and high-frequency activity (>40 Hz) are reliable indicators of perceived linguistic representations. The interdependence between low-frequency and high-frequency neural oscillations, as well as their causal role in speech comprehension, is further discussed with regard to neurophysiological models of speech processing.


Journal of Neurophysiology | 2016

High-frequency neural activity predicts word parsing in ambiguous speech streams

Anne Kösem; Anahita Basirat; Leila Azizi; Virginie van Wassenhove

During speech listening, the brain parses a continuous acoustic stream of information into computational units (e.g., syllables or words) necessary for speech comprehension. Recent neuroscientific hypotheses have proposed that neural oscillations contribute to speech parsing, but whether they do so on the basis of acoustic cues (bottom-up acoustic parsing) or as a function of available linguistic representations (top-down linguistic parsing) is unknown. In this magnetoencephalography study, we contrasted acoustic and linguistic parsing using bistable speech sequences. While listening to the speech sequences, participants were asked to maintain one of the two possible speech percepts through volitional control. We predicted that the tracking of speech dynamics by neural oscillations would not only follow the acoustic properties but also shift in time according to the participants' conscious speech percept. Our results show that the latency of high-frequency activity (specifically, beta and gamma bands) varied as a function of the perceptual report. In contrast, the phase of low-frequency oscillations was not strongly affected by top-down control. Whereas changes in low-frequency neural oscillations were compatible with the encoding of prelexical segmentation cues, high-frequency activity specifically informed on an individual's conscious speech percept.


Multisensory Research | 2013

Phase encoding of perceived event timing

Anne Kösem; Virginie van Wassenhove

Temporal coincidence is a crucial feature for audiovisual (AV) integration, and recent neurophysiological findings provide evidence for the role of oscillations in the temporal binding of information. Specifically, low-frequency oscillatory phase alignment between visual and auditory cortices has been proposed to support AV integration in time (Lakatos et al., 2008; Schroeder and Lakatos, 2009). Here we hypothesized that if phase coincidence is critical for perceiving synchrony between auditory and visual events, shifts in the phase should be commensurate with the subjective perception of AV events. To test this, participants underwent AV temporal recalibration (Fujisaki et al., 2004) while being recorded with magnetoencephalography (MEG). Temporal recalibration consisted of adapting subjects to desynchronized AV events in order to induce perceptual shifts of AV temporal order. Each recalibration period was followed by an assessment of participants' temporal order thresholds. Stimuli during recalibration were presented at an average rate of 1 Hz, leading to prominent tagging in both auditory and visual cortices. Analyses were carried out in source space (dSPM, cortically constrained source orientations) and restricted to sensory cortices. We show that the variability in perceived temporal order is captured by non-stationary 1 Hz entrainment: the preferential phase at this frequency was significantly shifted between the beginning and the end of the recalibration period in sensory cortices. Individuals' perceived simultaneity could be accounted for by systematic shifts in the phase of auditory but not visual neural responses. Altogether, our results provide a novel physiological index of subjective simultaneity, suggesting that auditory cortex recalibrates its timing to the visual spatial anchor.


PLOS ONE | 2015

Hysteresis in Audiovisual Synchrony Perception

Jean-Rémy Martin; Anne Kösem; Virginie van Wassenhove

Stimulation history can influence the perception of a current event in two opposite ways, namely adaptation or hysteresis: the perception of the current event goes in the opposite or in the same direction as prior stimulation, respectively. In audiovisual (AV) synchrony perception, adaptation effects have primarily been reported. Here, we tested whether perceptual hysteresis could also be observed, over and above adaptation, in AV timing perception by varying different experimental conditions. Participants were asked to judge the synchrony of the last (test) stimulus of an AV sequence with either constant or gradually changing AV intervals (constant and dynamic conditions, respectively). The onset timing of the test stimulus could either be cued or not (prospective vs. retrospective conditions, respectively). We observed hysteretic effects for AV synchrony judgments in the retrospective condition that were independent of the constant or dynamic nature of the adapted stimuli; these effects disappeared in the prospective condition. The present findings suggest that knowing when to estimate a stimulus property has a crucial impact on perceptual simultaneity judgments. Our results extend beyond AV timing perception and have strong implications for the comparative study of hysteresis and adaptation phenomena.


Current Biology | 2018

Neural Entrainment Determines the Words We Hear

Anne Kösem; Hans R. Bosker; Atsuko Takashima; Antje S. Meyer; Ole Jensen; Peter Hagoort

Low-frequency neural entrainment to rhythmic input has been hypothesized as a canonical mechanism that shapes sensory perception in time. Neural entrainment is deemed particularly relevant for speech analysis, as it would contribute to the extraction of discrete linguistic elements from continuous acoustic signals. However, its causal influence in speech perception has been difficult to establish. Here, we provide evidence that oscillations build temporal predictions about the duration of speech tokens that affect perception. Using magnetoencephalography (MEG), we studied neural dynamics during listening to sentences that changed in speech rate. We observed neural entrainment to preceding speech rhythms persisting for several cycles after the change in rate. The sustained entrainment was associated with changes in the perceived duration of the last word's vowel, resulting in the perception of words with different meanings. These findings support oscillatory models of speech processing, suggesting that neural oscillations actively shape speech perception.


Multisensory Research | 2013

When the brain fails to recalibrate audiovisual simultaneity: Hysteresis in synchrony perception

Anne Kösem; Virginie van Wassenhove; Jean-Rémy Martin

To accurately perceive the synchrony of visual and auditory events, the brain has to compensate for the speed differences between light and sound. This compensation may be accomplished by the mechanism of temporal recalibration (Fujisaki et al., 2004; Vroomen et al., 2004), a mechanism through which subjects become more tolerant to a constant audiovisual (AV) asynchrony by virtue of adaptation. However, a constant AV lag implies that both the source and the observer are fixed in space; in an ecological setting, if a source is moving backward, away from the observer, it will induce a progressive change in the AV delay. The present study shows that synchrony perception of slowly synchronizing AV stimuli is driven by a persistence effect, or hysteresis (Hock et al., 1997). Specifically, when AV stimuli progressively synchronize, participants do not compensate for the initial asynchrony but rather persist in perceiving asynchrony; surprisingly, however, slowly desynchronizing stimuli do not alter the perception of synchrony. Our results show that synchrony perception strongly depends on the dynamics of past AV stimulation. Whereas temporal recalibration effects suggest that the brain has a natural tendency to compensate for AV time lags emitted from a distant stationary source, our results suggest that compensation of AV lags for a moving source may not occur and that perceived asynchrony persists.


NeuroImage | 2014

Encoding of event timing in the phase of neural oscillations

Anne Kösem; Alexandre Gramfort; Virginie van Wassenhove


Conference of the International Speech Communication Association | 2017

An entrained rhythm's frequency, not phase, influences temporal sampling of speech

Hans R. Bosker; Anne Kösem


Journal of Cognitive Neuroscience | 2017

Prestimulus Alpha Oscillations and the Temporal Sequencing of Audiovisual Events

Laetitia Grabot; Anne Kösem; Leila Azizi; Virginie van Wassenhove

Collaboration


Dive into Anne Kösem's collaborations.

Top Co-Authors

Ole Jensen

University of Birmingham
