Anne Keitel
University of Glasgow
Publications
Featured research published by Anne Keitel.
Journal of Experimental Child Psychology | 2013
Anne Keitel; Wolfgang Prinz; Angela D. Friederici; Claes von Hofsten; Moritz M. Daum
In conversations, adults readily detect and anticipate the end of a speaker's turn. However, little is known about the development of this ability. We addressed two important aspects involved in the perception of conversational turn taking: semantic content and intonational form. The influence of semantics was investigated by testing prelinguistic and linguistic children. The influence of intonation was tested by presenting participants with videos of two dyadic conversations: one with normal intonation and one with flattened (removed) intonation. Children of four age groups (two prelinguistic groups, 6- and 12-month-olds, and two linguistic groups, 24- and 36-month-olds) and an adult group participated. Their eye movements were recorded, and the frequency of anticipated turns was analyzed. Our results show that (a) the anticipation of turns was reliable only in 3-year-olds and adults, with younger children shifting their gaze between speakers regardless of the turn taking, and (b) only 3-year-olds anticipated turns better if intonation was normal. These results indicate that children anticipate turns in conversations in a manner comparable (but not identical) to adults only after they have developed a sophisticated understanding of language. In contrast to adults, 3-year-olds rely more strongly on prosodic information during the perception of conversational turn taking.
PLOS Biology | 2016
Anne Keitel; Joachim Gross
The human brain can be parcellated into diverse anatomical areas. We investigated whether rhythmic brain activity in these areas is characteristic and can be used for automatic classification. To this end, resting-state MEG data from 22 healthy adults were analysed. Power spectra of 1-s-long data segments for atlas-defined brain areas were clustered into spectral profiles ("fingerprints"), using k-means and Gaussian mixture (GM) modelling. We demonstrate that individual areas can be identified from these spectral profiles with high accuracy. Our results suggest that each brain area engages in different spectral modes that are characteristic for individual areas. Clustering of brain areas according to similarity of spectral profiles reveals well-known brain networks. Furthermore, we demonstrate task-specific modulations of auditory spectral profiles during auditory processing. These findings have important implications for the classification of regional spectral activity and allow for novel approaches in neuroimaging and neurostimulation in health and disease.
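The clustering step described above can be illustrated with a minimal sketch: power spectra of short data segments are grouped by k-means, and each cluster centre serves as one spectral "fingerprint". This is a toy reconstruction on synthetic spectra, not the authors' analysis pipeline; all frequencies, peak positions, and parameters below are illustrative assumptions.

```python
import numpy as np

# Toy version of the "spectral fingerprint" idea: cluster short-segment
# power spectra with k-means. Synthetic data; not the paper's pipeline.
rng = np.random.default_rng(0)
freqs = np.arange(1, 41)  # 1-40 Hz at 1-Hz resolution

def spectra(peak_hz, n):
    """n synthetic 1/f power spectra with a Gaussian peak at peak_hz."""
    base = 1.0 / freqs
    peak = np.exp(-0.5 * ((freqs - peak_hz) / 2.0) ** 2)
    return base + peak + 0.05 * rng.random((n, freqs.size))

# Two spectral "modes": alpha-dominated (10 Hz) and beta-dominated (20 Hz)
X = np.vstack([spectra(10, 50), spectra(20, 50)])

def kmeans(X, k=2, n_iter=25):
    # Farthest-point seeding: the second centre is the segment most
    # unlike the first, which makes the small demo reproducible.
    centres = np.array([X[0], X[((X - X[0]) ** 2).sum(1).argmax()]])
    for _ in range(n_iter):
        labels = ((X[:, None] - centres) ** 2).sum(-1).argmin(1)
        centres = np.array([X[labels == j].mean(0) for j in range(k)])
    return labels, centres

labels, centres = kmeans(X)
# Each centre is one "fingerprint"; its peak frequency names the mode
fingerprint_peaks = sorted(int(freqs[c.argmax()]) for c in centres)
```

With the two well-separated synthetic modes, the recovered fingerprints peak at 10 Hz and 20 Hz, mirroring how characteristic spectral profiles can identify a recording's source.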
NeuroImage | 2017
Anne Keitel; Robin A. A. Ince; Joachim Gross; Christoph Kayser
The timing of slow auditory cortical activity aligns to the rhythmic fluctuations in speech. This entrainment is considered to be a marker of the prosodic and syllabic encoding of speech, and has been shown to correlate with intelligibility. Yet, whether and how auditory cortical entrainment is influenced by the activity in other speech-relevant areas remains unknown. Using source-localized MEG data, we quantified the dependency of auditory entrainment on the state of oscillatory activity in fronto-parietal regions. We found that delta-band entrainment interacted with the oscillatory activity in three distinct networks. First, entrainment in the left anterior superior temporal gyrus (STG) was modulated by beta power in orbitofrontal areas, possibly reflecting predictive top-down modulations of auditory encoding. Second, entrainment in the left Heschl's gyrus and anterior STG was dependent on alpha power in central areas, in line with the importance of motor structures for phonological analysis. Third, entrainment in the right posterior STG modulated theta power in parietal areas, consistent with the engagement of semantic memory. These results illustrate the topographical network interactions of auditory delta entrainment and reveal distinct cross-frequency mechanisms by which entrainment can interact with different cognitive processes underlying speech perception.
Highlights: We study auditory cortical speech entrainment from a network perspective. We found three distinct networks interacting with delta entrainment in auditory cortex. Entrainment is modulated by frontal beta power, possibly indexing predictions. Central alpha power interacts with entrainment, suggesting motor involvement. Parietal theta is modulated by entrainment, suggesting working memory compensation.
Frontiers in Psychology | 2015
Anne Keitel; Moritz M. Daum
The anticipation of a speaker’s next turn is a key element of successful conversation. This can be achieved using a multitude of cues. In natural conversation, the most important cue for adults to anticipate the end of a turn (and therefore the beginning of the next turn) is the semantic and syntactic content. In addition, prosodic cues, such as intonation, or visual signals that occur before a speaker starts speaking (e.g., opening the mouth) help to identify the beginning and the end of a speaker’s turn. Early in life, prosodic cues seem to be more important than in adulthood. For example, it was previously shown that 3-year-old children anticipated more turns in observed conversations when intonation was available than when it was not, and this beneficial effect was present neither in younger children nor in adults (Keitel et al., 2013). In the present study, we investigated this effect in greater detail. Videos of conversations between puppets with either normal or flattened intonation were presented to children (1-year-olds and 3-year-olds) and adults. The use of puppets allowed the control of visual signals: the verbal signals (speech) started at exactly the same time as the visual signals (mouth opening). With respect to the children, our findings replicate the results of the previous study: 3-year-olds anticipated more turns with normal intonation than with flattened intonation, whereas 1-year-olds did not show this effect. In contrast to our previous findings, the adults showed the same intonation effect as the 3-year-olds. This suggests that adults’ cue use varies depending on the characteristics of a conversation. Our results further support the notion that the cues used to anticipate conversational turns differ in development.
Brain and Behavior | 2013
Silke Atmaca; Waltraud Stadler; Anne Keitel; Derek V. M. Ott; Jöran Lepsien; Wolfgang Prinz
The multiple object tracking (MOT) paradigm is a cognitive task that requires parallel tracking of several identical, moving objects following non-goal-directed, arbitrary motion trajectories.
PLOS Biology | 2018
Anne Keitel; Joachim Gross; Christoph Kayser
During online speech processing, our brain tracks the acoustic fluctuations in speech at different timescales. Previous research has focused on generic timescales (for example, delta or theta bands) that are assumed to map onto linguistic features such as prosody or syllables. However, given the high intersubject variability in speaking patterns, such a generic association between the timescales of brain activity and speech properties can be ambiguous. Here, we analyse speech tracking in source-localised magnetoencephalographic data by directly focusing on timescales extracted from statistical regularities in our speech material. This revealed widespread significant tracking at the timescales of phrases (0.6–1.3 Hz), words (1.8–3 Hz), syllables (2.8–4.8 Hz), and phonemes (8–12.4 Hz). Importantly, when examining its perceptual relevance, we found stronger tracking for correctly comprehended trials in the left premotor (PM) cortex at the phrasal scale as well as in left middle temporal cortex at the word scale. Control analyses using generic bands confirmed that these effects were specific to the speech regularities in our stimuli. Furthermore, we found that the phase at the phrasal timescale coupled to power at beta frequency (13–30 Hz) in motor areas. This cross-frequency coupling presumably reflects top-down temporal prediction in ongoing speech perception. Together, our results reveal specific functional and perceptually relevant roles of distinct tracking and cross-frequency processes along the auditory–motor pathway.
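The phase-power coupling reported above can be illustrated with a generic phase-amplitude coupling measure (mean vector length). The sketch below is a toy demonstration on synthetic signals, not the authors' MEG analysis; the frequencies, filter bands, and filter construction are illustrative assumptions.

```python
import numpy as np

# Toy demonstration of cross-frequency (phase-amplitude) coupling:
# the amplitude of a fast rhythm follows the phase of a slow one.
# Synthetic signals; illustrative of the idea, not the paper's pipeline.
fs = 200.0
t = np.arange(0, 30, 1 / fs)  # 30 s at 200 Hz
rng = np.random.default_rng(1)

slow = np.sin(2 * np.pi * 1.0 * t)  # 1-Hz "phrasal-scale" rhythm
coupled = slow + (1 + slow) * np.sin(2 * np.pi * 20 * t)  # beta amp tracks slow phase
uncoupled = slow + np.sin(2 * np.pi * 20 * t)             # constant beta amp
noise = 0.1 * rng.standard_normal(t.size)

def bandpass(x, lo, hi):
    """Crude FFT brick-wall band-pass filter (demo only)."""
    X = np.fft.rfft(x)
    f = np.fft.rfftfreq(x.size, 1 / fs)
    X[(f < lo) | (f > hi)] = 0
    return np.fft.irfft(X, x.size)

def analytic(x):
    """Analytic signal via FFT (numpy-only Hilbert transform, even N)."""
    X = np.fft.fft(x)
    h = np.zeros(x.size)
    h[0] = 1.0
    h[1 : x.size // 2] = 2.0
    h[x.size // 2] = 1.0
    return np.fft.ifft(X * h)

def pac(x):
    """Mean-vector-length phase-amplitude coupling (Canolty-style)."""
    phase = np.angle(analytic(bandpass(x, 0.5, 2.0)))  # slow-rhythm phase
    amp = np.abs(analytic(bandpass(x, 15.0, 25.0)))    # beta amplitude
    return np.abs(np.mean(amp * np.exp(1j * phase)))
```

pac(coupled + noise) comes out clearly above pac(uncoupled + noise), because in the coupled signal the beta envelope is concentrated at one slow-rhythm phase, so the amplitude-weighted phase vectors do not cancel.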
PLOS ONE | 2014
Anne Keitel; Wolfgang Prinz; Moritz M. Daum
Infants and adults frequently observe actions performed jointly by more than one person. Research in action perception, however, has focused largely on actions performed by an individual person. Here, we explore how 9- and 12-month-old infants and adults perceive a block-stacking action performed by either one agent (individual condition) or two agents (joint condition). We used eye tracking to measure the latency of participants’ gaze shifts towards action goals. Adults anticipated goals in both conditions significantly faster than infants, and their gaze latencies did not differ between conditions. By contrast, infants showed faster anticipation of goals in the individual condition than in the joint condition. This difference was more pronounced in 9-month-olds. Further analyses of fixations examined the role of visual attention in action perception. These findings are cautiously interpreted in terms of low-level processing in infants and higher-level processing in adults. More precisely, our results suggest that adults are able to infer the overarching joint goal of two agents, whereas infants are not yet able to do so and might rely primarily on visual cues to infer the respective sub-goals. In conclusion, our findings indicate that the perception of joint action in infants develops differently from that of individual action.
Frontiers in Psychology | 2015
Anja Gampe; Anne Keitel; Moritz M. Daum
The development of action and perception, and their relation in infancy is a central research area in socio-cognitive sciences. In this Perspective Article, we focus on the developmental variability and continuity of action and perception. At group level, these skills have been shown to consistently improve with age. We would like to raise awareness for the issue that, at individual level, development might be subject to more variable changes. We present data from a longitudinal study on the perception and production of contralateral reaching skills of infants aged 7, 8, 9, and 12 months. Our findings suggest that individual development does not increase linearly for action or for perception, but instead changes dynamically. These non-continuous changes substantially affect the relation between action and perception at each measuring point and the respective direction of causality. This suggests that research on the development of action and perception and their interrelations needs to take into account individual variability and continuity more progressively.
Journal of Experimental Psychology: Learning, Memory, and Cognition | 2018
Bo Yao; Anne Keitel; Gillian Bruce; Graham G. Scott; Patrick J. O'Donnell; Sara C. Sereno
Emotion (positive and negative) words are typically recognized faster than neutral words. Recent research suggests that emotional valence, while often treated as a unitary semantic property, may be differentially represented in concrete and abstract words. Studies that have explicitly examined the interaction of emotion and concreteness, however, have demonstrated inconsistent patterns of results. Moreover, these findings may be limited as certain key lexical variables (e.g., familiarity, age of acquisition) were not taken into account. We investigated the emotion-concreteness interaction in a large-scale, highly controlled lexical decision experiment. A 3 (Emotion: negative, neutral, positive) × 2 (Concreteness: abstract, concrete) design was used, with 45 items per condition and 127 participants. We found a significant interaction between emotion and concreteness. Although positive and negative valenced words were recognized faster than neutral words, this emotion advantage was significantly larger in concrete than in abstract words. We explored potential contributions of participant alexithymia level and item imageability to this interactive pattern. We found that only word imageability significantly modulated the emotion-concreteness interaction. While both concrete and abstract emotion words are advantageously processed relative to comparable neutral words, the mechanisms of this facilitation are paradoxically more dependent on imageability in abstract words.
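The 3 × 2 interaction at the heart of this design can be made concrete with a small sketch: the "emotion advantage" (neutral RT minus emotion-word RT) is computed separately for concrete and abstract words, and the interaction is their difference. All reaction times below are simulated and the cell means are illustrative assumptions, not the study's data.

```python
import numpy as np

# Toy illustration of the 3 (Emotion) x 2 (Concreteness) design.
# Simulated RTs (ms); means chosen so the emotion advantage is
# larger for concrete words, mirroring the reported pattern.
rng = np.random.default_rng(3)
n = 45  # items per condition, as in the design

def rts(mean_ms):
    return mean_ms + 30 * rng.standard_normal(n)

cells = {
    ("concrete", "neutral"): rts(620),
    ("concrete", "positive"): rts(580),
    ("concrete", "negative"): rts(585),
    ("abstract", "neutral"): rts(640),
    ("abstract", "positive"): rts(625),
    ("abstract", "negative"): rts(628),
}

def emotion_advantage(conc):
    """Neutral minus emotion-word mean RT for one concreteness level."""
    emo = np.concatenate([cells[(conc, "positive")],
                          cells[(conc, "negative")]])
    return cells[(conc, "neutral")].mean() - emo.mean()

# Interaction: does the emotion advantage depend on concreteness?
interaction = emotion_advantage("concrete") - emotion_advantage("abstract")
```

A positive interaction value here corresponds to the reported result: emotion words are recognised faster than neutral words at both concreteness levels, but the advantage is larger for concrete words.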
bioRxiv | 2018
Christian Keitel; Anne Keitel; Christopher S.Y. Benwell; Christoph Daube; Gregor Thut; Joachim Gross
Two largely independent research lines use rhythmic sensory stimulation to study visual processing. Despite the use of strikingly similar experimental paradigms, they differ crucially in their notion of the stimulus-driven periodic brain responses: one regards them mostly as synchronised (entrained) intrinsic brain rhythms; the other assumes they are predominantly evoked responses (classically termed steady-state responses, or SSRs) that add to the ongoing brain activity. This conceptual difference can produce contradictory predictions about, and interpretations of, experimental outcomes. The effect of spatial attention on brain rhythms in the alpha band (8–13 Hz) constitutes one such instance: alpha-range SSRs have typically been found to increase in power when participants focus their spatial attention on laterally presented stimuli, in line with a gain control of the visual evoked response. In nearly identical experiments, retinotopic decreases in entrained alpha-band power have been reported, in line with the inhibitory function of intrinsic alpha. Here, we demonstrate that this apparent discrepancy results from a small but far-reaching difference in data analysis. By re-analysing EEG data recorded during bilateral rhythmic visual stimulation, we show that averaging across trials before versus after computing power spectra results in the typical patterns reported by SSR and entrainment studies, respectively, thereby reconciling the seemingly contradictory findings. The opposite sign of the attention effects suggests that two separate neural processes might be at play simultaneously.
Significance statement: The apparent “attentional modulation conundrum” regarding stimulus-driven ∼10 Hz brain rhythms stems from two different data analysis approaches. Either approach can be justified depending on the research question. However, only by applying both to the same dataset did we find that attention may rely on two distinct cortical mechanisms (retinotopic alpha suppression and increased temporal tracking) to influence the processing of dynamic visual stimulation.
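The analytical crux here, averaging across trials before versus after computing power spectra, can be reproduced in a few lines. The sketch below uses a synthetic 10 Hz oscillation whose phase varies freely across trials (the "entrained intrinsic rhythm" scenario); it illustrates the principle only and is not a re-analysis of the EEG data, with all parameters chosen for the demo.

```python
import numpy as np

# Averaging order matters: averaging trials BEFORE the FFT keeps only
# phase-locked (evoked/SSR-like) activity; averaging power spectra AFTER
# the FFT keeps total (phase-locked + non-phase-locked) activity.
# Synthetic data; illustrative only.
rng = np.random.default_rng(2)
fs, n_trials = 250, 100
t = np.arange(0, 2, 1 / fs)  # 2-s trials
f0 = 10                      # stimulation frequency (Hz)

# 10-Hz response whose phase varies freely from trial to trial
trials = np.array([np.sin(2 * np.pi * f0 * t + rng.uniform(0, 2 * np.pi))
                   for _ in range(n_trials)])

freqs = np.fft.rfftfreq(t.size, 1 / fs)
bin10 = int(np.argmin(np.abs(freqs - f0)))  # frequency bin at 10 Hz

def power(x):
    return np.abs(np.fft.rfft(x)) ** 2 / x.size

evoked_power = power(trials.mean(axis=0))[bin10]            # average, then FFT
total_power = np.mean([power(tr)[bin10] for tr in trials])  # FFT, then average
```

With random trial-to-trial phase, evoked_power collapses towards zero while total_power stays large; a perfectly phase-locked response would instead make the two measures agree. This is the small analysis difference that produces the SSR-style versus entrainment-style patterns.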