Publication


Featured research published by Silke Paulmann.


Journal of Phonetics | 2009

Factors in the recognition of vocally expressed emotions: A comparison of four languages

Marc D. Pell; Silke Paulmann; Chinar Dara; Areej Alasseri; Sonja A. Kotz

To understand how language influences the vocal communication of emotion, we investigated how discrete emotions are recognized and acoustically differentiated in four language contexts—English, German, Hindi, and Arabic. Vocal expressions of six emotions (anger, disgust, fear, sadness, happiness, pleasant surprise) and neutral expressions were elicited from four native speakers of each language. Each speaker produced pseudo-utterances (“nonsense speech”) which resembled their native language to express each emotion type, and the recordings were judged for their perceived emotional meaning by a group of native listeners in each language condition. Emotion recognition and acoustic patterns were analyzed within and across languages. Although overall recognition rates varied by language, all emotions could be recognized strictly from vocal cues in each language at levels exceeding chance. Anger, sadness, and fear tended to be recognized most accurately irrespective of language. Acoustic and discriminant function analyses highlighted the importance of speaker fundamental frequency (i.e., relative pitch level and variability) for signalling vocal emotions in all languages. Our data emphasize that while emotional communication is governed by display rules and other social variables, vocal expressions of ‘basic’ emotion in speech exhibit modal tendencies in their acoustic and perceptual attributes which are largely unaffected by language or linguistic similarity.
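The abstract above singles out fundamental frequency (F0), its relative level and variability, as the key acoustic cue across languages. The sketch below is a hypothetical illustration of those two summary measures (it is not the paper's analysis); the `f0_features` helper and all pitch values are invented for the example.

```python
import numpy as np

# Hypothetical sketch: summarising fundamental frequency (F0) by relative
# level and variability, the two cues the abstract highlights.
# F0 tracks here are synthetic, in Hz; 0 Hz conventionally marks unvoiced frames.

rng = np.random.default_rng(1)

def f0_features(f0_track):
    """Return (mean F0, F0 standard deviation) over voiced frames."""
    f0 = np.asarray(f0_track, dtype=float)
    voiced = f0[f0 > 0]
    return voiced.mean(), voiced.std()

# Toy utterances: "anger" with a raised, variable pitch vs. a flatter "sadness"
anger = 220 + 40 * rng.standard_normal(100)
sadness = 180 + 10 * rng.standard_normal(100)

anger_mean, anger_sd = f0_features(anger)
sad_mean, sad_sd = f0_features(sadness)
print(anger_mean > sad_mean and anger_sd > sad_sd)
```

A discriminant function analysis of the kind the paper reports would take feature pairs like these (per utterance, per emotion) as input and test how well they separate the emotion categories.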


Neuroreport | 2008

Early emotional prosody perception based on different speaker voices.

Silke Paulmann; Sonja A. Kotz

Decoding verbal and nonverbal emotional expressions is an important part of speech communication. Although various studies have tried to specify the brain regions that underlie different emotions conveyed in speech, few studies have aimed to specify the time course of emotional speech decoding. We used event-related potentials to determine when emotional speech is first differentiated from neutral speech. Participants engaged in an implicit emotional processing task (probe verification) while listening to emotional sentences spoken by a female and a male speaker. Independent of speaker voice, emotional sentences could be differentiated from neutral sentences as early as 200 ms after sentence onset (P200), suggesting rapid emotional decoding.


Journal of Cognitive Neuroscience | 2005

Who's in Control? Proficiency and L1 Influence on L2 Processing

Silke Paulmann; Sonja A. Kotz

We report three reaction time (RT)/event-related brain potential (ERP) semantic priming lexical decision experiments that explore the following in relation to L1 activation during L2 processing: (1) the role of L2 proficiency, (2) the role of sentence context, and (3) the locus of L1 activations (orthographic vs. semantic). All experiments used German (L1) homonyms translated into English (L2) to form prime-target pairs (pine-jaw for Kiefer) to test whether the L1 caused interference in an all-L2 experiment. Both RTs and ERPs were measured on targets. Experiment 1 revealed reversed priming in the N200 component and RTs for low-proficiency learners, but only RT interference for high-proficiency participants. Experiment 2 showed that once the words were processed in sentence context, the low-proficiency participants still showed reversed N200 and RT priming, whereas the high-proficiency group showed no effects. Experiment 3 tested native English speakers with the words in sentence context and showed a null result comparable to the high-proficiency group. Based on these results, we argue that cognitive control relating to translational activation is modulated by (1) L2 proficiency, as the early interference in the N200 was observed only for low-proficiency learners, and (2) sentence context, as it helps high-proficiency learners control L1 activation. As reversed priming was observed in the N200 and not the N400 component, we argue that (3) the locus of the L1 activations was orthographic. Implications in terms of bilingual word recognition and the functional role of the N200 ERP component are discussed.


Progress in Brain Research | 2006

Lateralization of emotional prosody in the brain: an overview and synopsis on the impact of study design.

Sonja A. Kotz; Martin Meyer; Silke Paulmann

Recently, research on the lateralization of linguistic and nonlinguistic (emotional) prosody has experienced a revival. However, neither neuroimaging nor patient evidence draws a coherent picture substantiating right-hemispheric lateralization of prosody, and of emotional prosody in particular. The current overview summarizes positions and data on the lateralization of emotion and emotional prosodic processing in the brain and proposes that: (1) the realization of emotional prosodic processing in the brain is based on differentially lateralized subprocesses and (2) methodological factors can influence the lateralization of emotional prosody in neuroimaging investigations. The latter evidence reveals that emotional valence effects are strongly right lateralized in studies using compact blocked presentation of emotional stimuli. In contrast, data obtained from event-related studies are indicative of bilateral or left-accented lateralization of emotional prosodic valence. These findings suggest a strong interaction between language and emotional prosodic processing.


Brain and Language | 2008

An ERP investigation on the temporal dynamics of emotional prosody and emotional semantics in pseudo- and lexical-sentence context

Silke Paulmann; Sonja A. Kotz

Previous evidence supports differential event-related brain potential (ERP) responses for emotional prosodic processing and integrative emotional prosodic/semantic processing. While the latter process elicits a negativity similar to the well-known N400 component, transitions in emotional prosodic processing elicit a positivity. To further substantiate this evidence, the current investigation utilized lexical sentences and sentences without lexical content (pseudo-sentences) spoken in six basic emotions by a female and a male speaker. Results indicate that emotional prosodic expectancy violations elicit a right-lateralized positive-going ERP component independent of basic emotional prosodies and speaker voice. In addition, expectancy violations of integrative emotional prosody/semantics elicit a negativity with a whole-head distribution. The current results nicely complement previous evidence, and extend the results by showing the respective effects for a wider range of emotional prosodies independent of lexical content and speaker voice.


Language and Linguistics Compass | 2011

Emotion, Language and the Brain

Sonja A. Kotz; Silke Paulmann

In social interactions, humans can express how they feel both in what they say (verbal channel) and in how they say it (non-verbal channel). Although decoding of vocal emotion expressions occurs rapidly, accumulating electrophysiological evidence suggests that this process is multilayered and involves temporally and functionally distinct processing steps. Neuroimaging and lesion data confirm that these processing steps, which support emotional speech and language comprehension, are anchored in a functionally differentiated brain network. The present review on emotional speech and language processing discusses concepts and empirical clinical and neuroscientific evidence on the basis of behavioral, event-related brain potential, and functional magnetic resonance imaging data. These data help shape our understanding of how we communicate emotions to others through speech and language, and they lead to a multistep processing model of vocal and visual emotion expressions.


Cognitive, Affective, & Behavioral Neuroscience | 2010

Contextual influences of emotional speech prosody on face processing: How much is enough?

Silke Paulmann; Marc D. Pell

The influence of emotional prosody on the evaluation of emotional facial expressions was investigated in an event-related brain potential (ERP) study using a priming paradigm, the facial affective decision task. Emotional prosodic fragments of short (200-msec) and medium (400-msec) duration were presented as primes, followed by an emotionally related or unrelated facial expression (or facial grimace, which does not resemble an emotion). Participants judged whether or not the facial expression represented an emotion. ERP results revealed an N400-like differentiation for emotionally related prime-target pairs when compared with unrelated prime-target pairs. Faces preceded by prosodic primes of medium length led to a normal priming effect (larger negativity for unrelated than for related prime-target pairs), but the reverse ERP pattern (larger negativity for related than for unrelated prime-target pairs) was observed for faces preceded by short prosodic primes. These results demonstrate that brief exposure to prosodic cues can establish a meaningful emotional context that influences related facial processing; however, this context does not always lead to a processing advantage when prosodic information is very short in duration.


Social Neuroscience | 2010

Orbito-frontal lesions cause impairment during late but not early emotional prosodic processing

Silke Paulmann; Sebastian Seifert; Sonja A. Kotz

The orbitofrontal cortex (OFC) is functionally linked to a variety of cognitive and emotional functions. In particular, lesions of the human OFC lead to large-scale changes in social and emotional behavior. For example, patients with OFC lesions are reported to suffer from deficits in affective decision-making, including impaired emotional face and voice expression recognition (e.g., Hornak et al., 1996, 2003). However, previous studies have failed to acknowledge that emotional processing is a multistage process. Thus, different stages of emotional processing (e.g., early vs. late) in the same patient group could be affected in a qualitatively different manner. The present study investigated this possibility and tested implicit emotional speech processing in an ERP experiment followed by an explicit behavioral emotional recognition task. OFC patients listened to vocal emotional expressions of anger, fear, disgust, and happiness compared to a neutral baseline, spoken either with or without lexical content. In line with previous evidence (Paulmann & Kotz, 2008b), both patients and healthy controls differentiated emotional and neutral prosody within 200 ms (P200). However, the recognition of emotional vocal expressions was impaired in OFC patients as compared to healthy controls. The current data serve as the first evidence that emotional prosody processing is impaired only at a late, and not at an early, processing stage in OFC patients.


Brain Research | 2008

Functional contributions of the basal ganglia to emotional prosody: Evidence from ERPs

Silke Paulmann; Marc D. Pell; Sonja A. Kotz

The basal ganglia (BG) have been functionally linked to emotional processing [Pell, M.D., Leonard, C.L., 2003. Processing emotional tone from speech in Parkinson's disease: a role for the basal ganglia. Cogn. Affect. Behav. Neurosci. 3, 275-288; Pell, M.D., 2006. Cerebral mechanisms for understanding emotional prosody in speech. Brain Lang. 97 (2), 221-234]. However, few studies have tried to specify the precise role of the BG during emotional prosodic processing. Therefore, the current study examined deviance detection in healthy listeners and patients with left focal BG lesions during implicit emotional prosodic processing in an event-related brain potential (ERP) experiment. In order to compare these ERP responses with explicit judgments of emotional prosody, the same participants were tested in a follow-up recognition task. As previously reported [Kotz, S.A., Paulmann, S., 2007. When emotional prosody and semantics dance cheek to cheek: ERP evidence. Brain Res. 1151, 107-118; Paulmann, S., Kotz, S.A., 2008. An ERP investigation on the temporal dynamics of emotional prosody and emotional semantics in pseudo- and lexical sentence context. Brain Lang. 105, 59-69], deviance of prosodic expectancy elicits a right-lateralized positive ERP component in healthy listeners. Here we report a similar positive ERP correlate in BG patients and healthy controls. In contrast, BG patients are significantly impaired in explicit recognition of emotional prosody when compared to healthy controls. The current data serve as the first evidence that focal lesions in the left BG do not necessarily affect implicit emotional prosodic processing, but rather the evaluative emotional prosodic processes demonstrated in the recognition task. The results suggest that the BG may not play a mandatory role in implicit emotional prosodic processing. Rather, executive processes underlying the recognition task may be dysfunctional during emotional prosodic processing.


Frontiers in Psychology | 2013

Valence, arousal, and task effects in emotional prosody processing

Silke Paulmann; Martin G. Bleichner; Sonja A. Kotz

Previous research suggests that emotional prosody processing is a highly rapid and complex process. In particular, it has been shown that different basic emotions can be differentiated in an early event-related brain potential (ERP) component, the P200. Often, the P200 is followed by later, long-lasting ERPs such as the late positive complex. The current experiment set out to explore to what extent emotionality and arousal can modulate these previously reported ERP components. In addition, we also investigated the influence of task demands (implicit vs. explicit evaluation of stimuli). Participants listened to pseudo-sentences (sentences with no lexical content) spoken in six different emotions or in a neutral tone of voice while they either rated the arousal level of the speaker or their own arousal level. Results confirm that different emotional intonations can first be differentiated in the P200 component, reflecting a first emotional encoding of the stimulus, possibly including a valence tagging process. A marginally significant arousal effect was also found in this time-window, with high-arousing stimuli eliciting a stronger P200 than low-arousing stimuli. The P200 component was followed by a long-lasting positive ERP between 400 and 750 ms. In this late time-window, both emotion and arousal effects were found. No effects of task were observed in either time-window. Taken together, the results suggest that emotion-relevant details are robustly decoded during both early and late processing stages, while arousal information is only reliably taken into consideration at a later stage of processing.
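Several of the abstracts above quantify ERP components such as the P200 by their amplitude within a post-stimulus time window. The sketch below is an illustrative, hypothetical example of that standard mean-amplitude measure on synthetic single-channel epochs; it is not the authors' analysis pipeline, and all sampling parameters, window bounds, and data are invented.

```python
import numpy as np

# Illustrative sketch: mean amplitude within a time window, the standard
# way an ERP component like the P200 is quantified. All numbers are
# hypothetical; real analyses use multi-channel EEG and baseline correction.

SFREQ = 250      # sampling rate in Hz (assumed)
TMIN = -0.1      # epoch start relative to stimulus onset, in seconds
N_SAMPLES = 250  # 1-second epochs

def window_mean_amplitude(epochs, tmin_win, tmax_win, sfreq=SFREQ, tmin=TMIN):
    """Per-epoch mean amplitude within [tmin_win, tmax_win] seconds.

    epochs: array of shape (n_epochs, n_samples), single channel.
    """
    start = int(round((tmin_win - tmin) * sfreq))
    stop = int(round((tmax_win - tmin) * sfreq))
    return epochs[:, start:stop].mean(axis=1)

# Synthetic data: "emotional" epochs carry a larger positivity around 200 ms.
rng = np.random.default_rng(0)
times = np.arange(N_SAMPLES) / SFREQ + TMIN
p200 = np.exp(-((times - 0.2) ** 2) / (2 * 0.03 ** 2))  # Gaussian bump at 200 ms
neutral = rng.normal(0, 0.2, (40, N_SAMPLES)) + 1.0 * p200
emotional = rng.normal(0, 0.2, (40, N_SAMPLES)) + 2.0 * p200

# P200 window, here 150-250 ms post-onset (an assumed choice)
neu_amp = window_mean_amplitude(neutral, 0.15, 0.25)
emo_amp = window_mean_amplitude(emotional, 0.15, 0.25)
print(emo_amp.mean() > neu_amp.mean())  # emotional > neutral in the P200 window
```

The same helper applied with a 400-750 ms window would target the late positive complex discussed above; condition differences in these per-epoch amplitudes are what the reported statistics test.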
