Sarah Jessen
Max Planck Society
Publications
Featured research published by Sarah Jessen.
Proceedings of the National Academy of Sciences of the United States of America | 2014
Sarah Jessen; Tobias Grossmann
Significance: The human eye with its prominent white sclera is thought to facilitate social and cooperative interactions among humans. While there is evidence for brain mechanisms that allow for the unconscious detection of eye cues in adults, it is not known whether this ability of the human brain emerges early in ontogeny and can therefore be considered a key feature of human social functioning. The current study provides neural evidence for the unconscious detection of emotion and gaze cues from the sclera in 7-mo-old infants. Our findings demonstrate the existence of fast, efficient, and reliable social cue detection mechanisms in the human infant brain that likely provide a vital foundation for the development of social interactive skills.

Human eyes serve two key functions in face-to-face social interactions: they provide cues about a person’s emotional state and attentional focus (gaze direction). Both functions critically rely on the morphologically unique human sclera and have been shown to operate even in the absence of conscious awareness in adults. However, it is not known whether the ability to respond to social cues from scleral information without conscious awareness exists early in human ontogeny and can therefore be considered a foundational feature of human social functioning. In the current study, we used event-related brain potentials (ERPs) to show that 7-mo-old infants discriminate between fearful and nonfearful eyes (experiment 1) and between direct and averted gaze (experiment 2), even when presented below the perceptual threshold. These effects were specific to the human sclera and not seen in response to polarity-inverted eyes. Our results suggest that early in ontogeny the human brain detects social cues from scleral information even in the absence of conscious awareness. The current findings support the view that the human eye with its prominent sclera serves critical communicative functions during human social interactions.
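The ERP method described above rests on epoching continuous EEG around stimulus onsets and averaging by condition. The following is a minimal sketch of that generic pipeline in MNE-Python, not the authors' actual analysis; the file name, event codes, filter settings, and epoch window are all assumptions for illustration.

```python
# Generic ERP pipeline sketch (hypothetical file and event codes),
# illustrating epoching and condition averaging as used in ERP studies.
import mne

# Load continuous EEG; "infant_eeg.fif" is a placeholder file name.
raw = mne.io.read_raw_fif("infant_eeg.fif", preload=True)
raw.filter(l_freq=0.3, h_freq=30.0)  # band-pass typical for infant EEG

# Event codes 1/2 for fearful vs. nonfearful eyes are assumptions.
events = mne.find_events(raw)
event_id = {"fearful": 1, "nonfearful": 2}

# Epoch from 200 ms before to 800 ms after stimulus onset,
# baseline-corrected to the prestimulus interval; reject noisy epochs.
epochs = mne.Epochs(raw, events, event_id=event_id,
                    tmin=-0.2, tmax=0.8, baseline=(None, 0),
                    reject=dict(eeg=150e-6), preload=True)

# Condition averages; their difference wave indexes discrimination.
evoked_fearful = epochs["fearful"].average()
evoked_nonfearful = epochs["nonfearful"].average()
difference = mne.combine_evoked([evoked_fearful, evoked_nonfearful],
                                weights=[1, -1])
```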
Cognitive, Affective, & Behavioral Neuroscience | 2016
Christian Obermeier; Sonja A. Kotz; Sarah Jessen; Tim Raettig; Martin von Koppenfels; Winfried Menninghaus
Rhetorical theory suggests that rhythmic and metrical features of language substantially contribute to persuading, moving, and pleasing an audience. A potential explanation of these effects is offered by “cognitive fluency theory,” which stipulates that recurring patterns (e.g., meter) enhance perceptual fluency and can lead to greater aesthetic appreciation. In this article, we explore these two assertions by investigating the effects of meter and rhyme in the reception of poetry by means of event-related brain potentials (ERPs). Participants listened to four versions of lyrical stanzas that varied in terms of meter and rhyme, and rated the stanzas for rhythmicity and aesthetic liking. Behaviorally, metered and rhyming stanzas received higher rhythmicity and liking ratings, and the ERP results accorded with this pattern: metered and rhyming stanzas elicited smaller N400/P600 responses than their nonmetered, nonrhyming, or nonmetered and nonrhyming counterparts. In addition, the N400 and P600 effects for the lyrical stanzas correlated with the aesthetic liking effects (metered–nonmetered), implying that modulation of the N400 and P600 has a direct bearing on the aesthetic appreciation of lyrical stanzas. We suggest that these effects are indicative of perceptual-fluency-enhanced aesthetic liking, as postulated by cognitive fluency theory.
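The reported brain-behavior correlation amounts to averaging voltage in an ERP time window per condition and subject, then correlating the condition difference with the rating difference across subjects. Here is a minimal numpy/scipy sketch of that computation with simulated data; the array contents, electrode, and window bounds are assumptions, not the authors' code.

```python
# Sketch: per-subject ERP effect (metered - nonmetered) in an assumed
# N400 window, correlated with the liking effect. Data are simulated.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
n_subjects, n_times = 20, 500          # 500 samples over a 1 s epoch
times = np.linspace(-0.2, 0.8, n_times)

# Simulated subject-average ERPs at one electrode for two conditions.
erp_metered = rng.normal(size=(n_subjects, n_times))
erp_nonmetered = rng.normal(size=(n_subjects, n_times))

# Mean amplitude in a hypothetical N400 window (300-500 ms post-onset).
win = (times >= 0.3) & (times <= 0.5)
n400_effect = (erp_metered[:, win].mean(axis=1)
               - erp_nonmetered[:, win].mean(axis=1))

# Simulated per-subject liking effect (metered - nonmetered).
liking_effect = rng.normal(size=n_subjects)

r, p = pearsonr(n400_effect, liking_effect)
print(f"N400 effect vs. liking effect: r = {r:.2f}, p = {p:.3f}")
```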
Neuropsychologia | 2015
Sarah Jessen; Sonja A. Kotz
Emotion perception naturally entails multisensory integration. It is also assumed that multisensory emotion perception is characterized by enhanced activation of brain areas implicated in multisensory integration, such as the superior temporal gyrus and sulcus (STG/STS). However, most previous studies have employed designs and stimuli that preclude other forms of multisensory interaction, such as crossmodal prediction, leaving open the question of whether classical integration is the only relevant process in multisensory emotion perception. Here, we used video clips containing emotional and neutral body and vocal expressions to investigate the role of crossmodal prediction in multisensory emotion perception. While emotional multisensory expressions increased activation in the bilateral fusiform gyrus (FFG), neutral expressions compared to emotional ones enhanced activation in the bilateral middle temporal gyrus (MTG) and posterior STS. Hence, while neutral stimuli activate classical multisensory areas, emotional stimuli invoke areas linked to unisensory visual processing. Emotional stimuli may therefore trigger a prediction of upcoming auditory information based on prior visual information, and such prediction may be stronger for highly salient emotional information than for less salient neutral information. We therefore suggest that multisensory emotion perception involves at least two distinct mechanisms: classical multisensory integration, as shown for neutral expressions, and crossmodal prediction, as evident for emotional expressions.
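Condition contrasts of this kind (emotional vs. neutral activation) are typically estimated with a first-level GLM. A rough nilearn sketch of such a contrast follows; the file name, repetition time, event table, and smoothing are placeholders and do not reflect the study's actual pipeline.

```python
# Sketch of a first-level GLM contrast (emotional vs. neutral) with
# nilearn; all inputs below are hypothetical placeholders.
import pandas as pd
from nilearn.glm.first_level import FirstLevelModel

# Hypothetical event table: onsets/durations in seconds, one row per clip.
events = pd.DataFrame({
    "onset":      [0.0, 12.0, 24.0, 36.0],
    "duration":   [3.0,  3.0,  3.0,  3.0],
    "trial_type": ["emotional", "neutral", "emotional", "neutral"],
})

model = FirstLevelModel(t_r=2.0, hrf_model="glover", smoothing_fwhm=6.0)
model = model.fit("sub-01_task-av_bold.nii.gz", events=events)

# Z-maps for both directions of the contrast.
z_emotional = model.compute_contrast("emotional - neutral")
z_neutral = model.compute_contrast("neutral - emotional")
```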
Frontiers in Human Neuroscience | 2017
Sarah Jessen; Tobias Grossmann
Enhanced attention to fear expressions in adults is primarily driven by information from low as opposed to high spatial frequencies contained in faces. However, little is known about the role of spatial frequency information in emotion processing during infancy. In the present study, we examined the role of low compared to high spatial frequencies in the processing of happy and fearful facial expressions by using filtered face stimuli and measuring event-related brain potentials (ERPs) in 7-month-old infants (N = 26). Our results revealed that infants’ brains discriminated between emotional facial expressions containing high but not between expressions containing low spatial frequencies. Specifically, happy faces containing high spatial frequencies elicited a smaller Nc amplitude than fearful faces containing high spatial frequencies and than happy and fearful faces containing low spatial frequencies. Our results demonstrate that spatial frequency content influences the processing of facial emotions already in infancy. Furthermore, we observed that fearful facial expressions elicited a comparable Nc response for high and low spatial frequencies, suggesting a robust detection of fearful faces irrespective of spatial frequency content, whereas the detection of happy facial expressions was contingent on frequency content. In summary, these data provide new insights into the neural processing of facial emotions in early development by highlighting the differential role played by spatial frequencies in the detection of fear and happiness.
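Spatial frequency filtering of face stimuli, as used in this study, is commonly done by blurring an image to keep low frequencies and subtracting the blur to keep high frequencies. A minimal scipy sketch of that idea follows; the Gaussian sigma and the stand-in image are assumptions, not the study's filter specification.

```python
# Sketch: split an image into low- and high-spatial-frequency versions
# with a Gaussian filter; sigma (the cutoff) is a hypothetical value.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)
face = rng.random((256, 256))   # stand-in for a grayscale face image

sigma = 8.0                                    # hypothetical cutoff
low_sf = gaussian_filter(face, sigma=sigma)    # blur keeps low frequencies
high_sf = face - low_sf                        # residual keeps high frequencies

# Re-add the mean luminance so the high-SF image remains displayable.
high_sf += face.mean()
```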
Journal of Psychophysiology | 2009
Silke Paulmann; Sarah Jessen; Sonja A. Kotz
Frontiers in Human Neuroscience | 2013
Sarah Jessen; Sonja A. Kotz
Cortex | 2015
Sarah Jessen; Tobias Grossmann
Neuropsychologia | 2012
Silke Paulmann; Sarah Jessen; Sonja A. Kotz
Journal of Experimental Child Psychology | 2016
Sarah Jessen; Tobias Grossmann
Cognition | 2016
Sarah Jessen; Nicole Altvater-Mackensen; Tobias Grossmann