Publication


Featured research published by Antje B. M. Gerdes.


NeuroImage | 2012

Test–retest reliability of evoked BOLD signals from a cognitive–emotive fMRI test battery

Michael M. Plichta; Adam J. Schwarz; Oliver Grimm; Katrin Morgen; Daniela Mier; Leila Haddad; Antje B. M. Gerdes; Carina Sauer; Heike Tost; Christine Esslinger; Peter Colman; Frederick Wilson; Peter Kirsch; Andreas Meyer-Lindenberg

Even more than in cognitive research applications, moving fMRI to the clinic and the drug development process requires the generation of stable and reliable signal changes. The performance characteristics of the fMRI paradigm constrain experimental power and may require different study designs (e.g., crossover vs. parallel groups), yet fMRI reliability characteristics can be strongly dependent on the nature of the fMRI task. The present study investigated both within-subject and group-level reliability of a combined three-task fMRI battery targeting three systems of wide applicability in clinical and cognitive neuroscience: an emotional (face matching), a motivational (monetary reward anticipation) and a cognitive (n-back working memory) task. A group of 25 young, healthy volunteers were scanned twice on a 3T MRI scanner with a mean test-retest interval of 14.6 days. fMRI reliability was quantified using the intraclass correlation coefficient (ICC) applied at three different levels ranging from a global to a localized and fine spatial scale: (1) reliability of group-level activation maps over the whole brain and within targeted regions of interest (ROIs); (2) within-subject reliability of ROI-mean amplitudes and (3) within-subject reliability of individual voxels in the target ROIs. Results showed robust evoked activation of all three tasks in their respective target regions (emotional task=amygdala; motivational task=ventral striatum; cognitive task=right dorsolateral prefrontal cortex and parietal cortices) with high effect sizes (ES) of ROI-mean summary values (ES=1.11-1.44 for the faces task, 0.96-1.43 for the reward task, 0.83-2.58 for the n-back task). Reliability of group-level activation was excellent for all three tasks, with ICCs of 0.89-0.98 at the whole-brain level and 0.66-0.97 within target ROIs. Within-subject reliability of ROI-mean amplitudes across sessions was fair to good for the reward task (ICCs=0.56-0.62) and, depending on the particular ROI, also fair to good for the n-back task (ICCs=0.44-0.57), but lower for the faces task (ICCs = -0.02 to 0.16). In conclusion, all three tasks are well suited to between-subject designs, including imaging genetics. When specific recommendations are followed, the n-back and reward tasks are also suited for within-subject designs, including pharmaco-fMRI. The present study provides task-specific fMRI reliability performance measures that will inform the optimal use, powering, and design of fMRI studies using comparable tasks.
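The reliability figures quoted above are intraclass correlation coefficients computed on test-retest data. As a rough illustration of how a two-session, absolute-agreement ICC can be derived from a subjects x sessions matrix of ROI-mean amplitudes, here is a minimal Python sketch; the chosen variant (ICC(2,1)), the variable names, and the toy data are assumptions for illustration and not the authors' analysis code.

```python
import numpy as np

def icc_2_1(data):
    """Two-way random-effects, absolute-agreement ICC(2,1) for an
    n_subjects x n_sessions matrix (e.g. ROI-mean amplitudes at test and retest).
    Illustrative sketch only, not the analysis code used in the paper."""
    data = np.asarray(data, dtype=float)
    n, k = data.shape
    grand_mean = data.mean()
    row_means = data.mean(axis=1)      # per-subject means
    col_means = data.mean(axis=0)      # per-session means

    # Mean squares from the two-way ANOVA decomposition
    ms_rows = k * np.sum((row_means - grand_mean) ** 2) / (n - 1)
    ms_cols = n * np.sum((col_means - grand_mean) ** 2) / (k - 1)
    ss_err = np.sum((data - row_means[:, None] - col_means[None, :] + grand_mean) ** 2)
    ms_err = ss_err / ((n - 1) * (k - 1))

    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

# Example: 25 subjects scanned twice (random numbers stand in for ROI-mean betas)
rng = np.random.default_rng(0)
true_signal = rng.normal(size=(25, 1))
betas = true_signal + 0.5 * rng.normal(size=(25, 2))
print(round(icc_2_1(betas), 2))
```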


NeuroImage | 2011

Auditory cortex activation is modulated by emotion: a functional near-infrared spectroscopy (fNIRS) study.

Michael M. Plichta; Antje B. M. Gerdes; Georg W. Alpers; Wilma Harnisch; Stephen J. Brill; Matthias J. Wieser; Andreas J. Fallgatter

Visual emotional stimuli evoke enhanced activation in early visual cortex areas, which may help organisms to quickly detect biologically salient cues and initiate appropriate approach or avoidance behavior. Functional neuroimaging evidence for the modulation of other sensory modalities by emotion is scarce. Therefore, the aim of the present study was to test whether sensory facilitation by emotional cues can also be found in the auditory domain. We recorded auditory brain activation with functional near-infrared spectroscopy (fNIRS), a non-invasive and silent neuroimaging technique, while participants were listening to standardized pleasant, unpleasant, and neutral sounds selected from the International Affective Digitized Sounds (IADS). Pleasant and unpleasant sounds led to increased auditory cortex activation as compared to neutral sounds. This is the first study to suggest that the enhanced activation of sensory areas in response to complex emotional stimuli is not restricted to the visual domain but is also evident in the auditory domain.


American Journal of Orthodontics and Dentofacial Orthopedics | 2010

Impact of facial asymmetry in visual perception: A 3-dimensional data analysis

Philipp Meyer-Marcotty; Georg W. Alpers; Antje B. M. Gerdes; Angelika Stellzig-Eisenhauer

INTRODUCTION: The aim of this controlled study was to analyze the degree and localization of 3-dimensional (3D) facial asymmetry in adult patients with cleft lip and palate (CLP) compared with a control group, and its impact on the visual perception of faces.

METHODS: The degree of 3D asymmetry was analyzed with a novel landmark-free method in 18 adults with complete unilateral CLP and 18 adults without congenital anomalies. Furthermore, the CLP and control faces were rated for appearance, symmetry, and facial expression by 30 participants.

RESULTS: Adults with CLP had significantly greater asymmetry in their facial soft tissues compared with the control group; the lower face and, in particular, the midface showed the greatest asymmetry in the CLP patients. The perceptual ratings showed that adults with CLP were judged much more negatively than those in the control group.

CONCLUSIONS: With sophisticated 3D analysis, the real morphology of a face can be calculated and asymmetric regions precisely identified. The greatest asymmetry in CLP patients is in the midface. These results underline the importance of symmetry in the perception of faces: in general, the greater the facial asymmetry near the midline of the face, the more negative the evaluation of the face in direct face-to-face interactions.


Social Cognitive and Affective Neuroscience | 2013

Why are you looking like that? How the context influences evaluation and processing of human faces

Katharina A. Schwarz; Matthias J. Wieser; Antje B. M. Gerdes; Andreas Mühlberger; Paul Pauli

Perception and evaluation of facial expressions are known to be heavily modulated by emotional features of contextual information. Such contextual effects, however, might also be driven by non-emotional aspects of contextual information, an interaction of emotional and non-emotional factors, and by the observers’ inherent traits. Therefore, we sought to assess whether contextual information about self-reference in addition to information about valence influences the evaluation and neural processing of neutral faces. Furthermore, we investigated whether social anxiety moderates these effects. In the present functional magnetic resonance imaging (fMRI) study, participants viewed neutral facial expressions preceded by a contextual sentence conveying either positive or negative evaluations about the participant or about somebody else. Contextual influences were reflected in rating and fMRI measures, with strong effects of self-reference on brain activity in the medial prefrontal cortex and right fusiform gyrus. Additionally, social anxiety strongly affected the response to faces conveying negative, self-related evaluations as revealed by the participants’ rating patterns and brain activity in cortical midline structures and regions of interest in the left and right middle frontal gyrus. These results suggest that face perception and processing are highly individual processes influenced by emotional and non-emotional aspects of contextual information and further modulated by individual personality traits.


Journal of Neural Transmission | 2009

Attention and amygdala activity: an fMRI study with spider pictures in spider phobia

Georg W. Alpers; Antje B. M. Gerdes; Bernadette Lagarie; Katharina Tabbert; Dieter Vaitl; Rudolf Stark

Facilitated detection of threatening visual cues is thought to be adaptive. In theory, detection of threat cues should activate the amygdala independently of the allocation of attention. However, previous studies using emotional facial expressions as well as phobic cues yielded contradictory results. We used fMRI to examine whether the allocation of attention to components of superimposed spider and bird displays modulates amygdala activation. Nineteen spider-phobic women were instructed to identify either a moving or a stationary animal in briefly presented double-exposure displays. Amygdala activation followed a dose–response relationship: Compared to congruent neutral displays (two birds), amygdala activation was most pronounced in response to congruent phobic displays (two spiders) and less but still significant in response to mixed displays (spider and bird) when attention was focused on the phobic component. When attention was focused on the neutral component, mixed displays did not result in significant amygdala activation. This was confirmed by a significant parametric gradation of the amygdala activation in the order of congruent phobic displays, mixed displays with attention focused on the spider, mixed displays with attention focused on the bird, and congruent neutral displays. These results challenge the notion that amygdala activation in response to briefly presented phobic cues is independent of attention.


Frontiers in Human Neuroscience | 2010

Brain Activations to Emotional Pictures are Differentially Associated with Valence and Arousal Ratings

Antje B. M. Gerdes; Matthias J. Wieser; Andreas Mühlberger; Peter Weyers; Georg W. Alpers; Michael M. Plichta; Felix A. Breuer; Paul Pauli

Several studies have investigated the neural responses triggered by emotional pictures, but the specificity of the involved structures such as the amygdala or the ventral striatum is still under debate. Furthermore, only a few studies have examined how stimulus valence and arousal relate to the underlying brain responses. Therefore, we used functional magnetic resonance imaging to investigate the brain responses of 17 healthy participants to pleasant and unpleasant affective pictures and afterwards assessed ratings of valence and arousal. As expected, unpleasant pictures strongly activated the right and left amygdala, the right hippocampus, and the medial occipital lobe, whereas pleasant pictures elicited significant activations in left occipital regions and in parts of the medial temporal lobe. The direct comparison of unpleasant and pleasant pictures, which were comparable in arousal, clearly indicated stronger amygdala activation in response to the unpleasant pictures. Most importantly, correlational analyses revealed, on the one hand, that the arousal of unpleasant pictures was significantly associated with activations in the right amygdala and the left caudate body. On the other hand, the valence of pleasant pictures was significantly correlated with activations in the right caudate head, extending to the nucleus accumbens (NAcc), and the left dorsolateral prefrontal cortex. These findings support the notion that the amygdala is primarily involved in the processing of unpleasant stimuli, particularly of more arousing unpleasant stimuli. Reward-related structures such as the caudate and NAcc respond primarily to pleasant stimuli, and the more positive the valence of these stimuli, the stronger the response.


Biological Psychiatry | 2009

Abnormal affective responsiveness in attention-deficit/hyperactivity disorder: subtype differences.

Annette Conzelmann; Ronald F. Mucha; Christian Jacob; Peter Weyers; Jasmin Romanos; Antje B. M. Gerdes; Christina G. Baehne; Andrea Boreatti-Hümmer; Monika Heine; Georg W. Alpers; Andreas Warnke; Andreas J. Fallgatter; Klaus-Peter Lesch; Paul Pauli

BACKGROUND: Emotional-motivational dysfunctions likely contribute to attention-deficit/hyperactivity disorder (ADHD), especially to hyperactive and impulsive symptoms. This study examined the affective modulation of the startle reflex in a large sample of ADHD patients, with the aim of comparing subtypes of ADHD.

METHODS: One hundred ninety-seven unmedicated adult ADHD patients (127 combined type [ADHD-C]; 50 inattentive type [ADHD-I]; 20 hyperactive-impulsive type [ADHD-HI]) and 128 healthy control subjects were examined. The affect-modulated startle response as well as valence and arousal ratings were assessed for pleasant, neutral, and unpleasant picture stimuli.

RESULTS: Control subjects exhibited startle response attenuation and potentiation by pleasant and unpleasant pictures, respectively. In ADHD-HI, the startle response was neither attenuated by pleasant nor potentiated by unpleasant stimuli. In ADHD-C, the startle response was not attenuated by pleasant pictures; ADHD-I patients responded similarly to control subjects, but their startle response was attenuated to a lesser degree by pleasant stimuli. The ADHD-HI group rated all pictures as more positive, and male ADHD-HI patients rated unpleasant stimuli as less arousing.

CONCLUSIONS: This is the first study to assess the affect-modulated startle response in ADHD. It confirms emotional dysfunctions in these patients; all subtypes showed more or less diminished emotional reactions to pleasant stimuli. The hyperactive-impulsive type was also marked by blunted reactions to unpleasant stimuli. Results suggest that response patterns to emotional cues or reward may help to differentiate ADHD subtypes. Blunted emotional reactivity is especially pronounced in ADHD patients with symptoms of hyperactivity and impulsivity (ADHD-C, ADHD-HI).
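The affective startle modulation analyzed in this study is conventionally scored as the difference between blink magnitudes elicited during emotional versus neutral pictures (attenuation for pleasant, potentiation for unpleasant). A minimal sketch of that difference-score logic, using entirely hypothetical blink data and variable names rather than the study's actual pipeline:

```python
import numpy as np

# Hypothetical blink magnitudes (e.g. standardized EMG peak amplitudes)
# for one participant, keyed by picture category.
blink = {
    "pleasant":   np.array([42.0, 38.5, 40.1, 36.9]),
    "neutral":    np.array([45.2, 44.8, 46.1, 43.9]),
    "unpleasant": np.array([49.3, 51.0, 48.7, 50.2]),
}

neutral_mean = blink["neutral"].mean()

# Attenuation: pleasant < neutral; potentiation: unpleasant > neutral.
attenuation = neutral_mean - blink["pleasant"].mean()
potentiation = blink["unpleasant"].mean() - neutral_mean

print(f"attenuation by pleasant pictures:    {attenuation:+.2f}")
print(f"potentiation by unpleasant pictures: {potentiation:+.2f}")
```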


Journal of Dental Research | 2010

Persons with Cleft Lip and Palate are Looked at Differently

Philipp Meyer-Marcotty; Antje B. M. Gerdes; Tobias Reuther; Angelika Stellzig-Eisenhauer; Georg W. Alpers

There is evidence that persons with cleft lip and palate (CLP) suffer psychosocial consequences as a result of their facial appearance. However, no data exist on how they are perceived by others. Our hypothesis was that CLP faces were looked at differently compared with faces lacking an anomaly. Eye movements of 30 healthy participants were recorded (via an eye-tracking camera) while they viewed photographs of faces with/without a CLP. Subsequently, the faces were rated for appearance, symmetry, and facial expression. When the CLP faces were viewed, there were significantly more initial fixations in the mouth and longer fixations in the mouth and nose regions, compared with reactions when control faces were viewed. Moreover, CLP faces were rated more negatively overall. When faces with CLP were viewed, attention was directed to the mouth and nose region. Together with the negative ratings, this may explain at least some of the social deprivations in persons with CLP, probably due to residual asymmetry.
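The gaze measures reported here (initial fixations and fixation durations on the mouth and nose regions) amount to assigning fixations to areas of interest and summing dwell time per region. A schematic sketch of that assignment step; the rectangular AOI coordinates and the fixation data are invented for illustration and do not reflect the study's stimuli or software:

```python
from dataclasses import dataclass

@dataclass
class Fixation:
    x: float         # gaze position in pixels
    y: float
    duration: float  # ms

# Hypothetical rectangular areas of interest on a face photograph
AOIS = {
    "eyes":  (250, 450, 180, 260),   # (x_min, x_max, y_min, y_max)
    "nose":  (320, 400, 260, 340),
    "mouth": (300, 420, 340, 420),
}

def aoi_of(fix):
    for name, (x0, x1, y0, y1) in AOIS.items():
        if x0 <= fix.x <= x1 and y0 <= fix.y <= y1:
            return name
    return "other"

fixations = [Fixation(330, 300, 210), Fixation(360, 380, 450), Fixation(270, 220, 180)]

# Dwell time per region and the region of the first fixation
dwell = {}
for fix in fixations:
    region = aoi_of(fix)
    dwell[region] = dwell.get(region, 0.0) + fix.duration

print("first fixation landed on:", aoi_of(fixations[0]))
print("dwell time per AOI (ms):", dwell)
```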


Social Cognitive and Affective Neuroscience | 2011

Stop looking angry and smile, please: start and stop of the very same facial expression differentially activate threat- and reward-related brain networks

Andreas Mühlberger; Matthias J. Wieser; Antje B. M. Gerdes; Monika C.M. Frey; Peter Weyers; Paul Pauli

Static pictures of emotional facial expressions have been found to activate brain structures involved in the processing of emotional stimuli. However, in everyday life, emotional expressions change rapidly, and the processing of the onset vs. the offset of the very same emotional expression might rely on different brain networks, presumably leading to different behavioral and physiological reactions (e.g. approach or avoidance). Using functional magnetic resonance imaging, this was examined by presenting video clips depicting onsets and offsets of happy and angry facial expressions. Subjective valence and threat ratings clearly depended on the direction of change. Blood oxygen level dependent responses indicate both reward- and threat-related activations for the offset of angry expressions. Comparing onsets and offsets, angry offsets were associated with stronger ventral striatum activation than angry onsets. Additionally, the offset of happy and the onset of angry expressions showed strong common activity in the lateral orbitofrontal cortex bilaterally, the left amygdala and the left insula, whereas the onset of happy and the offset of angry expressions induced significant activation in the left dorsal striatum. In sum, the results confirm different activity in motivation-related brain areas in response to the onset and offset of the same emotional expression and highlight the importance of temporal characteristics of facial expressions for social communication.


Frontiers in Human Neuroscience | 2014

Social and emotional relevance in face processing: happy faces of future interaction partners enhance the late positive potential

Florian Bublatzky; Antje B. M. Gerdes; Andrew J. White; Martin Riemer; Georg W. Alpers

Human face perception is modulated by both emotional valence and social relevance, but their interaction has rarely been examined. Event-related brain potentials (ERPs) to happy, neutral, and angry facial expressions with different degrees of social relevance were recorded. To implement a social anticipation task, relevance was manipulated by presenting faces of two specific actors as future interaction partners (socially relevant), whereas two other face actors remained non-relevant. In a further control task, all stimuli were presented without specific relevance instructions (passive viewing). Face stimuli of four actors (2 women, from the KDEF) were randomly presented for 1 s to 26 participants (16 female). Results showed an augmented N170, early posterior negativity (EPN), and late positive potential (LPP) for emotional in contrast to neutral facial expressions. Of particular interest, face processing varied as a function of experimental task. Whereas task effects were observed for P1 and EPN regardless of instructed relevance, LPP amplitudes were modulated by emotional facial expression and the relevance manipulation. The LPP was specifically enhanced for happy facial expressions of the anticipated future interaction partners. This underscores that social relevance can impact face processing already at an early stage of visual processing. These findings are discussed within the framework of motivated attention and face processing theories.
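The late positive potential (LPP) reported here is typically quantified as the mean EEG amplitude over centro-parietal sensors in a late post-stimulus window, averaged across the trials of a condition. A toy sketch of that windowed averaging with synthetic data; the sampling rate, channel count, and 400-800 ms window are assumptions for illustration, not parameters taken from the paper:

```python
import numpy as np

sfreq = 250                               # Hz, assumed sampling rate
times = np.arange(-0.2, 1.0, 1 / sfreq)   # epoch from -200 ms to 1000 ms

# Synthetic epochs: trials x channels x time points for one condition
rng = np.random.default_rng(1)
n_trials, n_channels = 40, 3              # e.g. three centro-parietal sensors
epochs = rng.normal(0, 2, size=(n_trials, n_channels, times.size))
epochs[..., times >= 0.4] += 4.0          # add a sustained late positivity

# LPP: mean amplitude in a late window (here 400-800 ms), averaged over
# trials and the chosen channels.
window = (times >= 0.4) & (times <= 0.8)
lpp = epochs[:, :, window].mean()
print(f"LPP mean amplitude: {lpp:.2f} microvolts")
```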

Collaboration


Top co-authors of Antje B. M. Gerdes and their affiliations.

Paul Pauli

University of Würzburg

Matthias J. Wieser

Erasmus University Rotterdam

Peter Weyers

University of Würzburg
