Network


Latest external collaboration at the country level. Dive into details by clicking on the dots.

Hotspot


Dive into the research topics where Christina Regenbogen is active.

Publication


Featured research published by Christina Regenbogen.


The Journal of Neuroscience | 2012

Effective connectivity of the human cerebellum during visual attention.

Thilo Kellermann; Christina Regenbogen; Maarten De Vos; Carolin Mößnang; Andreas Finkelmeyer; Ute Habel

Insights from both lesion and neuroimaging studies increasingly substantiate the view that the human cerebellum not only serves motor control but also supports various cognitive processes. Higher cognitive functions like working memory or executive control have been associated with the phylogenetically younger parts of the cerebellum, crus I and crus II. Functional connectivity studies corroborate this notion as activation of the cerebellum correlates with activity in numerous areas of the cerebral cortex. Moreover, these cerebrocerebellar loops were shown to be topographically organized. We used an attention-to-motion paradigm to elaborate on the effective connectivity of cerebellar crus I during visual attention. Psychophysiological interaction analyses demonstrated enhanced connectivity of the cerebellum—during attention—with dorsal visual stream regions including posterior parietal cortex (PPC) and left secondary visual cortex (V5). Dynamic causal modeling revealed a modulation of the connections from V5 to PPC and from crus I to V5 by attention. Remarkably, the influence which V5 exerted on PPC was reduced during attention, resulting in a suppression of the sensitivity of PPC to bottom-up information. Moreover, the sensitivity of V5 populations to inputs from crus I was increased under attention. This might underscore the presumed role of the cerebellum as a state estimator that provides hierarchically lower regions (V5) with top-down predictions, which in turn might be based on endogenous inputs from PPC to the cerebellum. These results are in line with formulations of attention in predictive coding, where attention increases the precision or sensitivity of hierarchically lower neuronal populations that may encode prediction error.
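
The psychophysiological interaction (PPI) analysis mentioned above tests whether the coupling between a seed region (here, cerebellar crus I) and other voxels changes with the psychological context (attention). The sketch below is a simplified, generic illustration of how such an interaction regressor can be built; the function names, parameter values, and synthetic data are assumptions, and a full SPM-style PPI would additionally deconvolve the seed signal to the neuronal level before forming the product.

```python
import numpy as np
from scipy.stats import gamma

def hrf(t, peak=6.0, undershoot=16.0, ratio=1.0 / 6.0):
    """Simplified canonical double-gamma haemodynamic response function."""
    return gamma.pdf(t, peak) - ratio * gamma.pdf(t, undershoot)

def ppi_design(seed_bold, attention, tr=2.0):
    """Build three regressors for a simplified PPI analysis: the mean-centred
    seed time course, the HRF-convolved psychological variable, and their
    interaction. A full SPM-style PPI would first deconvolve the seed BOLD
    signal to the neuronal level."""
    n = len(seed_bold)
    h = hrf(np.arange(0, 32, tr))
    psych = np.convolve(attention, h)[:n]          # psychological regressor
    seed_c = seed_bold - seed_bold.mean()          # physiological regressor
    ppi = seed_c * (attention - attention.mean())  # interaction regressor
    return np.column_stack([seed_c, psych, ppi])

# Hypothetical usage: regress each voxel's time series on this design matrix;
# a reliable weight on the third column indicates attention-dependent coupling
# with the seed region.
rng = np.random.default_rng(0)
seed = rng.standard_normal(200)                       # stand-in seed time course
attention = (np.arange(200) % 40 < 20).astype(float)  # alternating attention blocks
X = ppi_design(seed, attention)
print(X.shape)  # (200, 3)
```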


Cognition & Emotion | 2012

The differential contribution of facial expressions, prosody, and speech content to empathy

Christina Regenbogen; Daniel A. Schneider; Andreas Finkelmeyer; Nils Kohn; Birgit Derntl; Thilo Kellermann; Raquel E. Gur; Frank Schneider; Ute Habel

Background: Facial expressions, prosody, and speech content constitute channels by which information is exchanged. Little is known about the simultaneous and differential contribution of these channels to empathy when they provide emotionality or neutrality. Neutralised speech content in particular has received little attention with regard to its influence on the perception of other emotional cues. Methods: Participants were presented with video clips of actors telling short stories. One condition conveyed emotionality in all channels, while the other conditions provided neutral speech content, facial expression, or prosody, respectively. Participants judged the emotion and intensity presented, as well as their own emotional state and its intensity. Skin conductance served as a physiological measure of emotional reactivity. Results: Neutralising channels significantly reduced empathic responses, and electrodermal recordings confirmed these findings. The channels contributed differentially to the prerequisites of empathy: recognition of the other's target emotion decreased most when the face was neutral, whereas diminished emotional responses attributed to the target emotion were especially present with neutral speech. Conclusion: Multichannel integration supports conscious and autonomous measures of empathy and emotional reactivity. Emotional facial expressions influence emotion recognition, whereas speech content is important for responding with an adequate own emotional state, possibly reflecting contextual emotion appraisal.


Pain | 2016

Brain activations during pain: a neuroimaging meta-analysis of patients with pain and healthy controls.

Karin B. Jensen; Christina Regenbogen; Margarete C. Ohse; Johannes Frasnelli; Jessica Freiherr; Johan N. Lundström

In response to recent publications from pain neuroimaging experiments, there has been a debate about the existence of a primary pain region in the brain. Yet, there are few meta-analyses providing assessments of the minimum cerebral denominators of pain. Here, we used a statistical meta-analysis method, called activation likelihood estimation, to define (1) core brain regions activated by pain per se, irrespective of pain modality, paradigm, or participants and (2) activation likelihood estimation commonalities and differences between patients with chronic pain and healthy individuals. A subtraction analysis of 138 independent data sets revealed that the minimum denominator for activation across pain modalities and paradigms included the right insula, secondary sensory cortex, and right anterior cingulate cortex (ACC). Common activations for healthy subjects and patients with pain alike included the thalamus, ACC, insula, and cerebellum. A comparative analysis revealed that healthy individuals were more likely to activate the cingulum, thalamus, and insula. Our results point toward the central role of the insular cortex and ACC in pain processing, irrespective of modality, body part, or clinical experience, further underscoring ACC and insular activation as key regions for the human experience of pain.
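
Activation likelihood estimation, as used in this meta-analysis, treats each reported activation focus as a 3D Gaussian probability distribution and combines experiments through a voxel-wise probabilistic union. The sketch below illustrates only that core idea; the real algorithm additionally scales the Gaussian width by each study's sample size and assesses significance against a permutation-based null distribution, and all coordinates and names here are made up.

```python
import numpy as np

def focus_probability(shape, center, fwhm_vox=4.0):
    """3D Gaussian probability map for one reported activation focus."""
    sigma = fwhm_vox / (2.0 * np.sqrt(2.0 * np.log(2.0)))
    ii, jj, kk = np.indices(shape)
    d2 = (ii - center[0]) ** 2 + (jj - center[1]) ** 2 + (kk - center[2]) ** 2
    g = np.exp(-d2 / (2.0 * sigma ** 2))
    return g / g.sum()  # probability mass distributed over voxels

def ale_map(experiments, shape=(40, 48, 40)):
    """Voxel-wise ALE value: probabilistic union of the modelled activation
    (MA) maps of all experiments. Each experiment is a list of the voxel
    coordinates of its reported foci."""
    ale = np.zeros(shape)
    for foci in experiments:
        # MA map of one experiment: voxel-wise maximum over its foci,
        # so multiple nearby foci from the same study are not double-counted
        ma = np.max([focus_probability(shape, f) for f in foci], axis=0)
        ale = 1.0 - (1.0 - ale) * (1.0 - ma)  # union across experiments
    return ale

# Hypothetical usage with two toy "experiments" reporting foci in voxel space
experiments = [[(20, 24, 20)], [(21, 25, 20), (10, 10, 10)]]
print(ale_map(experiments).max())
```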


Proceedings of the National Academy of Sciences of the United States of America | 2017

Behavioral and neural correlates to multisensory detection of sick humans

Christina Regenbogen; John Axelsson; Julie Lasselin; Danja Porada; Tina Sundelin; Moa Peter; Mats Lekander; Johan N. Lundström; Mats J. Olsson

Significance: In the perpetual race between evolving organisms and pathogens, the human immune system has evolved to reduce the harm of infections. As part of such a system, avoidance of contagious individuals would increase biological fitness. The present study shows that we can detect both facial and olfactory cues of sickness in others just hours after experimental activation of their immune system. The study further demonstrates that multisensory integration of these olfactory and visual sickness cues is a crucial mechanism for how we detect and socially evaluate sick individuals. Thus, by motivating the avoidance of sick conspecifics, olfactory–visual cues, both in isolation and integrated, may be important parts of circuits handling imminent threats of contagion.

Throughout human evolution, infectious diseases have been a primary cause of death. Detection of subtle cues indicating sickness and avoidance of sick conspecifics would therefore be an adaptive way of coping with an environment fraught with pathogens. This study determines how humans perceive and integrate early cues of sickness in conspecifics sampled just hours after the induction of immune system activation, and the underlying neural mechanisms for this detection. In a double-blind placebo-controlled crossover design, the immune system in 22 sample donors was transiently activated with an endotoxin injection [lipopolysaccharide (LPS)]. Facial photographs and body odor samples were taken from the same donors when “sick” (LPS-injected) and when “healthy” (saline-injected) and subsequently were presented to a separate group of participants (n = 30) who rated their liking of the presented person during fMRI scanning. Faces were less socially desirable when sick, and sick body odors tended to lower liking of the faces. Sickness status presented by odor and facial photograph resulted in increased neural activation of odor- and face-perception networks, respectively. A superadditive effect of olfactory–visual integration of sickness cues was found in the intraparietal sulcus, which was functionally connected to core areas of multisensory integration in the superior temporal sulcus and orbitofrontal cortex. Taken together, the results outline a disease-avoidance model in which neural mechanisms involved in the detection of disease cues and multisensory integration are vital parts.
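
A superadditive multisensory effect is conventionally defined as a bimodal response exceeding the sum of the unimodal responses (AV > A + V), which can be expressed as a simple GLM contrast over condition estimates. The snippet below is a generic illustration of that criterion with made-up beta values and an assumed column order; it is not the authors' actual contrast specification.

```python
import numpy as np

# Hypothetical per-voxel GLM parameter estimates (betas) for three conditions.
# The column order [visual_only, olfactory_only, bimodal] is an assumption.
betas = np.array([
    [0.4, 0.3, 1.1],   # voxel whose bimodal response exceeds the unimodal sum
    [0.5, 0.5, 0.9],   # voxel with a sub-additive bimodal response
])

# Superadditivity criterion (bimodal > visual + olfactory) as a GLM contrast
contrast = np.array([-1.0, -1.0, 1.0])
effect = betas @ contrast
print(effect)        # [ 0.4 -0.1]
print(effect > 0)    # [ True False]
```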


Autism Research | 2013

Evidence for Gender‐Specific Endophenotypes in High‐Functioning Autism Spectrum Disorder During Empathy

Karla Schneider; Christina Regenbogen; Katharina Pauly; Anna Gossen; Daniel A. Schneider; Lea Mevissen; Tanja Maria Michel; Ruben C. Gur; Ute Habel; Frank Schneider

Despite remarkable behavioral gender differences in patients with autism spectrum disorder (ASD), and growing evidence for a diminished male:female ratio for the putative “male disorder” ASD, aspects of gender are not addressed accordingly in ASD research. Our study aims to fill this gap by exploring empathy abilities in a group of 28 patients with high‐functioning ASD and 28 gender‐, age‐ and education‐matched non‐autistic subjects, for the first time by means of functional neuroimaging (fMRI). In an event‐related fMRI paradigm, emotional (“E”) and neutral (“N”) video clips presented actors telling self‐related short stories. After each clip, participants were asked to indicate their own emotion and its intensity as well as the emotion and intensity perceived for the actor. Behaviorally, we found significantly reduced empathic responses in the overall ASD group compared with non‐autistic subjects, and inadequate emotion recognition for the neutral clips in the female ASD group compared with healthy women. Neurally, increased activation of the bilateral medial frontal gyrus was found in male patients compared with female patients, a pattern that was not present in the non‐autistic group. Additionally, autistic women exhibited decreased activation of midbrain and limbic regions compared with non‐autistic women, whereas there was no significant difference within the male group. While we did not find a fundamental empathic deficit in autistic patients, our data suggest different ways of processing empathy in autistic men and women, suggesting stronger impairments in cognitive aspects of empathy/theory of mind for men, and alterations of social reciprocity for women. Autism Res 2013, 6: 506–521.


British Journal of Psychiatry | 2015

Neural responses to dynamic multimodal stimuli and pathology-specific impairments of social cognition in schizophrenia and depression

Christina Regenbogen; Thilo Kellermann; Janina Seubert; Daniel A. Schneider; Raquel E. Gur; Birgit Derntl; Frank Schneider; Ute Habel

Background: Individuals with schizophrenia and people with depression both show abnormal behavioural and neural responses when perceiving and responding to emotional stimuli, but pathology-specific differences and commonalities remain mostly unclear. Aims: To directly compare empathic responses to dynamic multimodal emotional stimuli in a group with schizophrenia and a group with depression, and to investigate their neural correlates using functional magnetic resonance imaging (fMRI). Method: The schizophrenia group (n = 20), the depression group (n = 24) and a control group (n = 24) were presented with portrait-shot video clips expressing emotion through three possible communication channels: facial expression, prosody and content. Participants rated their own and the actor's emotional state as an index of empathy. Results: Although no group differences were found in empathy ratings, characteristic differences emerged in the fMRI activation patterns. The schizophrenia group demonstrated aberrant activation patterns during the neutral speech content condition in regions implicated in multimodal integration and the formation of semantic constructs. Those in the depression group were most affected during conditions with trimodal emotional and trimodal neutral stimuli, in key regions of the mentalising network. Conclusions: Our findings reveal characteristic differences between patients with schizophrenia and those with depression in their cortical responses to dynamic affective stimuli. These differences indicate that impairments in responding to emotional stimuli may be caused by pathology-specific problems in social cognition.


PLOS ONE | 2012

Auditory Processing under Cross-Modal Visual Load Investigated with Simultaneous EEG-fMRI

Christina Regenbogen; Maarten De Vos; Stefan Debener; Bruce I. Turetsky; Carolin Mößnang; Andreas Finkelmeyer; Ute Habel; Irene Neuner; Thilo Kellermann

Cognitive task demands in one sensory modality (T1) can have beneficial effects on a secondary task (T2) in a different modality, due to reduced top-down control needed to inhibit the secondary task, as well as crossmodal spread of attention. This contrasts with findings that cognitive load compromises processing in a secondary modality. We manipulated cognitive load within one modality (visual) and studied the consequences of cognitive demands on secondary (auditory) processing. Fifteen healthy participants underwent a simultaneous EEG-fMRI experiment. Data from 8 participants were obtained outside the scanner for validation purposes. The primary task (T1) was a visual working memory (WM) task with four conditions, while the secondary task (T2) consisted of an auditory oddball stream that participants were asked to ignore. The fMRI results revealed fronto-parietal WM network activations in response to the T1 task manipulation. This was accompanied by significantly higher reaction times and lower hit rates with increasing task difficulty, which confirmed successful manipulation of WM load. Amplitudes of auditory evoked potentials, representing fundamental auditory processing, showed a continuous augmentation that was systematically related to cross-modal cognitive load. With increasing WM load, primary auditory cortices were increasingly deactivated, while psychophysiological interaction results suggested emerging connectivity between auditory cortices and visual WM regions. These results suggest differential effects of crossmodal attention on fundamental auditory processing. We suggest a continuous allocation of resources to brain regions processing primary tasks when the central executive is challenged under high cognitive load.
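
The auditory evoked potentials reported here are obtained by epoching the continuous EEG around tone onsets and averaging within each working-memory load level. The sketch below shows that generic ERP-averaging step for a single channel with synthetic data; the sampling rate, window limits, and event structure are illustrative assumptions rather than the authors' pipeline.

```python
import numpy as np

def evoked_by_load(eeg, events, srate=500, tmin=-0.1, tmax=0.4):
    """Average single-channel EEG epochs around tone onsets, separately for
    each working-memory load level. `events` holds (onset_sample, load) pairs.
    A generic ERP-averaging sketch, not the authors' pipeline."""
    pre, post = int(-tmin * srate), int(tmax * srate)
    epochs = {}
    for onset, load in events:
        epoch = eeg[onset - pre: onset + post]
        epoch = epoch - epoch[:pre].mean()  # baseline correction
        epochs.setdefault(load, []).append(epoch)
    return {load: np.mean(eps, axis=0) for load, eps in epochs.items()}

# Hypothetical usage with synthetic data: compare the amplitude ~100 ms
# post-stimulus (roughly the N1 latency) across four load levels.
rng = np.random.default_rng(1)
eeg = rng.standard_normal(60_000)                                   # ~2 min at 500 Hz
events = [(s, i % 4) for i, s in enumerate(range(1_000, 59_000, 700))]
erps = evoked_by_load(eeg, events)
idx_100ms = int((0.1 - (-0.1)) * 500)                               # sample index of +100 ms
print({load: round(float(erp[idx_100ms]), 3) for load, erp in erps.items()})
```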


Frontiers in Human Neuroscience | 2013

Connecting multimodality in human communication.

Christina Regenbogen; Ute Habel; Thilo Kellermann

A successful reciprocal evaluation of social signals serves as a prerequisite for social coherence and empathy. In a previous fMRI study, we investigated naturalistic communication situations by presenting video clips to our participants and recording their behavioral responses regarding empathy and its components. In two conditions, all three channels transported congruent emotional or neutral information, respectively. Three conditions selectively presented two emotional channels and one neutral channel and were thus bimodally emotional. We reported channel-specific emotional contributions in modality-related areas, elicited by dynamic video clips with varying combinations of emotionality in facial expressions, prosody, and speech content. However, to better understand the underlying mechanisms accompanying a naturalistically displayed human social interaction in key regions that presumably serve as specific processing hubs for facial expressions, prosody, and speech content, we pursued a reanalysis of the data. Here, we focused on two different descriptions of temporal characteristics within three modality-related regions [right fusiform gyrus (FFG), left auditory cortex (AC), and left angular gyrus (AG)] and their coupling with the left dorsomedial prefrontal cortex (dmPFC). First, by means of a finite impulse response (FIR) analysis within each of the three regions, we examined the post-stimulus time courses as a description of the temporal characteristics of the BOLD response during the video clips. Second, effective connectivity between these areas and the left dmPFC was analyzed using dynamic causal modeling (DCM) in order to describe condition-related modulatory influences on the coupling between these regions. The FIR analysis showed initially diminished activation in bimodally emotional conditions but stronger activation than that observed in neutral videos toward the end of the stimuli, possibly reflecting bottom-up processes compensating for a lack of emotional information. The DCM analysis instead showed pronounced top-down control. Remarkably, all connections from the dmPFC to the three other regions were modulated by the experimental conditions. This observation is in line with the presumed role of the dmPFC in the allocation of attention. By contrast, all incoming connections to the AG were modulated, indicating its key role in integrating multimodal information and supporting comprehension. Notably, the input from the FFG to the AG was enhanced when facial expressions conveyed emotional information. These findings serve as preliminary results toward understanding network dynamics in human emotional communication and empathy.
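
A finite impulse response (FIR) model, as used in the reanalysis above, estimates the post-stimulus BOLD time course freely by assigning one regressor per post-stimulus time bin and condition instead of assuming a canonical HRF shape. The sketch below builds such a design matrix and recovers per-condition time courses by least squares; the onsets, bin count, and synthetic signal are illustrative assumptions, not the authors' exact model.

```python
import numpy as np

def fir_design(onsets_by_condition, n_scans, n_bins=10):
    """FIR design matrix: one column per post-stimulus time bin and condition,
    so the shape of the BOLD response is estimated freely rather than assumed.
    Onsets are given in scans."""
    cols = []
    for onsets in onsets_by_condition:
        for b in range(n_bins):
            col = np.zeros(n_scans)
            for onset in onsets:
                if onset + b < n_scans:
                    col[onset + b] = 1.0
            cols.append(col)
    return np.column_stack(cols)

# Hypothetical usage: a least-squares fit yields a 10-point post-stimulus
# time course per condition for one region's (here synthetic) BOLD signal.
n_scans = 300
onsets = [[10, 70, 130, 190, 250], [40, 100, 160, 220, 280]]
X = fir_design(onsets, n_scans)
rng = np.random.default_rng(2)
y = rng.standard_normal(n_scans)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
timecourses = beta.reshape(len(onsets), -1)   # rows: conditions, columns: bins
print(timecourses.shape)                      # (2, 10)
```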


Neuropsychologia | 2016

Bayesian-based integration of multisensory naturalistic perithreshold stimuli

Christina Regenbogen; Emilia Johansson; Patrik Andersson; Mats J. Olsson; Johan N. Lundström

Most studies exploring multisensory integration have used clearly perceivable stimuli. According to the principle of inverse effectiveness, the added neural and behavioral benefit of integrating clear stimuli is reduced in comparison to stimuli with degraded and less salient unisensory information. Traditionally, speed and accuracy measures have been analyzed separately with few studies merging these to gain an understanding of speed-accuracy trade-offs in multisensory integration. In two separate experiments, we assessed multisensory integration of naturalistic audio-visual objects consisting of individually-tailored perithreshold dynamic visual and auditory stimuli, presented within a multiple-choice task, using a Bayesian Hierarchical Drift Diffusion Model that combines response time and accuracy. For both experiments, unisensory stimuli were degraded to reach a 75% identification accuracy level for all individuals and stimuli to promote multisensory binding. In Experiment 1, we subsequently presented uni- and their respective bimodal stimuli followed by a 5-alternative-forced-choice task. In Experiment 2, we controlled for low-level integration and attentional differences. Both experiments demonstrated significant superadditive multisensory integration of bimodal perithreshold dynamic information. We present evidence that the use of degraded sensory stimuli may provide a link between previous findings of inverse effectiveness on a single neuron level and overt behavior. We further suggest that a combined measure of accuracy and reaction time may be a more valid and holistic approach of studying multisensory integration and propose the application of drift diffusion models for studying behavioral correlates as well as brain-behavior relationships of multisensory integration.
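
The drift diffusion model used here treats each choice as noisy evidence accumulation toward a decision boundary, so response time and accuracy arise from a single latent process. The simulation below illustrates that generative process in its simplest two-choice form (constant drift, symmetric boundaries, fixed non-decision time); the study's multiple-choice task and the Bayesian hierarchical parameter estimation would require extensions beyond this sketch, and all parameter values are illustrative.

```python
import numpy as np

def simulate_ddm(n_trials, drift=1.0, boundary=2.0, ndt=0.3,
                 noise=1.0, dt=0.001, seed=0):
    """Simulate the basic two-choice drift diffusion process: evidence starts
    at zero and accumulates with rate `drift` plus Gaussian noise until it
    crosses +boundary/2 (correct) or -boundary/2 (error); response time is the
    first-passage time plus the non-decision time `ndt`."""
    rng = np.random.default_rng(seed)
    rts, correct = [], []
    for _ in range(n_trials):
        x, t = 0.0, 0.0
        while abs(x) < boundary / 2.0:
            x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
            t += dt
        rts.append(t + ndt)
        correct.append(x > 0)
    return np.array(rts), np.array(correct)

# Hypothetical usage: a higher drift rate (e.g. for bimodal stimuli) produces
# faster and more accurate responses from the same latent process.
for v in (0.8, 1.6):
    rt, acc = simulate_ddm(500, drift=v)
    print(f"drift={v}: mean RT = {rt.mean():.2f} s, accuracy = {acc.mean():.2f}")
```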


Social Cognitive and Affective Neuroscience | 2016

Task-irrelevant fear enhances amygdala-FFG inhibition and decreases subsequent face processing

Barbara Schulte Holthausen; Ute Habel; Thilo Kellermann; Patrick David Schelenz; Frank Schneider; J. Christopher Edgar; Bruce I. Turetsky; Christina Regenbogen

Facial threat is associated with changes in limbic activity as well as modifications in the cortical face-related N170. It remains unclear whether task-irrelevant threat modulates the response to a subsequent facial stimulus, and whether the amygdala's role in early threat perception is independent and direct, or modulatory. In 19 participants, crowds of emotional faces were followed by target faces and a rating task while simultaneous EEG-fMRI was recorded. In addition to conventional analyses, fMRI-informed EEG analyses and fMRI dynamic causal modeling (DCM) were performed. Fearful crowds reduced EEG N170 target-face amplitudes and increased responses in an fMRI network comprising insula, amygdala and inferior frontal cortex. Multimodal analyses showed that the amygdala response was present ∼60 ms before the right fusiform gyrus-derived N170. DCM indicated inhibitory connections from the amygdala to the fusiform gyrus, which were strengthened when fearful crowds preceded a target face. The results demonstrated the suppressing influence of task-irrelevant fearful crowds on subsequent face processing. The amygdala may be sensitive to task-irrelevant fearful crowds and subsequently strengthen its inhibitory influence on face-responsive fusiform N170 generators. This provides spatiotemporal evidence for a feedback mechanism of the amygdala that narrows attention in order to focus on potential threats.

Collaboration


Dive into Christina Regenbogen's collaborations.

Top Co-Authors

Ute Habel
RWTH Aachen University

Johan N. Lundström
Monell Chemical Senses Center

Bruce I. Turetsky
University of Pennsylvania

Raquel E. Gur
University of Pennsylvania