Publications


Featured research published by Dirk Wildgruber.


Human Brain Mapping | 2001

Sensorimotor mapping of the human cerebellum: fMRI evidence of somatotopic organization.

Wolfgang Grodd; Ernst Hülsmann; Martin Lotze; Dirk Wildgruber; Michael Erb

Functional magnetic resonance imaging (fMRI) was employed to determine areas of activation in the cerebellar cortex in 46 human subjects during a series of motor tasks. To reduce the variance due to differences in individual anatomy, a specific transformational procedure for the cerebellum was introduced. The activation areas for movements of lips, tongue, hands, and feet were determined and found to be sharply confined to lobules and sublobules and their sagittal zones in the rostral and caudal spino‐cerebellar cortex. There was a clear symmetry mirroring at the midline. The activation mapped as two distinct homunculoid representations. One, a more extended representation, was located upside down in the superior cerebellum, and a second one, doubled and smaller, in the inferior cerebellum. The two representations were remarkably similar to those proposed by Snider and Eldred [1951] five decades ago. In the upper representation, an intralimb somatotopy for the right elbow, wrist, and fingers was revealed. The maps seem to confirm earlier electrophysiological findings of sagittal zones in animals. They differ, however, from micromapping reports on fractured somatotopic maps in the cerebellar cortex of mammals. We assume that the representations that we observed are not solely the result of spatial integration of hemodynamic events underlying the fMRI method and may reflect integration of afferent peripheral and central information in the cerebellar cortex.


NeuroImage | 2005

Identification of emotional intonation evaluated by fMRI.

Dirk Wildgruber; Axel Riecker; Ingo Hertrich; Michael Erb; Wolfgang Grodd; Thomas Ethofer; Hermann Ackermann

During acoustic communication among human beings, emotional information can be expressed both by the propositional content of verbal utterances and by the modulation of speech melody (affective prosody). It is well established that linguistic processing is bound predominantly to the left hemisphere of the brain. By contrast, the encoding of emotional intonation has been assumed to depend specifically upon right-sided cerebral structures. However, prior clinical and functional imaging studies yielded discrepant data with respect to interhemispheric lateralization and intrahemispheric localization of brain regions contributing to processing of affective prosody. In order to delineate the cerebral network engaged in the perception of emotional tone, functional magnetic resonance imaging (fMRI) was performed during recognition of prosodic expressions of five different basic emotions (happy, sad, angry, fearful, and disgusted) and during phonetic monitoring of the same stimuli. As compared to baseline at rest, both tasks yielded widespread bilateral hemodynamic responses within frontal, temporal, and parietal areas, the thalamus, and the cerebellum. A comparison of the respective activation maps, however, revealed comprehension of affective prosody to be bound to a distinct right-hemisphere pattern of activation, encompassing posterior superior temporal sulcus (Brodmann Area [BA] 22), dorsolateral (BA 44/45), and orbitobasal (BA 47) frontal areas. Activation within left-sided speech areas, in contrast, was observed during the phonetic task. These findings indicate that partially distinct cerebral networks subserve processing of phonetic and intonational information during speech perception.
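
For readers who want to see the shape of such an analysis, here is a minimal sketch of a two-condition block-design GLM contrast in Python, loosely analogous to contrasting the prosody-recognition and phonetic-monitoring tasks. Every number, the double-gamma HRF, and the simulated voxel are illustrative assumptions, not the study's actual design or pipeline.

import numpy as np
from scipy.stats import gamma

TR, n_scans = 2.0, 200
t = np.arange(0, 30, TR)
hrf = gamma.pdf(t, 6) - 0.35 * gamma.pdf(t, 16)  # double-gamma-like HRF (assumed)
hrf /= hrf.sum()

def regressor(onsets, dur=20):
    # boxcar for each block, convolved with the HRF
    box = np.zeros(n_scans)
    for o in onsets:
        box[int(o / TR):int((o + dur) / TR)] = 1.0
    return np.convolve(box, hrf)[:n_scans]

# alternating blocks of the two tasks (timings are made up)
X = np.column_stack([
    regressor(np.arange(0, 400, 80)),    # "emotion recognition" blocks
    regressor(np.arange(40, 400, 80)),   # "phonetic monitoring" blocks
    np.ones(n_scans),                    # constant term
])

rng = np.random.default_rng(0)
y = X @ np.array([1.5, 0.4, 100.0]) + rng.normal(0, 1.0, n_scans)  # one fake voxel

beta, res, *_ = np.linalg.lstsq(X, y, rcond=None)
c = np.array([1.0, -1.0, 0.0])           # contrast: task A > task B
dof = n_scans - np.linalg.matrix_rank(X)
t_stat = c @ beta / np.sqrt(res[0] / dof * c @ np.linalg.pinv(X.T @ X) @ c)
print(f"contrast t = {t_stat:.2f}")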


Neurology | 2005

fMRI reveals two distinct cerebral networks subserving speech motor control

Axel Riecker; Krystyna A. Mathiak; Dirk Wildgruber; Michael Erb; Ingo Hertrich; Wolfgang Grodd; Hermann Ackermann

Background: There are few data on the cerebral organization of motor aspects of speech production and the pathomechanisms of dysarthric deficits subsequent to brain lesions and diseases. The authors used fMRI to further examine the neural basis of speech motor control. Methods and Results: In eight healthy volunteers, fMRI was performed during syllable repetitions synchronized to click trains (2 to 6 Hz; vs a passive listening task). Bilateral hemodynamic responses emerged at the level of the mesiofrontal and sensorimotor cortex, putamen/pallidum, thalamus, and cerebellum (two distinct activation spots at either side). In contrast, dorsolateral premotor cortex and anterior insula showed left-sided activation. Calculation of rate/response functions revealed a negative linear relationship between repetition frequency and blood oxygen level–dependent (BOLD) signal change within the striatum, whereas both cerebellar hemispheres exhibited a step-wise increase of activation at ∼3 Hz. Analysis of the temporal dynamics of the BOLD effect found the various cortical and subcortical brain regions engaged in speech motor control to be organized into two separate networks (medial and dorsolateral premotor cortex, anterior insula, and superior cerebellum vs sensorimotor cortex, basal ganglia, and inferior cerebellum). Conclusion: These data provide evidence for two levels of speech motor control, presumably bound to motor preparation and execution processes. They also help to explain clinical observations such as an unimpaired or even accelerated speaking rate in Parkinson disease and slowed speech tempo, which does not fall below a rate of 3 Hz, in cerebellar disorders.
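
The rate/response profiles described above can be illustrated with a toy model comparison: fit a linear function and a step function (threshold near 3 Hz) to BOLD signal change across repetition rates and compare residuals. The data values below are invented for demonstration; only the two model shapes come from the study.

import numpy as np

rates = np.array([2.0, 3.0, 4.0, 5.0, 6.0])
bold_striatum   = np.array([1.8, 1.5, 1.1, 0.8, 0.5])   # fake: decreases with rate
bold_cerebellum = np.array([0.4, 1.2, 1.3, 1.25, 1.3])  # fake: jump near 3 Hz

def sse_linear(y):
    # residual sum of squares of the best-fitting line
    slope, icpt = np.polyfit(rates, y, 1)
    return ((y - (slope * rates + icpt)) ** 2).sum(), slope

def sse_step(y, threshold=3.0):
    # two-plateau step model: mean below vs at/above the threshold
    low, high = y[rates < threshold].mean(), y[rates >= threshold].mean()
    pred = np.where(rates < threshold, low, high)
    return ((y - pred) ** 2).sum()

for name, y in [("striatum", bold_striatum), ("cerebellum", bold_cerebellum)]:
    lin_sse, slope = sse_linear(y)
    print(f"{name}: linear SSE={lin_sse:.3f} (slope {slope:+.2f}), "
          f"step SSE={sse_step(y):.3f}")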


Progress in Brain Research | 2006

Cerebral processing of linguistic and emotional prosody: fMRI studies.

Dirk Wildgruber; Hermann Ackermann; Benjamin Kreifelts; Thomas Ethofer

During acoustic communication in humans, information about a speaker's emotional state is predominantly conveyed by modulation of the tone of voice (emotional or affective prosody). Based on lesion data, a right hemisphere superiority for cerebral processing of emotional prosody has been assumed. However, the available clinical studies do not yet provide a coherent picture with respect to interhemispheric lateralization effects of prosody recognition and intrahemispheric localization of the respective brain regions. To further delineate the cerebral network engaged in the perception of emotional tone, a series of experiments was carried out based upon functional magnetic resonance imaging (fMRI). The findings obtained from these investigations allow for the separation of three successive processing stages during recognition of emotional prosody: (1) extraction of suprasegmental acoustic information predominantly subserved by right-sided primary and higher order acoustic regions; (2) representation of meaningful suprasegmental acoustic sequences within posterior aspects of the right superior temporal sulcus; (3) explicit evaluation of emotional prosody at the level of the bilateral inferior frontal cortex. Moreover, implicit processing of affective intonation seems to be bound to subcortical regions mediating automatic induction of specific emotional reactions such as activation of the amygdala in response to fearful stimuli. As concerns lower level processing of the underlying suprasegmental acoustic cues, linguistic and emotional prosody seem to share the same right hemisphere neural resources. Explicit judgment of linguistic aspects of speech prosody, however, appears to be linked to left-sided language areas, whereas bilateral orbitofrontal cortex has been found to be involved in explicit evaluation of emotional prosody. These differences in hemispheric lateralization might explain why specific impairments in nonverbal emotional communication subsequent to focal brain lesions are relatively rare clinical observations compared with the more frequent aphasic disorders.


NeuroImage | 2007

Audiovisual integration of emotional signals in voice and face: An event-related fMRI study

Benjamin Kreifelts; Thomas Ethofer; Wolfgang Grodd; Michael Erb; Dirk Wildgruber

In a natural environment, non-verbal emotional communication is multimodal (i.e. speech melody, facial expression) and multifaceted concerning the variety of expressed emotions. Understanding these communicative signals and integrating them into a common percept is paramount to successful social behaviour. While many previous studies have focused on the neurobiology of emotional communication in the auditory or visual modality alone, far less is known about multimodal integration of auditory and visual non-verbal emotional information. The present study investigated this process using event-related fMRI. Behavioural data revealed that audiovisual presentation of non-verbal emotional information resulted in a significant increase in correctly classified stimuli when compared with visual and auditory stimulation. This behavioural gain was paralleled by enhanced activation in bilateral posterior superior temporal gyrus (pSTG) and right thalamus, when contrasting audiovisual to auditory and visual conditions. Further, a characteristic of these brain regions, substantiating their role in the emotional integration process, is a linear relationship between the gain in classification accuracy and the strength of the BOLD response during the bimodal condition. Additionally, enhanced effective connectivity between audiovisual integration areas and associative auditory and visual cortices was observed during audiovisual stimulation, offering further insight into the neural process accomplishing multimodal integration. Finally, we were able to document an enhanced sensitivity of the putative integration sites to stimuli with emotional non-verbal content as compared to neutral stimuli.
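
The reported linear relationship between classification gain and BOLD response can be sketched as a simple across-subject correlation. The subject values below are fabricated placeholders; the behavioural gain (audiovisual minus best unimodal accuracy) is one plausible reading of the comparison described above, not the authors' exact definition.

import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
n_subjects = 20
acc_av = rng.uniform(0.75, 0.95, n_subjects)                 # audiovisual accuracy
acc_best_uni = acc_av - rng.uniform(0.02, 0.12, n_subjects)  # best unimodal accuracy
gain = acc_av - acc_best_uni                                 # behavioural integration gain
bold = 0.8 * gain + rng.normal(0, 0.01, n_subjects)          # fake pSTG beta, bimodal condition

r, p = pearsonr(gain, bold)
print(f"gain-BOLD correlation: r = {r:.2f}, p = {p:.3g}")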


NeuroImage | 2006

Cerebral pathways in processing of affective prosody: A dynamic causal modeling study

Thomas Ethofer; Silke Anders; Michael Erb; Cornelia Herbert; Sarah Wiethoff; Johanna Kissler; Wolfgang Grodd; Dirk Wildgruber

This study was conducted to investigate the connectivity architecture of neural structures involved in processing of emotional speech melody (prosody). Twenty-four subjects underwent event-related functional magnetic resonance imaging (fMRI) while rating the emotional valence of either prosody or semantics of binaurally presented adjectives. Conventional analysis of fMRI data revealed activation within the right posterior middle temporal gyrus and bilateral inferior frontal cortex during evaluation of affective prosody, and within left temporal pole, orbitofrontal, and medial superior frontal cortex during judgment of affective semantics. Dynamic causal modeling (DCM) in combination with Bayes factors was used to compare competing neurophysiological models with different intrinsic connectivity structures and input regions within the network of brain regions underlying comprehension of affective prosody. Comparison at the group level revealed superiority of a model in which the right temporal cortex serves as input region as compared to models in which one of the frontal areas is assumed to receive external inputs. Moreover, models with parallel information conductance from the right temporal cortex were superior to models in which the two frontal lobes accomplish serial processing steps. In conclusion, connectivity analysis supports the view that evaluation of affective prosody requires prior analysis of acoustic features within the temporal cortex and that transfer of information from the temporal cortex to the frontal lobes occurs via parallel pathways.
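
A hedged sketch of the fixed-effects Bayesian model comparison underlying this kind of DCM analysis: log model evidences are summed over subjects, and group Bayes factors are taken as exponentiated differences. Model names and evidence values below are invented stand-ins for the connectivity variants described above.

import numpy as np

# log model evidence per subject for three hypothetical DCM variants
log_ev = {
    "temporal_input_parallel": np.array([-310.2, -295.7, -301.4]),
    "frontal_input":           np.array([-315.9, -299.3, -306.8]),
    "temporal_input_serial":   np.array([-312.5, -297.1, -304.0]),
}

group = {m: ev.sum() for m, ev in log_ev.items()}  # fixed-effects pooling
best = max(group, key=group.get)
for m, le in group.items():
    if m != best:
        bf = np.exp(group[best] - le)              # group Bayes factor vs. best model
        print(f"{best} vs {m}: BF = {bf:.2e}")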


NeuroImage | 2002

Right-Hemispheric Organization of Language Following Early Left-Sided Brain Lesions: Functional MRI Topography

Martin Staudt; Karen Lidzba; Wolfgang Grodd; Dirk Wildgruber; Michael Erb; Ingeborg Krägeloh-Mann

Left-hemispheric (LH) brain lesions acquired early in life can induce language organization in the undamaged right hemisphere (RH). This study addresses the anatomical correlates of language processing in the RH of such individuals. Five hemiparetic patients with left periventricular brain lesions of pre- and perinatal origin were included, in whom fMRI during a word generation task had yielded predominantly RH activation; five age- and sex-matched healthy right-handers served as controls. The patterns of activation in the RH of patients showed a striking similarity with the LH patterns of the normal controls, and voxel-wise comparison failed to detect significant differences. This demonstrates that in patients with early LH damage, RH recruitment for language occurs in brain areas homotopic to the LH regions involved in language processing under normal circumstances.
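
One way to picture the voxel-wise comparison described above: flip the patients' right-hemisphere maps across the midline and test them voxel by voxel against the controls' left-hemisphere maps. The sketch below is an assumed reconstruction with random data, not the authors' code; the volume shape, axis convention, and threshold are all illustrative.

import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(1)
shape = (32, 32, 32)                       # toy volume, first axis = left-right
controls = rng.normal(0, 1, (5, *shape))   # LH-dominant control maps (fake)
patients = rng.normal(0, 1, (5, *shape))   # RH-dominant patient maps (fake)

patients_flipped = patients[:, ::-1, :, :]  # mirror across the midline
t, p = ttest_ind(controls, patients_flipped, axis=0)
n_sig = (p < 0.001).sum()
print(f"{n_sig} voxels differ at p<0.001 (uncorrected) out of {p.size}")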


NeuroImage | 2003

Parametric analysis of rate-dependent hemodynamic response functions of cortical and subcortical brain structures during auditorily cued finger tapping: an fMRI study.

Axel Riecker; Dirk Wildgruber; Klaus Mathiak; Wolfgang Grodd; Hermann Ackermann

A multitude of functional imaging studies revealed a mass activation effect at the level of the sensorimotor cortex during repetitive finger-tapping or finger-to-thumb opposition tasks in terms of either a stepwise or a monotonic relationship between movement rate and hemodynamic response. With respect to subcortical structures of the central motor system, there is, by contrast, some preliminary evidence for nonlinear rate/response functions within basal ganglia and cerebellum. To further specify these hemodynamic mechanisms, functional magnetic resonance imaging (fMRI) was performed during a finger-tapping task in response to acoustic stimuli (six different frequencies: 2.0, 2.5, 3.0, 4.0, 5.0 and 6.0 Hz; applied via headphones). Passive listening to the same auditory stimuli served as a control condition. Statistical evaluation of the obtained data considered two approaches: categorical and parametric analysis. As expected, the magnitude of the elicited hemodynamic response within left sensorimotor cortex (plateau phase at frequencies above 4 Hz) and mesiofrontal cortex paralleled movement rate. The observed bipartite mesial response pattern presumably reflects functional compartmentalization of the supplementary motor area (SMA) into a rostral component (pre-SMA) and a caudal component (SMA proper). At the level of the cerebellum, two significant hemodynamic responses within the hemisphere ipsilateral to the hand engaged in finger tapping (anterior/posterior quadrangular lobule and posterior quadrangular lobule) could be observed. Both activation foci exhibited a stepwise rate/response function. In accordance with clinical findings, these data indicate different cerebellar contributions to motor control at frequencies below or above about 3 Hz, respectively. Caudate nucleus, putamen, and external pallidum of the left hemisphere displayed, by contrast, a negative linear rate/response relationship. The physiological significance of these latter findings remains to be clarified.
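
The categorical-versus-parametric distinction mentioned above can be made concrete with a small design-matrix sketch: a categorical task regressor plus a mean-centred tapping rate entered as a parametric modulator. Block timing and the rate schedule below are invented; in a real analysis both regressors would be convolved with an HRF.

import numpy as np

TR, n_scans = 2.0, 180
block_len = 10                                    # scans per block
rates = np.array([2.0, 2.5, 3.0, 4.0, 5.0, 6.0])  # Hz, one rate per task block

categorical = np.zeros(n_scans)
parametric = np.zeros(n_scans)
for i, rate in enumerate(rates):
    start = (2 * i + 1) * block_len               # alternate rest/task blocks
    categorical[start:start + block_len] = 1.0
    parametric[start:start + block_len] = rate

parametric[categorical == 1] -= rates.mean()      # mean-centre the modulator
X = np.column_stack([categorical, parametric, np.ones(n_scans)])
print("design matrix shape:", X.shape)            # convolve with an HRF in practice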


NeuroImage | 2008

Cerebral processing of emotional prosody: influence of acoustic parameters and arousal.

Sarah Wiethoff; Dirk Wildgruber; Benjamin Kreifelts; Hubertus G. T. Becker; Cornelia Herbert; Wolfgang Grodd; Thomas Ethofer

The human brain has a preference for processing of emotionally salient stimuli. In the auditory modality, emotional prosody can induce such involuntary biasing of processing resources. To investigate the neural correlates underlying automatic processing of emotional information in the voice, words spoken in neutral, happy, erotic, angry, and fearful prosody were presented in a passive-listening functional magnetic resonance imaging (fMRI) experiment. Hemodynamic responses in right mid superior temporal gyrus (STG) were significantly stronger for all emotional than for neutral intonations. To disentangle the contribution of basic acoustic features and emotional arousal to this activation, the relation between event-related responses and these parameters was evaluated by means of regression analyses. A significant linear dependency between hemodynamic responses of right mid STG and mean intensity, mean fundamental frequency, variability of fundamental frequency, duration, and arousal of the stimuli was observed. While none of the acoustic parameters alone explained the stronger responses of right mid STG to emotional relative to neutral prosody, this stronger responsiveness was abolished by correcting either for arousal or for the conjoint effect of the acoustic parameters. In conclusion, our results demonstrate that right mid STG is sensitive to various emotions conveyed by prosody, an effect which is driven by a combination of acoustic features that express the emotional arousal in the speaker's voice.
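
The regression logic of this study can be sketched as follows: regress per-stimulus responses on acoustic parameters and arousal, then check whether the emotional-versus-neutral difference survives after residualising out those covariates. All values and the emotional/neutral split below are fabricated; only the analysis pattern mirrors the one described above.

import numpy as np

rng = np.random.default_rng(2)
n = 120
arousal = rng.uniform(1, 9, n)            # arousal rating per stimulus (fake)
intensity = rng.normal(70, 5, n)          # mean intensity in dB (fake)
f0_mean = rng.normal(200, 40, n)          # mean fundamental frequency in Hz (fake)
emotional = (arousal > 4).astype(float)   # crude emotional/neutral split
bold = 0.05 * arousal + 0.002 * intensity + rng.normal(0, 0.1, n)  # fake STG response

def residualise(y, covars):
    # remove the variance explained by the covariates (plus intercept)
    X = np.column_stack([*covars, np.ones(n)])
    return y - X @ np.linalg.lstsq(X, y, rcond=None)[0]

raw_diff = bold[emotional == 1].mean() - bold[emotional == 0].mean()
resid = residualise(bold, [arousal, intensity, f0_mean])
adj_diff = resid[emotional == 1].mean() - resid[emotional == 0].mean()
print(f"emotional-neutral difference: raw {raw_diff:.3f}, adjusted {adj_diff:.3f}")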


Neurology | 2001

Early left periventricular brain lesions induce right hemispheric organization of speech

Martin Staudt; Wolfgang Grodd; G. Niemann; Dirk Wildgruber; Michael Erb; Ingeborg Krägeloh-Mann

Right-hemispheric organization of speech has been observed following early left-sided brain lesions involving the language cortex. The authors studied speech organization in hemiparetic patients with pre- and perinatally acquired lesions in the left periventricular white matter using fMRI, and found that right-hemisphere activation correlated with left facial motor tract involvement. This suggests that the impairment of speech motor output from the left hemisphere plays an important role in this alteration of language representation.

Collaboration


Dirk Wildgruber's top co-authors:

Heike Jacob

University of Tübingen

Uwe Klose

University of Tübingen
