Publication


Featured research published by Henning Holle.


Journal of Cognitive Neuroscience | 2007

The Role of Iconic Gestures in Speech Disambiguation: ERP Evidence

Henning Holle; Thomas C. Gunter

The present series of experiments explored the extent to which iconic gestures convey information not found in speech. Electroencephalogram (EEG) was recorded as participants watched videos of a person gesturing and speaking simultaneously. The experimental sentences contained an unbalanced homonym in the initial part of the sentence (e.g., "She controlled the ball") and were disambiguated at a target word in the subsequent clause ("which during the game" vs. "which during the dance"). Coincident with the initial part of the sentence, the speaker produced an iconic gesture which supported either the dominant or the subordinate meaning. Event-related potentials were time-locked to the onset of the target word. In Experiment 1, participants were explicitly asked to judge the congruency between the initial homonym-gesture combination and the subsequent target word. The N400 at target words was found to be smaller after a congruent gesture and larger after an incongruent gesture, suggesting that listeners can use gestural information to disambiguate speech. Experiment 2 replicated the results using a less explicit task, indicating that the disambiguating effect of gesture is somewhat task-independent. Unrelated grooming movements were added to the paradigm in Experiment 3. The N400 at subordinate targets was found to be smaller after subordinate gestures and larger after dominant gestures as well as grooming, indicating that an iconic gesture can facilitate the processing of a less frequent word meaning. The N400 at dominant targets no longer varied as a function of the preceding gesture in Experiment 3, suggesting that the addition of meaningless movements weakened the impact of gesture. Thus, the integration of gesture and speech in comprehension does not appear to be an obligatory process but is modulated by situational factors such as the amount of observed meaningful hand movements.
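Below is a minimal sketch of the analysis logic the abstract implies: EEG epochs are time-locked to target-word onset, averaged per condition, and the N400 is quantified as the mean amplitude in a window around 300-500 ms. The sampling rate, array shapes and onset indices are illustrative assumptions, not details from the study.

```python
import numpy as np

def n400_mean_amplitude(eeg, onsets, sfreq=500.0, tmin=-0.2, tmax=1.0,
                        win=(0.3, 0.5)):
    """Time-lock EEG to word onsets and return the mean amplitude per channel.

    eeg    : (n_channels, n_samples) continuous recording in microvolts
    onsets : sample indices of target-word onset for one condition
    win    : analysis window in seconds relative to onset (300-500 ms for N400)
    """
    pre, post = int(-tmin * sfreq), int(tmax * sfreq)
    # One epoch per onset, baseline-corrected against the pre-stimulus interval
    epochs = np.stack([eeg[:, t - pre:t + post] for t in onsets])
    epochs -= epochs[:, :, :pre].mean(axis=2, keepdims=True)
    erp = epochs.mean(axis=0)                   # trial average -> ERP
    lo, hi = (int((w - tmin) * sfreq) for w in win)
    return erp[:, lo:hi].mean(axis=1)

# Illustrative usage with random data; onsets would come from the stimulus log.
rng = np.random.default_rng(0)
eeg = rng.normal(size=(32, 60_000))             # 32 channels, 2 min at 500 Hz
congruent = n400_mean_amplitude(eeg, onsets=[5_000, 15_000, 25_000])
incongruent = n400_mean_amplitude(eeg, onsets=[10_000, 20_000, 30_000])
# The N400 is negative-going: a larger N400 after incongruent gestures shows
# up as a more negative mean amplitude in the 300-500 ms window.
print((incongruent - congruent).mean())
```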


NeuroImage | 2008

Neural correlates of the processing of co-speech gestures

Henning Holle; Thomas C. Gunter; Shirley-Ann Rüschemeyer; Andreas Hennenlotter; Marco Iacoboni

In communicative situations, speech is often accompanied by gestures. For example, speakers tend to illustrate certain contents of speech by means of iconic gestures, which are hand movements that bear a formal relationship to the contents of speech. The meaning of an iconic gesture is determined both by its form and by the speech context in which it is performed. Thus, gesture and speech interact in comprehension. Using fMRI, the present study investigated which brain areas are involved in this interaction process. Participants watched videos in which sentences containing an ambiguous word (e.g., "She touched the mouse") were accompanied by either a meaningless grooming movement, a gesture supporting the more frequent dominant meaning (e.g., animal) or a gesture supporting the less frequent subordinate meaning (e.g., computer device). We hypothesized that brain areas involved in the interaction of gesture and speech would show greater activation to gesture-supported sentences as compared to sentences accompanied by a meaningless grooming movement. The main results are that, when contrasted with grooming, both types of gestures (dominant and subordinate) activated an array of brain regions consisting of the left posterior superior temporal sulcus (STS), the inferior parietal lobule bilaterally and the ventral precentral sulcus bilaterally. Given the crucial role of the STS in audiovisual integration processes, this activation might reflect the interaction between the meaning of gesture and the ambiguous sentence. The activations in inferior frontal and inferior parietal regions may reflect a mechanism of determining the goal of co-speech hand movements through an observation-execution matching process.
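The contrast logic described here (both gesture types versus grooming) can be written down compactly. The following is a simplified, hypothetical sketch of a voxel-wise GLM contrast; the design matrix, regressor ordering and data are invented for illustration and omit HRF convolution, motion regressors and the rest of a real fMRI pipeline.

```python
import numpy as np

def glm_contrast(bold, design, contrast):
    """Fit a voxel-wise GLM and return one contrast estimate per voxel.

    bold     : (n_timepoints, n_voxels) BOLD time series
    design   : (n_timepoints, n_regressors) design matrix
    contrast : (n_regressors,) contrast weights
    """
    betas, *_ = np.linalg.lstsq(design, bold, rcond=None)
    return contrast @ betas

# Hypothetical design: regressors for dominant-gesture, subordinate-gesture and
# grooming trials (columns 0-2) plus a constant term (column 3).
rng = np.random.default_rng(0)
design = np.column_stack([rng.random((200, 3)), np.ones(200)])
bold = design @ rng.normal(size=(4, 5_000)) + rng.normal(size=(200, 5_000))

# "Both gesture types > grooming": weight the two gesture regressors against
# the grooming regressor; the resulting map would then be thresholded.
gesture_vs_grooming = glm_contrast(bold, design, np.array([0.5, 0.5, -1.0, 0.0]))
print(gesture_vs_grooming.shape)
```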


NeuroImage | 2010

Integration of iconic gestures and speech in left superior temporal areas boosts speech comprehension under adverse listening conditions

Henning Holle; Jonas Obleser; Shirley-Ann Rueschemeyer; Thomas C. Gunter

Iconic gestures are spontaneous hand movements that illustrate certain contents of speech and, as such, are an important part of face-to-face communication. This experiment targets the brain bases of how iconic gestures and speech are integrated during comprehension. Areas of integration were identified on the basis of two classic properties of multimodal integration, bimodal enhancement and inverse effectiveness (i.e., greater enhancement for unimodally least effective stimuli). Participants underwent fMRI while being presented with videos of gesture-supported sentences as well as their unimodal components, which allowed us to identify areas showing bimodal enhancement. Additionally, we manipulated the signal-to-noise ratio of speech (either moderate or good) to probe for integration areas exhibiting the inverse effectiveness property. Bimodal enhancement was found at the posterior end of the superior temporal sulcus and adjacent superior temporal gyrus (pSTS/STG) in both hemispheres, indicating that the integration of iconic gestures and speech takes place in these areas. Furthermore, we found that the left pSTS/STG specifically showed a pattern of inverse effectiveness, i.e., the neural enhancement for bimodal stimulation was greater under adverse listening conditions. This indicates that activity in this area is boosted when an iconic gesture accompanies an utterance that is otherwise difficult to comprehend. The neural response paralleled the observed behavioral data. The present data extend results from previous gesture-speech integration studies in showing that pSTS/STG plays a key role in the facilitation of speech comprehension through simultaneous gestural input.
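The two integration criteria named in the abstract translate directly into simple comparisons. The sketch below uses one common operationalization (bimodal response exceeding the stronger unimodal response, and a larger bimodal gain under the poorer signal-to-noise ratio); the response values are synthetic, and the exact criteria used in the study may differ.

```python
import numpy as np

def bimodal_enhancement(av, a, v):
    """Max-criterion test: enhancement = bimodal response minus the stronger
    of the two unimodal responses (one value per voxel)."""
    return av - np.maximum(a, v)

rng = np.random.default_rng(0)
# Synthetic per-voxel responses (e.g., beta estimates) under two speech SNRs.
resp = {snr: {m: rng.random(1_000) for m in ("av", "a", "v")}
        for snr in ("moderate", "good")}

enh = {snr: bimodal_enhancement(r["av"], r["a"], r["v"]) for snr, r in resp.items()}
enhanced = (enh["moderate"] > 0) & (enh["good"] > 0)     # bimodal enhancement
inverse_effective = enh["moderate"] > enh["good"]        # inverse effectiveness
print(enhanced.mean(), (enhanced & inverse_effective).mean())
```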


Proceedings of the National Academy of Sciences of the United States of America | 2012

Neural basis of contagious itch and why some people are more prone to it

Henning Holle; Kimberley Warne; Anil K. Seth; Hugo D. Critchley; Jamie Ward

Watching someone scratch himself can induce feelings of itchiness in the perceiver. This provides a unique opportunity to characterize the neural basis of subjective experiences of itch, independent of changes in peripheral inputs. In this study, we first established that the social contagion of itch is essentially a normative response (experienced by most people), and that the degree of contagion is related to trait differences in neuroticism (i.e., the tendency to experience negative emotions), but not to empathy. Watching video clips of someone scratching (relative to control videos of tapping) activated, as indicated by functional neuroimaging, many of the neural regions linked to the physical perception of itch, including anterior insular, primary somatosensory, and prefrontal (BA44) and premotor cortices. Moreover, activity in the left BA44, BA6, and primary somatosensory cortex was correlated with subjective ratings of itchiness, and the responsivity of the left BA44 reflected individual differences in neuroticism. Our findings highlight the central neural generation of the subjective experience of somatosensory perception in the absence of somatosensory stimulation. We speculate that the habitual activation of this central “itch matrix” may give rise to psychogenic itch disorders.
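The individual-differences result is, at its core, a brain-behaviour correlation. A hedged sketch of that analysis step, with invented numbers standing in for per-participant BA44 responsivity and questionnaire-based neuroticism scores:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Invented per-participant values for illustration only.
neuroticism = rng.normal(size=30)                       # trait questionnaire score
ba44_response = 0.5 * neuroticism + rng.normal(scale=0.8, size=30)  # scratch - tap

# The reported effect has the form of a simple correlation between the two.
r, p = stats.pearsonr(neuroticism, ba44_response)
print(f"r = {r:.2f}, p = {p:.3f}")
```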


Cognitive Neuroscience | 2011

Proprioceptive drift without illusions of ownership for rotated hands in the “rubber hand illusion” paradigm

Henning Holle; Neil McLatchie; Stefanie Maurer; Jamie Ward

The rubber hand illusion is one reliable way to experimentally manipulate the experience of body ownership. However, debate continues about the necessary and sufficient conditions eliciting the illusion. We measured proprioceptive drift and the subjective experience (via questionnaire) while manipulating two variables that have been suggested to affect the intensity of the illusion. First, the rubber hand was positioned either in a posturally congruent position, or rotated by 180°. Second, either the anatomically same rubber hand was used, or an anatomically incongruent one. We found in two independent experiments that a rubber hand rotated by 180° leads to increased proprioceptive drift during synchronous visuo-tactile stroking, although it does not lead to feelings of ownership (as measured by questionnaire). This dissociation between drift and ownership suggests that proprioceptive drift is not necessarily a valid proxy for the illusion when using hands rotated by 180°.
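Proprioceptive drift is conventionally computed as the change in the judged position of the hidden hand from before to after stroking. A minimal sketch under that convention, with invented centimetre values; the study's actual measurement procedure and sample are not reproduced here:

```python
import numpy as np
from scipy import stats

def proprioceptive_drift(pre_cm, post_cm):
    """Drift = post-stroking minus pre-stroking judged hand position (cm),
    signed so that positive values indicate a shift toward the rubber hand."""
    return np.asarray(post_cm) - np.asarray(pre_cm)

# Invented data: synchronous vs. asynchronous stroking with the rotated hand.
sync_drift = proprioceptive_drift([0.5, 1.0, 0.2, 0.8], [2.1, 2.8, 1.5, 2.4])
async_drift = proprioceptive_drift([0.4, 0.9, 0.3, 0.7], [0.6, 1.1, 0.2, 0.9])
t, p = stats.ttest_rel(sync_drift, async_drift)
print(sync_drift.mean(), async_drift.mean(), p)
```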


Frontiers in Psychology | 2012

Gesture Facilitates the Syntactic Analysis of Speech

Henning Holle; Christian Obermeier; Maren Schmidt-Kassow; Angela D. Friederici; Jamie Ward; Thomas C. Gunter

Recent research suggests that the brain routinely binds together information from gesture and speech. However, most of this research focused on the integration of representational gestures with the semantic content of speech. Much less is known about how other aspects of gesture, such as emphasis, influence the interpretation of the syntactic relations in a spoken message. Here, we investigated whether beat gestures alter which syntactic structure is assigned to ambiguous spoken German sentences. The P600 component of the event-related brain potential indicated that the more complex syntactic structure is easier to process when the speaker emphasizes the subject of a sentence with a beat. Thus, a simple flick of the hand can change our interpretation of who has been doing what to whom in a spoken sentence. We conclude that gestures and speech are integrated systems. Whereas previous studies have shown that the brain effortlessly integrates semantic information from gesture and speech, our study is the first to demonstrate that this integration also occurs for syntactic information. Moreover, the effect appears to be gesture-specific and was not found for other stimuli that draw attention to certain parts of speech, such as prosodic emphasis or a moving visual stimulus with the same trajectory as the gesture. This suggests that only visual emphasis produced with a communicative intention in mind (that is, beat gestures) influences language comprehension, but not a simple visual movement lacking such an intention.
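The P600 is quantified much like the N400 sketched earlier, but as a positive-going mean amplitude in a later window (roughly 500-900 ms at centro-parietal sites). Below is a hypothetical per-participant comparison of the two beat conditions; all values are invented:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Invented P600 mean amplitudes (microvolts) for 24 participants.
no_beat = rng.normal(loc=3.0, scale=1.0, size=24)       # complex structure, no beat
with_beat = no_beat - rng.normal(loc=0.8, scale=0.5, size=24)

# A reduced P600 when the subject carries a beat indicates that the complex
# syntactic structure became easier to process.
t, p = stats.ttest_rel(no_beat, with_beat)
print(f"t = {t:.2f}, p = {p:.4f}")
```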


Consciousness and Cognition | 2011

That's not a real body: identifying stimulus qualities that modulate synaesthetic experiences of touch

Henning Holle; Michael J. Banissy; Tom F Wright; Natalie C. Bowling; Jamie Ward

Mirror-touch synaesthesia is a condition in which observing touch to another's body induces a subjective tactile sensation on the synaesthete's body. The present study explores which characteristics of the inducing stimulus modulate the synaesthetic touch experience. Fourteen mirror-touch synaesthetes watched videos depicting a touch event while indicating (i) whether the video induced a tactile sensation, (ii) on which side of their body they felt this sensation and (iii) the intensity of the experienced sensation. Results indicate that synaesthetes experience stronger tactile sensations when observing touch to real bodies, whereas observing touch to dummy bodies, pictures of bodies and disconnected dummy body parts elicited weaker sensations. These results suggest that mirror-touch synaesthesia is not entirely bottom-up driven; top-down information, such as knowledge about real and dummy body parts, also modulates the intensity of the experience.


Journal of Cognitive Neuroscience | 2011

What iconic gesture fragments reveal about gesture-speech integration: When synchrony is lost, memory can help

Christian Obermeier; Henning Holle; Thomas C. Gunter

The present series of experiments explores several issues related to gesture–speech integration and synchrony during sentence processing. To be able to more precisely manipulate gesture–speech synchrony, we used gesture fragments instead of complete gestures, thereby avoiding the usual long temporal overlap of gestures with their coexpressive speech. In a pretest, we therefore identified the minimal duration of an iconic gesture fragment needed to disambiguate a homonym (i.e., the disambiguation point). In three subsequent ERP experiments, we then investigated whether the gesture information available at the disambiguation point has immediate as well as delayed consequences on the processing of a temporarily ambiguous spoken sentence, and whether these gesture–speech integration processes are susceptible to temporal synchrony. Experiment 1, which used asynchronous stimuli as well as an explicit task, showed clear N400 effects at the homonym as well as at the target word presented further downstream, suggesting that asynchrony does not prevent integration under explicit task conditions. No such effects were found when asynchronous stimuli were presented using a more shallow task (Experiment 2). Finally, when gesture fragment and homonym were synchronous, results similar to those of Experiment 1 were found, even under shallow task conditions (Experiment 3). We conclude that when iconic gesture fragments and speech are in synchrony, their interaction is more or less automatic. When they are not, more controlled, active memory processes are necessary to be able to combine the gesture fragment and speech context in such a way that the homonym is disambiguated correctly.
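One way to locate a disambiguation point in such a pretest is to find the shortest fragment duration from which identification accuracy stays at or above a criterion. The durations, accuracies and 75% criterion below are assumptions for illustration, not the study's values:

```python
import numpy as np

def disambiguation_point(durations_ms, accuracy, criterion=0.75):
    """Shortest gesture-fragment duration from which disambiguation accuracy
    meets the criterion and keeps meeting it for all longer fragments."""
    ok = np.asarray(accuracy) >= criterion
    for i, d in enumerate(durations_ms):
        if ok[i:].all():
            return d
    return None

print(disambiguation_point([80, 160, 240, 320, 400],
                           [0.52, 0.61, 0.78, 0.83, 0.91]))   # -> 240
```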


NeuroImage | 2016

Hand gestures as visual prosody: BOLD responses to audio–visual alignment are modulated by the communicative nature of the stimuli

Emmanuel Biau; Luis Morís Fernández; Henning Holle; César Ávila; Salvador Soto-Faraco

During public addresses, speakers accompany their discourse with spontaneous hand gestures (beats) that are tightly synchronized with the prosodic contour of the discourse. It has been proposed that speech and beat gestures originate from a common underlying linguistic process whereby both speech prosody and beats serve to emphasize relevant information. We hypothesized that breaking the consistency between beats and prosody through temporal desynchronization would modulate activity of brain areas sensitive to speech-gesture integration. To this aim, we measured BOLD responses as participants watched a natural discourse in which the speaker used beat gestures. In order to identify brain areas specifically involved in processing hand gestures with communicative intention, beat synchrony was evaluated against arbitrary visual cues bearing rhythmic and spatial properties equivalent to those of the gestures. Our results revealed that left MTG and IFG were specifically sensitive to speech synchronized with beats, compared to the arbitrary vision-speech pairing. Our results suggest that listeners confer on beats a function of visual prosody, complementary to the prosodic structure of speech. We conclude that the emphasizing function of beat gestures in speech perception is instantiated through a specialized brain network sensitive to the communicative intent conveyed by a speaker with his/her hands.
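Beat-prosody synchrony, and its experimental desynchronization, can be quantified by cross-correlating hand-movement velocity with the speech amplitude envelope and reading off the lag of the peak. This is a generic alignment measure, not the study's own method, and the signals below are synthetic placeholders:

```python
import numpy as np

def peak_lag_ms(gesture_velocity, speech_envelope, sfreq=100.0):
    """Lag (ms) at which the gesture track best aligns with the speech
    envelope; 0 means the two streams are in sync."""
    g = gesture_velocity - gesture_velocity.mean()
    s = speech_envelope - speech_envelope.mean()
    xcorr = np.correlate(g, s, mode="full")
    lag = np.argmax(xcorr) - (len(s) - 1)       # lag in samples
    return 1000.0 * lag / sfreq

# Synthetic 10 s example at 100 Hz: a rhythmic envelope and a gesture track
# delayed by 200 ms, mimicking a desynchronized beat condition.
t = np.arange(0, 10, 0.01)
envelope = np.maximum(0.0, np.sin(2 * np.pi * 2 * t))
gesture = np.roll(envelope, 20)                 # 20 samples = 200 ms delay
print(peak_lag_ms(gesture, envelope))           # approximately 200 ms
```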


Neuroreport | 2010

The time course of lexical access in morphologically complex words

Henning Holle; Thomas C. Gunter; Dirk Koester

Compounding, the concatenation of words (e.g., dishwasher), is an important mechanism across many languages. This study investigated whether access of initial compound constituents occurs immediately or, alternatively, whether it is delayed until the last constituent (i.e., the head). Electroencephalogram was measured as participants listened to German two-constituent compounds. Both the initial constituent and the following head constituent could be either a word or a nonword, resulting in four experimental conditions. Results showed a larger N400 for initial nonword constituents, suggesting that lexical access was attempted before the head. Thus, this study provides direct evidence that lexical access of transparent compound constituents in German occurs immediately, and is not delayed until the compound head is encountered.

Collaboration


Dive into Henning Holle's collaborations.

Top Co-Authors

Robert Rein

German Sport University Cologne
