Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Beatrice de Gelder is active.

Publication


Featured research published by Beatrice de Gelder.


Nature Reviews Neuroscience | 2010

Neural bases of the non-conscious perception of emotional signals.

Marco Tamietto; Beatrice de Gelder

Many emotional stimuli are processed without being consciously perceived. Recent evidence indicates that subcortical structures have a substantial role in this processing. These structures are part of a phylogenetically ancient pathway that has specific functional properties and that interacts with cortical processes. There is now increasing evidence that non-consciously perceived emotional stimuli induce distinct neurophysiological changes and influence behaviour towards the consciously perceived world. Understanding the neural bases of the non-conscious perception of emotional signals will clarify the phylogenetic continuity of emotion systems across species and the integration of cortical and subcortical activity in the human brain.


Nature Reviews Neuroscience | 2006

Towards the neurobiology of emotional body language

Beatrice de Gelder

Emotional body language is a rapidly emerging research field in cognitive neuroscience. De Gelder reviews the body's role in our understanding of emotion, action and communication, and discusses similarities in the neuroanatomy and temporal dynamics of face and body perception.

People's faces show fear in many different circumstances. However, when people are terrified, as well as showing emotion, they run for cover. When we see a bodily expression of emotion, we immediately know what specific action is associated with a particular emotion, leaving little need for interpretation of the signal, as is the case for facial expressions. Research on emotional body language is rapidly emerging as a new field in cognitive and affective neuroscience. This article reviews how whole-body signals are automatically perceived and understood, and their role in emotional communication and decision-making.


Cognition & Emotion | 2000

The perception of emotions by ear and by eye

Beatrice de Gelder; Jean Vroomen

Emotions are expressed in the voice as well as on the face. As a first step to explore the question of their integration, we used a bimodal perception situation modelled after the McGurk paradigm, in which varying degrees of discordance can be created between the affects expressed in a face and in a tone of voice. Experiment 1 showed that subjects can effectively combine information from the two sources, in that identification of the emotion in the face is biased in the direction of the simultaneously presented tone of voice. Experiment 2 showed that this effect occurs also under instructions to base the judgement exclusively on the face. Experiment 3 showed the reverse effect, a bias from the emotion in the face on judgement of the emotion in the voice. These results strongly suggest the existence of mandatory bidirectional links between affect detection structures in vision and audition.


Journal of Cognitive Neuroscience | 2000

Hemispheric Asymmetries for Whole-Based and Part-Based Face Processing in the Human Fusiform Gyrus

Bruno Rossion; Laurence Dricot; Anne G. DeVolder; Jean-Michel Bodart; Marc Crommelinck; Beatrice de Gelder; Richard Zoontjes

Behavioral studies indicate a right hemisphere advantage for processing a face as a whole and a left hemisphere superiority for processing based on face features. The present PET study identifies the anatomical localization of these effects in well-defined regions of the middle fusiform gyri of both hemispheres. The right middle fusiform gyrus, previously described as a face-specific region, was found to be more activated when matching whole faces than face parts whereas this pattern of activity was reversed in the left homologous region. These lateralized differences appeared to be specific to faces since control objects processed either as wholes or parts did not induce any change of activity within these regions. This double dissociation between two modes of face processing brings new evidence regarding the lateralized localization of face individualization mechanisms in the human brain.


Journal of Experimental Psychology: Human Perception and Performance | 2000

Sound enhances visual perception: Cross-modal effects of auditory organization on vision

Jean Vroomen; Beatrice de Gelder

Six experiments demonstrated cross-modal influences from the auditory modality on the visual modality at an early level of perceptual organization. Participants had to detect a visual target in a rapidly changing sequence of visual distractors. A high tone embedded in a sequence of low tones improved detection of a synchronously presented visual target (Experiment 1), but the effect disappeared when the high tone was presented before the target (Experiment 2). Rhythmically based or order-based anticipation was unlikely to account for the effect because the improvement was unaffected by whether there was jitter (Experiment 3) or a random number of distractors between successive targets (Experiment 4). The facilitatory effect was greatly reduced when the tone was less abrupt and part of a melody (Experiments 5 and 6). These results show that perceptual organization in the auditory modality can have an effect on perceptibility in the visual modality.


Emotion | 2007

Body expressions influence recognition of emotions in the face and voice

Jan Van den Stock; Ruthger Righart; Beatrice de Gelder

The most familiar emotional signals consist of faces, voices, and whole-body expressions, but so far research on emotions expressed by the whole body is sparse. The authors investigated recognition of whole-body expressions of emotion in three experiments. In the first experiment, participants performed a body expression-matching task. Results indicate good recognition of all emotions, with fear being the hardest to recognize. In the second experiment, two alternative forced choice categorizations of the facial expression of a compound face-body stimulus were strongly influenced by the bodily expression. This effect was a function of the ambiguity of the facial expression. In the third experiment, recognition of emotional tone of voice was similarly influenced by task irrelevant emotional body expressions. Taken together, the findings illustrate the importance of emotional whole-body expressions in communication either when viewed on their own or, as is often the case in realistic circumstances, in combination with facial expressions and emotional voices.


Trends in Cognitive Sciences | 2003

Multisensory integration, perception and ecological validity

Beatrice de Gelder; Paul Bertelson

Studies of multimodal integration have relied to a large extent on conflict situations, in which two sensory modalities receive incongruent data concerning one aspect of the source. Exposure to such situations produces immediate crossmodal biases as well as longer lasting aftereffects, revealing recalibrations of data-to-percept matches. In the natural environment, such phenomena might be adaptive, by reducing the perturbing effects of factors like noise or growth-induced changes in receptor organs, and by enriching the percept. However, experimental results generalize to real life only when they reflect automatic perceptual processes, and not response strategies adopted to satisfy the particular demands of laboratory tasks. Here, we focus on this issue and review ways of addressing it that have been developed recently.


Neuroreport | 2004

The neural correlates of perceiving human bodies: an ERP study on the body-inversion effect.

Jeroen J. Stekelenburg; Beatrice de Gelder

The present study investigated the neural correlates of perceiving human bodies. Focussing on the N170 as an index of structural encoding, we recorded event-related potentials (ERPs) to images of bodies and faces (either neutral or expressing fear) and objects, while subjects viewed the stimuli presented either upright or inverted. The N170 was enhanced and delayed to inverted bodies and faces, but not to objects. The emotional content of faces affected the left N170, the occipito-parietal P2, and the fronto-central N2, whereas body expressions affected the frontal vertex positive potential (VPP) and a sustained fronto-central negativity (300–500 ms). Our results indicate that, like faces, bodies are processed configurally, and that within each category qualitative differences are observed for emotional as opposed to neutral images.


Attention Perception & Psychophysics | 2000

The ventriloquist effect does not depend on the direction of deliberate visual attention

Paul Bertelson; Jean Vroomen; Beatrice de Gelder; Jon Driver

It is well known that discrepancies in the location of synchronized auditory and visual events can lead to mislocalizations of the auditory source, so-called ventriloquism. In two experiments, we tested whether such cross-modal influences on auditory localization depend on deliberate visual attention to the biasing visual event. In Experiment 1, subjects pointed to the apparent source of sounds in the presence or absence of a synchronous peripheral flash. They also monitored for target visual events, either at the location of the peripheral flash or in a central location. Auditory localization was attracted toward the synchronous peripheral flash, but this was unaffected by where deliberate visual attention was directed in the monitoring task. In Experiment 2, bilateral flashes were presented in synchrony with each sound, to provide competing visual attractors. When these visual events were equally salient on the two sides, auditory localization was unaffected by which side subjects monitored for visual targets. When one flash was larger than the other, auditory localization was slightly but reliably attracted toward it, but again regardless of where visual monitoring was required. We conclude that ventriloquism largely reflects automatic sensory interactions, with little or no role for deliberate spatial attention.


European Journal of Cognitive Psychology | 1991

Face recognition and lip-reading in autism

Beatrice de Gelder; Jean Vroomen; Lucienne van der Heide

Autistic children individually matched for mental age with normal subjects were tested on memory for unfamiliar faces and on lip-reading ability. The results show that autistic children are poorer than controls in memory for faces but comparable to controls in lip-reading. Autistic children show little influence on their auditory speech perception from visual speech. The results are discussed in relation to Bruce and Young's (1986) model of face recognition. The independence between facial speech and memory for faces is in accordance with this model but is only observed in autistic subjects.

Collaboration


Dive into Beatrice de Gelder's collaborations.

Top Co-Authors

Jan Van den Stock
Katholieke Universiteit Leuven

Paul Bertelson
Université libre de Bruxelles

Julie Grèzes
École Normale Supérieure

Alan J. Pegna
University of Queensland