
Publication


Featured research published by Bernard M. C. Stienen.


Current Biology | 2008

Intact navigation skills after bilateral loss of striate cortex

Beatrice de Gelder; Marco Tamietto; Geert J. M. van Boxtel; Rainer Goebel; Arash Sahraie; Jan Van den Stock; Bernard M. C. Stienen; Lawrence Weiskrantz; Alan J. Pegna

A patient with bilateral damage to primary visual (striate) cortex has provided the opportunity to assess just what visual capacities are possible in the absence of geniculo-striate pathways. Patient TN suffered two strokes in succession, lesioning each visual cortex in turn and causing clinical blindness over his whole visual field. Functional and anatomical brain imaging assessments showed that TN completely lacks any functional visual cortex. We report here that, among other retained abilities, he can successfully navigate down the extent of a long corridor in which various barriers were placed. A video recording shows him skillfully avoiding and turning around the blockages. This demonstrates that extra-striate pathways in humans can sustain sophisticated visuo-spatial skills in the absence of perceptual awareness, akin to what has been previously reported in monkeys. It remains to be determined which of the several extra-striate pathways account for TN's intact navigation skills.


Emotion | 2011

Fear Detection and Visual Awareness in Perceiving Bodily Expressions

Bernard M. C. Stienen; Beatrice de Gelder

Many research reports have concluded that emotional information can be processed without observers being aware of it. The case for perception without awareness has almost always been made with the use of facial expressions. In view of the similarities between facial and bodily expressions for rapid perception and communication of emotional signals, we conjectured that perception of bodily expressions may also not necessarily require visual awareness. Our study investigates the role of visual awareness in the perception of bodily expressions using a backward masking technique in combination with confidence ratings on a trial-by-trial basis. In three separate experiments, participants had to detect masked fearful, angry, and happy bodily expressions among masked neutral bodily actions as distractors, and subsequently indicate their confidence. The onset between target and mask (stimulus onset asynchrony, SOA) varied from -50 to +133 ms. Sensitivity measurements (d-prime) as well as the confidence of the participants showed that the bodies could be detected reliably in all SOA conditions. Importantly, a lack of covariance was observed between the objective and subjective measurements when the participants had to detect fearful bodily expressions, yet this was not the case when participants had to detect happy or angry bodily expressions.
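For readers unfamiliar with the two measures contrasted above, the sketch below shows how the objective measure (d-prime) and the subjective measure (mean confidence) could be computed per SOA condition in such a backward-masking detection task. It is illustrative only, not the authors' analysis code; the trial fields "soa", "target_present", "response", and "confidence" are assumptions.

```python
# Illustrative sketch (not the authors' analysis code): per-SOA sensitivity (d-prime)
# and mean confidence in a backward-masking detection task.
import numpy as np
from scipy.stats import norm

def dprime(hits, misses, false_alarms, correct_rejections):
    """Signal-detection sensitivity: d' = z(hit rate) - z(false-alarm rate)."""
    # Log-linear correction avoids infinite z-scores when a rate is exactly 0 or 1.
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

def summarize_by_soa(trials):
    """trials: list of dicts with assumed keys 'soa', 'target_present', 'response', 'confidence'."""
    out = {}
    for soa in sorted({t["soa"] for t in trials}):
        sub = [t for t in trials if t["soa"] == soa]
        hits = sum(t["target_present"] and t["response"] for t in sub)
        misses = sum(t["target_present"] and not t["response"] for t in sub)
        fas = sum((not t["target_present"]) and t["response"] for t in sub)
        crs = sum((not t["target_present"]) and not t["response"] for t in sub)
        out[soa] = {
            "dprime": dprime(hits, misses, fas, crs),
            "mean_confidence": float(np.mean([t["confidence"] for t in sub])),
        }
    return out
```

A dissociation like the one reported for fearful bodies would show up here as d-prime staying roughly constant across SOA conditions while mean confidence does not track it.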


PLOS ONE | 2011

Emotional Voice and Emotional Body Postures Influence Each Other Independently of Visual Awareness

Bernard M. C. Stienen; Akihiro Tanaka; Beatrice de Gelder

Multisensory integration may occur independently of visual attention, as previously shown with compound face-voice stimuli. We investigated in two experiments whether the perception of whole-body expressions and the perception of voices influence each other when observers are not aware of seeing the bodily expression. In the first experiment, participants categorized masked happy and angry bodily expressions while ignoring congruent or incongruent emotional voices. The onset between target and mask varied from −50 to +133 ms. Results show that the congruency between the emotion in the voice and the bodily expression influences audiovisual perception independently of the visibility of the stimuli. In the second experiment, participants categorized the emotional voices combined with masked bodily expressions as fearful or happy. This experiment showed that bodily expressions presented outside visual awareness still influence prosody perception. Our experiments show that audiovisual integration between bodily expressions and affective prosody can take place outside of, and independently of, visual awareness.


Cortex | 2015

Virtual lesion of right posterior superior temporal sulcus modulates conscious visual perception of fearful expressions in faces and bodies

Matteo Candidi; Bernard M. C. Stienen; Salvatore Maria Aglioti; Beatrice de Gelder

The posterior Superior Temporal Sulcus (pSTS) represents a central hub in the complex cerebral network for person perception and emotion recognition, as also suggested by its heavy connections with face- and body-specific cortical structures (e.g., the fusiform face area, FFA, and the extrastriate body area, EBA) and subcortical structures (e.g., the amygdala). Information on whether pSTS is causally involved in sustaining conscious visual perception of emotions expressed by faces and bodies is lacking. We explored this issue by combining a binocular rivalry procedure (where emotional and neutral face and body postures rivaled with house images) with off-line, 1-Hz repetitive transcranial magnetic stimulation (rTMS). We found that temporary inhibition of the right pSTS reduced perceptual dominance of fearful faces and increased perceptual dominance of fearful bodies, while leaving unaffected the perception of neutral face and body images. Inhibition of the vertex had no effect on conscious visual perception of neutral or emotional face or body stimuli. Thus, the right pSTS plays a causal role in shortening conscious vision of fearful faces and in prolonging conscious vision of fearful bodies. These results suggest that pSTS selectively modulates the activity of segregated networks involved in the conscious visual perception of emotional faces or bodies. We speculate that the opposite role of the right pSTS in the conscious perception of fearful faces and bodies may be explained by the different connections that this region entertains with face- and body-selective visual areas as well as with the amygdalae and premotor regions.


Frontiers in Human Neuroscience | 2011

Fear modulates visual awareness similarly for facial and bodily expressions.

Bernard M. C. Stienen; Beatrice de Gelder

Background: Social interaction depends on a multitude of signals carrying information about the emotional state of others, but the relative importance of facial and bodily signals is still poorly understood. Past research has focused on the perception of facial expressions, while perception of whole-body signals has only been studied recently. In order to better understand the relative contribution of affective signals from the face alone or from the whole body, we performed two experiments using binocular rivalry. This method is well suited to contrasting two classes of stimuli, testing our processing sensitivity to either stimulus, and addressing the question of how emotion modulates this sensitivity. Method: In the first experiment we directly contrasted fearful, angry, and neutral bodies and faces. We always presented bodies in one eye and faces in the other, simultaneously for 60 s, and asked participants to report what they perceived. In the second experiment we focused specifically on the role of fearful expressions of faces and bodies. Results: Taken together, the two experiments show that there is no clear bias toward either the face or the body when the expressions of the body and face are neutral or angry. However, the perceptual dominance in favor of either the face or the body is a function of the stimulus class expressing fear.
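As an illustration of how perceptual dominance is commonly quantified in a rivalry trial like the 60-s presentations described above, here is a minimal sketch. It assumes a hypothetical report format of timestamped percept labels and is not the authors' code.

```python
# Illustrative sketch (hypothetical data format, not the authors' code): perceptual
# dominance as the fraction of the 60-s viewing period each percept was reported.
def dominance_fractions(reports, trial_duration=60.0):
    """reports: list of (onset_seconds, percept) tuples, percept in {'face', 'body', 'mixed'},
    ordered by onset; each percept lasts until the next report or the end of the trial."""
    totals = {"face": 0.0, "body": 0.0, "mixed": 0.0}
    for i, (onset, percept) in enumerate(reports):
        offset = reports[i + 1][0] if i + 1 < len(reports) else trial_duration
        totals[percept] += offset - onset
    return {p: t / trial_duration for p, t in totals.items()}

# Example: the face percept dominates 60% of a trial.
print(dominance_fractions([(0.0, "face"), (36.0, "body"), (54.0, "mixed")]))
```

Comparing these fractions between emotion conditions is one straightforward way to express a bias toward the face or the body.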


Archive | 2013

Emotions by Ear and by Eye

Beatrice de Gelder; Bernard M. C. Stienen; Jan Van den Stock

Multisensory integration must stand out among the fields of research that have witnessed one of the most impressive explosions of interest this last decade, at least as measured by published papers and meetings. From a highly specialized niche occupation, multisensory research has become a mainstream scientific interest in a very short time span. One of these new areas of multisensory research is emotion. Since our first exploration of this phenomenon [de Gelder, Bocker, Tuomainen, Hensen, & Vroomen, Neuroscience Letters, 260(2):133–136, 1999], a number of studies have appeared, using a wide variety of behavioral, neuropsychological, and neuroimaging methods.


Neural Computation | 2012

A computational feedforward model predicts categorization of masked emotional body language for longer, but not for shorter, latencies

Bernard M. C. Stienen; Konrad Schindler; Beatrice de Gelder

Given the presence of massive feedback loops in brain networks, it is difficult to disentangle the contribution of feedforward and feedback processing to the recognition of visual stimuli, in this case, of emotional body expressions. The aim of the work presented in this letter is to shed light on how well feedforward processing explains rapid categorization of this important class of stimuli. By means of parametric masking, it may be possible to control the contribution of feedback activity in human participants. A close comparison is presented between human recognition performance and the performance of a computational neural model that exclusively modeled feedforward processing and was engineered to fulfill the computational requirements of recognition. Results show that the longer the stimulus onset asynchrony (SOA), the closer the performance of the human participants was to the values predicted by the model, with an optimum at an SOA of 100 ms. At short SOA latencies, human performance deteriorated, but the categorization of the emotional expressions was still above baseline. The data suggest that, although feedback arising from inferotemporal cortex is theoretically likely to be blocked when the SOA is 100 ms, human participants still seem to rely on more local visual feedback processing to equal the model's performance.
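A minimal sketch of the model-versus-human comparison described above. All numbers are placeholders (not data from the study), the per_soa_gap helper is hypothetical, and the published feedforward model itself is not implemented here.

```python
# Illustrative sketch (placeholder values, not study data; not the published model):
# compare human categorization accuracy with a feedforward-only model's predicted
# accuracy at each stimulus onset asynchrony to see where the two converge.
def per_soa_gap(soas_ms, human_accuracy, model_accuracy):
    """Return the absolute human-model accuracy difference for each SOA condition."""
    return {soa: abs(h - m) for soa, h, m in zip(soas_ms, human_accuracy, model_accuracy)}

# Placeholder values for illustration only.
soas_ms = [-50, 0, 33, 67, 100, 133]
human_accuracy = [0.60, 0.65, 0.70, 0.78, 0.84, 0.83]
model_accuracy = [0.85] * 6  # assumed: the model's prediction does not vary with the mask's SOA

for soa, gap in per_soa_gap(soas_ms, human_accuracy, model_accuracy).items():
    print(f"SOA {soa:+4d} ms: |human - model| = {gap:.2f}")
```

Under the paper's account, this gap should shrink as the SOA grows, with human performance approaching the feedforward prediction around an SOA of 100 ms.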


Proceedings of the 7th International Conference on Methods and Techniques in Behavioral Research | 2010

Relative affective blindsight for fearful bodily expressions

Bernard M. C. Stienen; Beatrice de Gelder

Nonconscious affective perception has repeatedly been shown for facial expressions, but not for bodily expressions, even though the latter are highly salient and known to influence our behavior towards others. Lau & Passingham [7] found a case of relative blindsight using a parametric masking design. We used a comparable approach to look for a relative case of blindsight for affective information, in which the affective information can be processed independently of visual awareness. Participants had to detect masked fearful bodily expressions among masked neutral bodily actions as distractors, and subsequently indicate their confidence. The onset between target and mask (stimulus onset asynchrony, SOA) varied from -50 to +133 milliseconds. D-prime as well as the confidence ratings showed that the bodies could be detected reliably in all SOA conditions. Importantly, we found a phenomenon which we coined relative affective blindsight, defined as two SOA conditions showing the same d-prime values while the confidence ratings differed.
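The stated definition of relative affective blindsight can be expressed operationally as a check over per-SOA summaries of d-prime and mean confidence. The sketch below is illustrative only; the tolerance and confidence-gap thresholds are assumptions, not values from the study.

```python
# Illustrative sketch (assumed thresholds, not the authors' analysis): flag pairs of SOA
# conditions that fit the stated definition of relative affective blindsight, i.e.
# near-identical d-prime but clearly different mean confidence.
def relative_blindsight_pairs(by_soa, dprime_tol=0.1, confidence_gap=0.5):
    """by_soa: dict mapping SOA (ms) -> {'dprime': float, 'mean_confidence': float}."""
    soas = sorted(by_soa)
    pairs = []
    for i, a in enumerate(soas):
        for b in soas[i + 1:]:
            same_sensitivity = abs(by_soa[a]["dprime"] - by_soa[b]["dprime"]) <= dprime_tol
            different_confidence = abs(
                by_soa[a]["mean_confidence"] - by_soa[b]["mean_confidence"]
            ) >= confidence_gap
            if same_sensitivity and different_confidence:
                pairs.append((a, b))
    return pairs
```

In practice, the equivalence of d-prime and the difference in confidence would be established with statistical tests rather than fixed cutoffs; the fixed thresholds here only make the logic of the definition explicit.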


The Journal of Neuroscience | 2011

Event-Related Repetitive Transcranial Magnetic Stimulation of Posterior Superior Temporal Sulcus Improves the Detection of Threatening Postural Changes in Human Bodies

Matteo Candidi; Bernard M. C. Stienen; Salvatore Maria Aglioti; Beatrice de Gelder


F1000Research | 2011

A computational feed-forward model predicts categorization of masked emotional body language for longer, but not for shorter latencies

Bernard M. C. Stienen; Konrad Schindler; Beatrice de Gelder

Collaboration


Dive into Bernard M. C. Stienen's collaborations.

Top Co-Authors

Matteo Candidi

Sapienza University of Rome

Jan Van den Stock

Katholieke Universiteit Leuven
