Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Didier Maurice Grandjean is active.

Publication


Featured research published by Didier Maurice Grandjean.


Emotion | 2008

Emotions evoked by the sound of music: Characterization, classification, and measurement.

Marcel Zentner; Didier Maurice Grandjean; Klaus R. Scherer

One reason for the universal appeal of music lies in the emotional rewards that music offers to its listeners. But what makes these rewards so special? The authors addressed this question by progressively characterizing music-induced emotions in 4 interrelated studies. Studies 1 and 2 (n=354) were conducted to compile a list of music-relevant emotion terms and to study the frequency of both felt and perceived emotions across 5 groups of listeners with distinct music preferences. Emotional responses varied greatly according to musical genre and type of response (felt vs. perceived). Study 3 (n=801)--a field study carried out during a music festival--examined the structure of music-induced emotions via confirmatory factor analysis of emotion ratings, resulting in a 9-factorial model of music-induced emotions. Study 4 (n=238) replicated this model and found that it accounted for music-elicited emotions better than the basic emotion and dimensional emotion models. A domain-specific device to measure musically induced emotions is introduced--the Geneva Emotional Music Scale.


Neural Networks | 2005

2005 Special Issue: A systems approach to appraisal mechanisms in emotion

David Sander; Didier Maurice Grandjean; Klaus R. Scherer

While artificial neural networks are regularly employed in modeling the perception of facial and vocal emotion expression as well as in automatic expression decoding by artificial agents, this approach is yet to be extended to the modeling of emotion elicitation and differentiation. In part, this may be due to the dominance of discrete and dimensional emotion models, which have not encouraged computational modeling. This situation has changed with the advent of appraisal theories of emotion, and a number of attempts to develop rule-based models can be found in the literature. However, most of these models operate at a high level of conceptual abstraction and rarely include the underlying neural architecture. In this contribution, an appraisal-based emotion theory, the Component Process Model (CPM), is described that seems particularly suited to modeling with the help of artificial neural network approaches. This is due to its high degree of specificity in postulating underlying mechanisms, including efferent physiological and behavioral manifestations, as well as to the possibility of linking the theoretical assumptions to underlying neural architectures and dynamic processes. This paper provides a brief overview of the model, suggests constraints imposed by neural circuits, and provides examples of how the temporal unfolding of emotion can be conceptualized and experimentally tested. In addition, it is shown that the specific characteristics of emotion episodes can be profitably explored with the help of non-linear dynamic systems theory.


Nature Neuroscience | 2005

The voices of wrath: brain responses to angry prosody in meaningless speech

Didier Maurice Grandjean; David Sander; Gilles Pourtois; Sophie Schwartz; Mohamed L. Seghier; Klaus R. Scherer; Patrik Vuilleumier

We report two functional magnetic resonance imaging experiments showing enhanced responses in human middle superior temporal sulcus for angry relative to neutral prosody. This emotional enhancement was voice specific, unrelated to isolated acoustic amplitude or frequency cues in angry prosody, and distinct from any concomitant task-related attentional modulation. Attention and emotion seem to have separate effects on stimulus processing, reflecting a fundamental principle of human brain organization shared by voice and face perception.


NeuroImage | 2005

Emotion and attention interactions in social cognition: Brain regions involved in processing anger prosody

David Sander; Didier Maurice Grandjean; Gilles Pourtois; Sophie Schwartz; Mohamed L. Seghier; Klaus R. Scherer; Patrik Vuilleumier

Multiple levels of processing are thought to be involved in the appraisal of emotionally relevant events, with some processes being engaged relatively independently of attention, whereas other processes may depend on attention and current task goals or context. We conducted an event-related fMRI experiment to examine how processing angry voice prosody, an affectively and socially salient signal, is modulated by voluntary attention. To manipulate attention orthogonally to emotional prosody, we used a dichotic listening paradigm in which meaningless utterances, pronounced with either angry or neutral prosody, were presented simultaneously to both ears on each trial. In two successive blocks, participants selectively attended to either the left or right ear and performed a gender-decision on the voice heard on the target side. Our results revealed a functional dissociation between different brain areas. Whereas the right amygdala and bilateral superior temporal sulcus responded to anger prosody irrespective of whether it was heard from a to-be-attended or to-be-ignored voice, the orbitofrontal cortex and the cuneus in medial occipital cortex showed greater activation to the same emotional stimuli when the angry voice was to-be-attended rather than to-be-ignored. Furthermore, regression analyses revealed a strong correlation between orbitofrontal regions and sensitivity on a behavioral inhibition scale measuring proneness to anxiety reactions. Our results underscore the importance of emotion and attention interactions in social cognition by demonstrating that multiple levels of processing are involved in the appraisal of emotionally relevant cues in voices, and by showing a modulation of some emotional responses by both the current task-demands and individual differences.


Human Brain Mapping | 2005

Enhanced extrastriate visual response to bandpass spatial frequency filtered fearful faces: Time course and topographic evoked-potentials mapping

Gilles Pourtois; E. S. Dan; Didier Maurice Grandjean; David Sander; Patrik Vuilleumier

We compared electrical brain responses to fearful vs. neutral facial expressions in healthy volunteers while they performed an orthogonal gender decision task. Face stimuli either had a broadband spatial-frequency content, or were filtered to create either low spatial-frequency (LSF) or high spatial-frequency (HSF) faces, always overlapped with their complementary SF content in upside-down orientation to preserve the total stimulus energy. We tested the hypothesis that the coarse LSF content of faces might be responsible for an early modulation of event-related potentials (ERPs) to fearful expressions. Consistent with previous findings, we show that broadband images of fearful faces, relative to neutral faces, elicit a higher global field power at approximately 130 ms poststimulus onset, corresponding to an increased P1 component over lateral occipital electrodes, with neural sources located within the extrastriate visual cortex. Bandpass filtering of faces strongly affected the latency and amplitude of ERPs, with a suppression of the normal N170 response for both LSF and HSF faces, irrespective of expression. Critically, we found that LSF information from fearful faces, unlike HSF information, produced a right-lateralized enhancement of the lateral occipital P1, without any change in the scalp topography, relative to unfiltered (broadband) fearful faces. These results demonstrate that an early P1 response to fear expression depends on a visual pathway preferentially tuned to coarse magnocellular inputs, and can persist unchanged even when the N170 generators are disrupted by SF filtering.


ACM Multimedia | 2006

Emotion assessment: arousal evaluation using EEG's and peripheral physiological signals

Guillaume Chanel; Julien Kronegg; Didier Maurice Grandjean; Thierry Pun

The arousal dimension of human emotions is assessed from two different physiological sources: peripheral signals and electroencephalographic (EEG) signals from the brain. A complete acquisition protocol is presented to build a physiological emotional database for real participants. Arousal assessment is then formulated as a classification problem, with classes corresponding to 2 or 3 degrees of arousal. The performance of 2 classifiers has been evaluated, on peripheral signals, on EEGs, and on both. Results confirm the possibility of using EEGs to assess the arousal component of emotion, and the interest of multimodal fusion between EEGs and peripheral physiological signals.
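The classification setup described above can be illustrated with a toy sketch. This is a hypothetical illustration, not the authors' code: feature vectors (e.g. EEG band power and a peripheral measure such as heart rate) are assigned to discrete arousal classes by a simple nearest-centroid rule; the feature names and two-class labels are assumptions for the example.

```python
# Hypothetical sketch of arousal classification from physiological features.
# Each sample is a feature vector, e.g. [EEG alpha-band power, heart rate];
# classes are discrete arousal levels such as "low" and "high".

from math import dist  # Euclidean distance (Python 3.8+)

def fit_centroids(samples, labels):
    """Compute the mean feature vector (centroid) for each class label."""
    centroids = {}
    for label in set(labels):
        rows = [s for s, l in zip(samples, labels) if l == label]
        centroids[label] = [sum(col) / len(rows) for col in zip(*rows)]
    return centroids

def predict(centroids, x):
    """Assign x to the class whose centroid is nearest."""
    return min(centroids, key=lambda label: dist(centroids[label], x))
```

In practice one would extract band-power features per channel and evaluate with cross-validation; the nearest-centroid rule here just stands in for whichever classifiers were compared.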


PLOS ONE | 2008

Individual Attachment Style Modulates Human Amygdala and Striatum Activation during Social Appraisal

Pascal Vrticka; Frédéric Andersson; Didier Maurice Grandjean; David Sander; Patrik Vuilleumier

Adult attachment style refers to individual personality traits that strongly influence emotional bonds and reactions to social partners. Behavioral research has shown that adult attachment style reflects profound differences in sensitivity to social signals of support or conflict, but the neural substrates underlying such differences remain unsettled. Using functional magnetic resonance imaging (fMRI), we examined how the three classic prototypes of attachment style (secure, avoidant, anxious) modulate brain responses to facial expressions conveying either positive or negative feedback about task performance (either supportive or hostile) in a social game context. Activation of striatum and ventral tegmental area was enhanced to positive feedback signaled by a smiling face, but this was reduced in participants with avoidant attachment, indicating relative impassiveness to social reward. Conversely, a left amygdala response was evoked by angry faces associated with negative feedback, and correlated positively with anxious attachment, suggesting an increased sensitivity to social punishment. Secure attachment showed mirror effects in striatum and amygdala, but no other specific correlate. These results reveal a critical role for brain systems implicated in reward and threat processing in the biological underpinnings of adult attachment style, and provide new support for psychological models that have postulated two separate affective dimensions to explain these individual differences, centered on the ventral striatum and amygdala circuits, respectively. These findings also demonstrate that brain responses to face expressions are not driven by facial features alone but determined by the personal significance of expressions in current social context. By linking fundamental psychosocial dimensions of adult attachment with brain function, our results not only corroborate their biological bases but also help to understand their impact on behavior.


Journal of Neuroscience Methods | 2007

Spatial frequencies or emotional effects? A systematic measure of spatial frequencies for IAPS pictures by a discrete wavelet analysis

Sylvain Delplanque; Karim Babacar Joseph Ndiaye; Klaus R. Scherer; Didier Maurice Grandjean

The influence of the emotional attributes of a visual scene on early perception processes remains a key question in contemporary affective neurosciences. The International Affective Picture System (IAPS; Lang et al., 2005) was developed to provide a set of standardized stimuli for experimental investigations of emotional processes. These stimuli have been widely used in brain activity investigations to study the influence of valence and/or arousal on visual processing. However, visual perception is strongly influenced by the physical properties of the images shown, especially their spatial frequency content, an aspect that has been largely neglected so far. In this study, we examined the complete IAPS set with a discrete wavelet transform to highlight relations between the energy in different spatial frequency bands and the emotional features of the pictures. Our results showed that these associations are weak when the complete dataset is considered, but for selected subsets of pictures, clear differences are present in both affective and spatial frequency content. The IAPS remains a powerful tool to explore emotional processing, but we strongly suggest that researchers use subsets of images that are controlled for the energy of their spatial frequencies when investigating emotional influences on visual processing.
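The core measurement in this kind of analysis can be sketched in a few lines. This is a minimal illustration, not the authors' pipeline: a one-level 2-D Haar wavelet transform (the simplest discrete wavelet) splits a grayscale image into a low-frequency approximation subband and three high-frequency detail subbands, and the sum of squared coefficients gives the energy per spatial-frequency band.

```python
# Minimal sketch: one-level 2-D Haar DWT of a grayscale image (list of lists,
# even dimensions), returning the energy of each subband. LL = coarse
# approximation (low spatial frequencies); LH, HL, HH = detail subbands
# (high spatial frequencies).

def haar_step(seq):
    """One 1-D Haar step: pairwise averages (low) and differences (high)."""
    low = [(seq[2 * i] + seq[2 * i + 1]) / 2 for i in range(len(seq) // 2)]
    high = [(seq[2 * i] - seq[2 * i + 1]) / 2 for i in range(len(seq) // 2)]
    return low, high

def haar2d_energies(img):
    """Energies of the LL, LH, HL, HH subbands of a one-level 2-D Haar DWT."""
    # Transform each row into its low- and high-frequency halves.
    lows, highs = zip(*(haar_step(row) for row in img))

    def cols(mat):  # transpose a list-of-lists matrix
        return [list(c) for c in zip(*mat)]

    def transform_cols(mat):
        l, h = zip(*(haar_step(col) for col in cols(mat)))
        return cols(l), cols(h)  # back to row-major order

    ll, lh = transform_cols(list(lows))
    hl, hh = transform_cols(list(highs))
    energy = lambda m: sum(x * x for row in m for x in row)
    return {"LL": energy(ll), "LH": energy(lh),
            "HL": energy(hl), "HH": energy(hh)}
```

A uniform image puts all its energy in LL, while a pixel-level checkerboard puts its detail energy in HH; comparing such band energies across picture subsets is the kind of control the abstract recommends. A real analysis would use a multilevel transform from a wavelet library rather than this hand-rolled Haar step.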


European Journal of Neuroscience | 2004

Dissociable roles of the human somatosensory and superior temporal cortices for processing social face signals

Gilles Pourtois; David Sander; Michael Andres; Didier Maurice Grandjean; Lionel Reveret; Etienne Olivier; Patrik Vuilleumier

Faces are multi‐dimensional stimuli bearing important social signals, such as gaze direction and emotion expression. To test whether perception of these two facial attributes recruits distinct cortical areas within the right hemisphere, we used single‐pulse transcranial magnetic stimulation (TMS) in healthy volunteers while they performed two different tasks on the same face stimuli. In each task, two successive faces were presented with varying eye‐gaze directions and emotional expressions, separated by a short interval of random duration. TMS was applied over either the right somatosensory cortex or the right superior lateral temporal cortex, 100 or 200 ms after presentation of the second face stimulus. Participants performed a speeded matching task on the second face during one of two possible conditions, requiring judgements about either gaze direction or emotion expression (same/different as the first face). Our results reveal a significant task–stimulation site interaction, indicating a selective TMS‐related interference following stimulations of somatosensory cortex during the emotional expression task. Conversely, TMS of the superior lateral temporal cortex selectively interfered with the gaze direction task. We also found that the interference effect was specific to the stimulus content in each condition, affecting judgements of gaze shifts (not static eye positions) with TMS over the right superior temporal cortex, and judgements of fearful expressions (not happy expressions) with TMS over the right somatosensory cortex. These results provide for the first time a double dissociation in normal subjects during social face recognition, due to transient disruption of non‐overlapping brain regions. The present study supports a critical role of the somatosensory and superior lateral temporal regions in the perception of fear expression and gaze shift in seen faces, respectively.


Consciousness and Cognition | 2008

Conscious emotional experience emerges as a function of multilevel, appraisal-driven response synchronization

Didier Maurice Grandjean; David Sander; Klaus R. Scherer

In this paper we discuss the issue of the processes potentially underlying the emergence of emotional consciousness in the light of theoretical considerations and empirical evidence. First, we argue that componential emotion models, and specifically the Component Process Model (CPM), may be better able to account for the emergence of feelings than basic emotion or dimensional models. Second, we advance the hypothesis that consciousness of emotional reactions emerges when lower levels of processing are not sufficient to cope with the event and regulate the emotional process, particularly when the degree of synchronization between the components reaches a critical level and duration. Third, we review recent neuroscience evidence that bolsters our claim of the central importance of the synchronization of neuronal assemblies at different levels of processing.

Collaboration


Dive into Didier Maurice Grandjean's collaboration.
