Nele Dael
University of Lausanne
Publications
Featured research published by Nele Dael.
Emotion | 2012
Nele Dael; Marcello Mortillaro; Klaus R. Scherer
Emotion communication research strongly focuses on the face and voice as expressive modalities, leaving the rest of the body relatively understudied. Contrary to the early assumption that body movement only indicates emotional intensity, recent studies have shown that body movement and posture also convey emotion-specific information. However, a deeper understanding of the underlying mechanisms is hampered by a lack of production studies informed by a theoretical framework. In this research, we adopted the Body Action and Posture (BAP) coding system to examine the types and patterns of body movement that are employed by 10 professional actors to portray a set of 12 emotions. We investigated to what extent these expression patterns support explicit or implicit predictions from basic emotion theory, bidimensional theory, and componential appraisal theory. The overall results showed partial support for the different theoretical approaches. Several patterns of body movement systematically occurred in portrayals of specific emotions, allowing emotion differentiation. Although a few emotions were prototypically expressed by one particular pattern, most emotions were variably expressed by multiple patterns, many of which can be explained as reflecting functional components of emotion such as modes of appraisal and action readiness. We conclude that further work in this largely underdeveloped area should be guided by an appropriate theoretical framework to allow a more systematic design of experiments and clear hypothesis testing.
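As a loose illustration of the kind of analysis such a coding system enables, the minimal sketch below tallies how often coded behaviour categories co-occur with each portrayed emotion; the category names and data are invented for illustration and are not the actual BAP inventory.

```python
from collections import Counter, defaultdict

# Hypothetical coded segments: (portrayed_emotion, behaviour_category).
# Category labels are illustrative, not the actual BAP inventory.
segments = [
    ("anger", "arms_akimbo"), ("anger", "forward_lean"),
    ("joy", "arms_raised"), ("joy", "arms_raised"),
    ("sadness", "head_down"), ("relief", "shoulders_drop"),
]

# Count behaviour frequencies per emotion to expose recurring patterns.
patterns = defaultdict(Counter)
for emotion, behaviour in segments:
    patterns[emotion][behaviour] += 1

for emotion, counts in patterns.items():
    print(emotion, counts.most_common(2))
```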
IEEE Transactions on Affective Computing | 2011
Donald Glowinski; Nele Dael; Antonio Camurri; Gualtiero Volpe; Marcello Mortillaro; Klaus R. Scherer
This paper presents a framework for the analysis of affective behavior starting from a reduced amount of visual information related to human upper-body movements. The main goal is to individuate a minimal representation of emotional displays based on nonverbal gesture features. The GEMEP (Geneva multimodal emotion portrayals) corpus was used to validate this framework. Twelve emotions expressed by 10 actors form the selected data set of emotion portrayals. Visual tracking of the trajectories of the head and hands was performed from a frontal and a lateral view. Postural/shape and dynamic expressive gesture features were identified and analyzed. A feature-reduction procedure was carried out, resulting in a 4D model of emotion expression that effectively classified and grouped emotions according to their valence (positive, negative) and arousal (high, low). These results show that emotionally relevant information can be detected from the dynamic qualities of gesture. The framework was implemented as software modules (plug-ins) extending the EyesWeb XMI Expressive Gesture Processing Library and is intended for user-centric, networked media applications, including future mobile devices with low computational resources and limited sensor systems.
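A minimal sketch of such a feature-reduction step, assuming a matrix of per-portrayal gesture features and using scikit-learn's PCA; the feature values are random placeholders, and the paper's actual reduction procedure is not reproduced here.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Hypothetical data: 120 portrayals (12 emotions x 10 actors),
# each described by, say, 25 postural/dynamic gesture features.
features = rng.normal(size=(120, 25))

# Reduce to a four-dimensional representation of emotion expression.
pca = PCA(n_components=4)
reduced = pca.fit_transform(features)

print(reduced.shape)                        # (120, 4)
print(pca.explained_variance_ratio_.sum())  # variance retained by 4 dims
```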
computer vision and pattern recognition | 2008
Donald Glowinski; Antonio Camurri; Gualtiero Volpe; Nele Dael; Klaus R. Scherer
This paper illustrates our recent work on the analysis of expressive gesture related to the motion of the upper body (the head and the hands) in the context of emotional portrayals performed by professional actors. We present an experiment resulting from a multidisciplinary joint effort. The experiment aims at (i) developing models and algorithms for the analysis of such expressive content and (ii) individuating which motion cues are involved in conveying the actor's expressive intentions to portray four emotions (anger, joy, relief, sadness) via a scenario approach. The paper discusses the experiment in detail with reference to related conceptual issues, developed techniques, and the obtained results.
Quarterly Journal of Experimental Psychology | 2016
Nele Dael; Marie-Noëlle Perseguers; Cynthia Marchand; Jean-Philippe Antonietti; Christine Mohr
People associate affective meaning with colour, and this may influence decisions about colours. Hue is traditionally considered the most salient descriptor of colour and colour–affect associations, although colour brightness and saturation seem to have particularly strong affective connotations. To test whether colour choices can be driven by emotion, we investigated whether and how colour hue, brightness, and saturation are systematically associated with bodily expressions of positive (joy) and negative (fear) emotions. Twenty-five non-colour-blind participants viewed videos of these expressions and selected for each video the most appropriate colour using colour sliders providing values for hue, brightness, and saturation. The overall colour choices were congruent with the expressed emotion—that is, participants selected brighter and more saturated colours for joy expressions than for fear expressions. Also, colours along the red–yellow spectrum were deemed more appropriate for joy expressions and cyan–bluish hues for fear expressions. The current study adds further support to the role of emotion in colour choices by (a) showing that emotional information is spontaneously used in an unconstrained choice setting, (b) extending to ecologically valid stimuli occurring in everyday encounters (dressed bodies), and (c) suggesting that all colour parameters are likely to be important when processing affective nonverbal person information, though not independently from each other.
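A minimal sketch of the kind of comparison this design affords, with invented slider values on 0-100 scales; the variable names and data are illustrative, not the study's materials.

```python
from statistics import mean

# Hypothetical per-video colour choices: (emotion, brightness, saturation),
# with brightness and saturation on 0-100 slider scales.
choices = [
    ("joy", 82, 75), ("joy", 90, 68), ("joy", 77, 80),
    ("fear", 35, 30), ("fear", 42, 25), ("fear", 28, 38),
]

# Compare mean brightness and saturation between the two expressed emotions.
for emotion in ("joy", "fear"):
    b = mean(c[1] for c in choices if c[0] == emotion)
    s = mean(c[2] for c in choices if c[0] == emotion)
    print(f"{emotion}: mean brightness {b:.1f}, mean saturation {s:.1f}")
```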
PLOS ONE | 2016
Domicele Jonauskaite; Christine Mohr; Jean-Philippe Antonietti; Peter Mark Spiers; Betty Althaus; Selin Anil; Nele Dael
Humans like some colours and dislike others, but which particular colours and why remains to be understood. Empirical studies on colour preferences have generally targeted the most preferred colours, but rarely the least preferred (disliked) colours. In addition, findings are often based on general colour preferences, leaving open the question of whether results generalise to specific objects. Here, 88 participants selected the colours they preferred most and least for three context conditions (general, interior walls, t-shirt) using a high-precision colour picker. Participants also indicated whether they associated their colour choice with a valenced object or concept. The chosen colours varied widely between individuals and contexts, and so did the reasons for the choices. Consistent patterns also emerged: compared to least preferred colours, most preferred colours in general were more chromatic, while for walls they were lighter and for t-shirts they were darker and less chromatic. This means that general colour preferences cannot explain object-specific colour preferences. Measures of the selection process further revealed that, compared to most preferred colours, least preferred colours were chosen more quickly and were less often linked to valenced objects or concepts. The high intra- and inter-individual variability in this and previous reports furthers our understanding that colour preferences are determined by subjective experiences and that most and least preferred colours are not processed equally.
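"Lighter" and "more chromatic" are naturally quantified in CIELAB, where L* is lightness and chroma is the radial distance sqrt(a*^2 + b*^2). A minimal sketch of that conversion from sRGB, using the standard D65 formulas; the study's actual colour pipeline may differ.

```python
import math

def srgb_to_lab(r, g, b):
    """Convert sRGB in [0, 1] to CIELAB (D65 white point)."""
    # Undo the sRGB transfer function.
    lin = [c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
           for c in (r, g, b)]
    # Linear RGB -> XYZ (sRGB/D65 matrix).
    x = 0.4124 * lin[0] + 0.3576 * lin[1] + 0.1805 * lin[2]
    y = 0.2126 * lin[0] + 0.7152 * lin[1] + 0.0722 * lin[2]
    z = 0.0193 * lin[0] + 0.1192 * lin[1] + 0.9505 * lin[2]
    # XYZ -> Lab.
    def f(t):
        return t ** (1 / 3) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29
    fx, fy, fz = f(x / 0.95047), f(y / 1.0), f(z / 1.08883)
    L = 116 * fy - 16
    a = 500 * (fx - fy)
    b_ = 200 * (fy - fz)
    return L, a, b_

L, a, b = srgb_to_lab(0.8, 0.3, 0.3)  # a reddish pick
print(f"lightness L* = {L:.1f}, chroma = {math.hypot(a, b):.1f}")
```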
Frontiers in Psychology | 2013
Nele Dael; Guillaume Sierro; Christine Mohr
The literature on developmental synesthesia has covered numerous sensory combinations, with surprisingly few reports on synesthesias involving affect. On the one hand, emotion, or more broadly affect, might be of minor importance to the synesthetic experience (e.g., Sinke et al., 2012). On the other hand, predictions on how affect could be relevant to the synesthetic experience remain to be formulated, in particular predictions driven by emotion theories. In this theoretical paper, we hypothesize, a priori, that studies on synesthesia involving affect will observe the following. Firstly, the synesthetic experience is not merely about discrete emotion processing or overall valence (positive, negative) but is determined by, or even altered through, cognitive appraisal processes. Secondly, the synesthetic experience changes temporarily on a quantitative level according to (i) the affective appraisal of the inducing stimulus or (ii) the current affective state of the individual. These hypotheses are inferred from previous theoretical and empirical accounts of synesthesia (including the few examples involving affect), different emotion theories, crossmodal processing accounts in synesthetes and non-synesthetes, and the presumed stability of the synesthetic experience. We hope that the current review will succeed in launching a new series of studies on “affective synesthesias.” We particularly hope that such studies will apply the same creativity in experimental paradigms as we have seen and still see in the assessment and evaluation of “traditional” synesthesias.
affective computing and intelligent interaction | 2015
Donald Glowinski; Marcello Mortillaro; Klaus R. Scherer; Nele Dael; Gualtiero Volpe; Antonio Camurri
How can affective information be decoded efficiently when computational resources and sensor systems are limited? This paper presents a framework for the analysis of affective behavior starting from a reduced amount of visual information related to human upper-body movements. The main goal is to individuate a minimal representation of emotional displays based on non-verbal gesture features. The GEMEP (Geneva multimodal emotion portrayals) corpus was used to validate this framework. Twelve emotions expressed by ten actors form the selected data set of emotion portrayals. Visual tracking of the trajectories of the head and hands was performed from a frontal and a lateral view. Postural/shape and dynamic expressive gesture features were identified and analyzed. A feature-reduction procedure was carried out, resulting in a four-dimensional model of emotion expression that effectively classified and grouped emotions according to their valence (positive, negative) and arousal (high, low). These results show that emotionally relevant information can be detected from the dynamic qualities of gesture. The framework was implemented as software modules (plug-ins) extending the EyesWeb XMI Expressive Gesture Processing Library and was tested as a component for a multimodal search engine in collaboration with Google within the EU-ICT I-SEARCH project.
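A minimal sketch of how a reduced representation could be grouped along valence and arousal, here with a simple nearest-centroid rule over hypothetical 4D feature vectors; the data are random placeholders and the paper's actual classification method is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical reduced 4D gesture features, one row per portrayal, with
# known valence/arousal quadrant labels for a training set.
train_x = rng.normal(size=(80, 4))
train_y = rng.integers(0, 4, size=80)  # 0..3: (val+,ar+), (val+,ar-), (val-,ar+), (val-,ar-)

# Nearest-centroid rule: assign each new portrayal to the closest class mean.
centroids = np.stack([train_x[train_y == k].mean(axis=0) for k in range(4)])

def classify(x):
    return int(np.argmin(np.linalg.norm(centroids - x, axis=1)))

test = rng.normal(size=4)
print("quadrant:", classify(test))
```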
Psychological Research-psychologische Forschung | 2018
Domicele Jonauskaite; Nele Dael; C. Alejandro Parraga; Laetitia Chèvre; Alejandro García Sánchez; Christine Mohr
In 2015, a picture of a Dress (henceforth the Dress) triggered popular and scientific interest; some reported seeing the Dress in white and gold (W&G) and others in blue and black (B&B). We aimed to describe the phenomenon and investigate the role of contextualization. A few days after the Dress appeared on the Internet, we showed it to 240 students on two large screens in a classroom. Participants reported seeing the Dress in B&B (48%), W&G (38%), or blue and brown (B&Br; 7%). Amongst numerous socio-demographic variables, we observed only that W&G viewers were the most likely to have always seen the Dress as W&G. In the laboratory, we tested how much contextual information is necessary for the phenomenon to occur. Fifty-seven participants selected the colours that most precisely matched the predominant colours of parts of the Dress or of the full Dress. We presented, in this order, small squares (a), vertical strips (b), and the full Dress (c). We found that (1) B&B, B&Br, and W&G viewers selected colours differing in lightness and chroma for the contextualized images only (conditions b and c) and in hue for the fully contextualized condition only (c), and (2) B&B viewers selected colours most closely matching the displayed colours of the Dress. Thus, the Dress phenomenon emerges due to inter-individual differences in subjectively perceived lightness, chroma, and hue, at least when all aspects of the picture need to be integrated. Our results support the previous conclusions that contextual information is key to colour perception; it will be important to understand how this actually happens.
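Comparing matched hues across viewer groups calls for circular statistics, since hue is an angle (0 and 360 degrees are the same colour). A minimal sketch of a circular mean over hypothetical hue selections in degrees; the values are invented for illustration.

```python
import math

def circular_mean_deg(hues):
    """Mean of angles in degrees, respecting wrap-around at 360."""
    s = sum(math.sin(math.radians(h)) for h in hues)
    c = sum(math.cos(math.radians(h)) for h in hues)
    return math.degrees(math.atan2(s, c)) % 360

# Hypothetical hue matches (degrees) from two viewer groups.
wg_hues = [50, 55, 48, 60]  # golden-yellowish matches
bb_hues = [220, 230, 215]   # bluish matches

print(f"W&G mean hue: {circular_mean_deg(wg_hues):.1f} deg")
print(f"B&B mean hue: {circular_mean_deg(bb_hues):.1f} deg")
```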
Journal of Nonverbal Behavior | 2012
Nele Dael; Marcello Mortillaro; Klaus R. Scherer
Perception | 2013
Nele Dael; Martyn Goudbeek; Klaus R. Scherer