Publication


Featured research published by Marc Mehu.


IEEE International Conference on Automatic Face & Gesture Recognition | 2011

The first facial expression recognition and analysis challenge

Michel F. Valstar; Bihan Jiang; Marc Mehu; Maja Pantic; Klaus R. Scherer

Automatic Facial Expression Recognition and Analysis, in particular FACS Action Unit (AU) detection and discrete emotion detection, has been an active topic in computer science for over two decades. Standardisation and comparability have come some way; for instance, there exist a number of commonly used facial expression databases. However, the lack of a common evaluation protocol and the lack of sufficient details to reproduce the reported individual results make it difficult to compare systems to each other. This in turn hinders the progress of the field. A periodical challenge in Facial Expression Recognition and Analysis would allow this comparison to be made in a fair manner. It would clarify how far the field has come, and would allow us to identify new goals, challenges and targets. In this paper we present the first challenge in automatic recognition of facial expressions, to be held during the IEEE conference on Face and Gesture Recognition 2011, in Santa Barbara, California. Two sub-challenges are defined: one on AU detection and another on discrete emotion detection. The paper outlines the evaluation protocol, the data used, and the results of a baseline method for the two sub-challenges.
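The abstract refers to an evaluation protocol and a baseline for the AU-detection sub-challenge. Purely as a hypothetical sketch of the kind of scoring such a protocol involves (the AU names, frame-level label format, and toy data below are assumptions, not the challenge's actual data or official metric definition), per-AU F1 over binary occurrence labels can be computed as follows:

```python
# Hypothetical sketch: per-AU F1 over frame-level binary occurrence labels.
# The AU names and label arrays are toy illustrations, not FERA 2011 data.
from typing import Dict, List

def f1_score(truth: List[int], pred: List[int]) -> float:
    """F1 for binary labels: harmonic mean of precision and recall."""
    tp = sum(1 for t, p in zip(truth, pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(truth, pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(truth, pred) if t == 1 and p == 0)
    if tp == 0:
        return 0.0  # no true positives: precision/recall are zero or undefined
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

def score_aus(truth: Dict[str, List[int]], pred: Dict[str, List[int]]) -> Dict[str, float]:
    """Per-AU F1 across all annotated frames."""
    return {au: f1_score(truth[au], pred[au]) for au in truth}

# Toy example: two AUs, five frames each.
truth = {"AU1": [1, 0, 1, 1, 0], "AU12": [0, 0, 1, 1, 1]}
pred = {"AU1": [1, 0, 0, 1, 0], "AU12": [0, 1, 1, 1, 1]}
print(score_aus(truth, pred))
```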


Systems, Man, and Cybernetics | 2012

Meta-Analysis of the First Facial Expression Recognition Challenge

Michel F. Valstar; Marc Mehu; Bihan Jiang; Maja Pantic; Klaus R. Scherer

Automatic facial expression recognition has been an active topic in computer science for over two decades, in particular facial action coding system action unit (AU) detection and classification of a number of discrete emotion states from facial expressive imagery. Standardization and comparability have received some attention; for instance, there exist a number of commonly used facial expression databases. However, lack of a commonly accepted evaluation protocol and, typically, lack of sufficient details needed to reproduce the reported individual results make it difficult to compare systems. This, in turn, hinders the progress of the field. A periodical challenge in facial expression recognition would allow such a comparison on a level playing field. It would provide an insight on how far the field has come and would allow researchers to identify new goals, challenges, and targets. This paper presents a meta-analysis of the first such challenge in automatic recognition of facial expressions, held during the IEEE conference on Face and Gesture Recognition 2011. It details the challenge data, evaluation protocol, and the results attained in two subchallenges: AU detection and classification of facial expression imagery in terms of a number of discrete emotion categories. We also summarize the lessons learned and reflect on the future of the field of facial expression recognition in general and on possible future challenges in particular.


IEEE International Conference on Automatic Face & Gesture Recognition | 2015

FERA 2015 - second Facial Expression Recognition and Analysis challenge

Michel F. Valstar; Timur R. Almaev; Jeffrey M. Girard; Marc Mehu; Lijun Yin; Maja Pantic; Jeffrey F. Cohn

Despite efforts towards evaluation standards in facial expression analysis (e.g. FERA 2011), there is a need for up-to-date standardised evaluation procedures, focusing in particular on current challenges in the field. One of the challenges that is actively being addressed is the automatic estimation of expression intensities. To continue to provide a standardisation platform and to help the field progress beyond its current limitations, the FG 2015 Facial Expression Recognition and Analysis challenge (FERA 2015) will challenge participants to estimate FACS Action Unit (AU) intensity as well as AU occurrence on a common benchmark dataset with reliable manual annotations. Evaluation will be done using a clear and well-defined protocol. In this paper we present the second such challenge in automatic recognition of facial expressions, to be held in conjunction with the 11th IEEE conference on Face and Gesture Recognition, May 2015, in Ljubljana, Slovenia. Three sub-challenges are defined: the detection of AU occurrence, the estimation of AU intensity for pre-segmented data, and fully automatic AU intensity estimation. In this work we outline the evaluation protocol, the data used, and the results of a baseline method for the three sub-challenges.
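As a similarly hypothetical sketch of what scoring an intensity sub-challenge could look like (the choice of Pearson correlation as the metric, the 0-5 intensity scale, and the toy traces are assumptions, not the official FERA 2015 protocol):

```python
# Hypothetical sketch: correlation between annotated and predicted
# frame-level AU intensity traces. The metric choice and toy data are
# illustrative assumptions, not the official FERA 2015 evaluation.
import math
from typing import Sequence

def pearson(x: Sequence[float], y: Sequence[float]) -> float:
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy) if sx and sy else 0.0

# Toy example: one AU's intensity (0-5) over seven frames.
annotated = [0, 1, 3, 4, 4, 2, 0]
predicted = [0, 1, 2, 4, 3, 2, 1]
print(f"AU intensity correlation: {pearson(annotated, predicted):.3f}")
```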


Journal of Evolutionary Psychology | 2007

Duchenne smiles and the perception of generosity and sociability in faces

Marc Mehu; Anthony C. Little; R. I. M. Dunbar

Although Duchenne smiles have been shown to have a social signal value, there is limited evidence as to whether this effect generalises to most positive attributes, or whether it is restricted to a particular social domain. As opposed to non-Duchenne smiles, Duchenne smiles involve the activity of facial muscles in the eye region (orbicularis oculi). The hypothesis that Duchenne and non-Duchenne smiles produce different responses in receivers was tested in a face perception experiment. People were asked to rate neutral and smiling faces on ten attributes: attractiveness, generosity, trustworthiness, competitiveness, health, agreeableness, conscientiousness, extroversion, neuroticism, and openness to experience. Results showed that the type of smile had a stronger impact on the ratings of generosity and extroversion. The difference between neutral and smiling was larger when faces showed a Duchenne than a non-Duchenne smile, though the effect of smile type on attributions of generosity appeared to be restricted to male faces. Therefore the Duchenne marker shows some specificity to judgements of altruism and sociability.


Affective Computing and Intelligent Interaction | 2009

Spotting agreement and disagreement: A survey of nonverbal audiovisual cues and tools

Konstantinos Bousmalis; Marc Mehu; Maja Pantic

While detecting and interpreting temporal patterns of non-verbal behavioral cues in a given context is a natural and often unconscious process for humans, it remains a rather difficult task for computer systems. Nevertheless, it is an important one to achieve if the goal is to realise a naturalistic communication between humans and machines. Machines that are able to sense social attitudes like agreement and disagreement and respond to them in a meaningful way are likely to be welcomed by users due to the more natural, efficient and human-centered interaction they are bound to experience. This paper surveys the nonverbal cues that could be present during agreement and disagreement behavioural displays and lists a number of tools that could be useful in detecting them, as well as a few publicly available databases that could be used to train these tools for analysis of spontaneous, audiovisual instances of agreement and disagreement.


Social Psychological and Personality Science | 2011

Subtly Different Positive Emotions Can Be Distinguished by Their Facial Expressions

Marcello Mortillaro; Marc Mehu; Klaus R. Scherer

Positive emotions are crucial to social relationships and social interaction. Although smiling is a frequently studied facial action, investigations of positive emotional expressions are underrepresented in the literature. This may be partly because of the assumption that all positive emotions share the smile as a common signal but lack specific facial configurations. The present study investigated prototypical expressions of four positive emotions—interest, pride, pleasure, and joy. The Facial Action Coding System was used to microcode facial expression of representative samples of these emotions taken from the Geneva Multimodal Emotion Portrayal corpus. The data showed that the frequency and duration of several action units differed between emotions, indicating that actors did not use the same pattern of expression to encode them. The authors argue that an appraisal perspective is suitable to describe how subtly differentiated positive emotional states differ in their prototypical facial expressions.
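To make concrete the kind of summary such microcoding yields, the sketch below tallies the frequency and mean duration of coded action-unit events per emotion category; the event records are invented toy data, not measurements from the GEMEP corpus:

```python
# Hypothetical sketch: per-emotion frequency and mean duration of coded
# action-unit events. The event tuples below are invented toy data.
from collections import defaultdict
from statistics import mean

# Each coded event: (emotion, action_unit, duration_in_seconds)
events = [
    ("pride", "AU12", 1.8), ("pride", "AU53", 0.9),
    ("joy", "AU12", 2.4), ("joy", "AU6", 2.1),
    ("interest", "AU1", 0.7), ("interest", "AU2", 0.6),
]

durations = defaultdict(list)
for emotion, au, duration in events:
    durations[(emotion, au)].append(duration)

for (emotion, au), ds in sorted(durations.items()):
    print(f"{emotion:>8} {au:>5}: n={len(ds)}, mean duration={mean(ds):.2f}s")
```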


Visual Analysis of Humans | 2011

Social Signal Processing: The Research Agenda

Maja Pantic; Roderick Cowie; Francesca D'Errico; Dirk Heylen; Marc Mehu; Catherine Pelachaud; Isabella Poggi; Marc Schroeder; Alessandro Vinciarelli

The exploration of how we react to the world and interact with it and each other remains one of the greatest scientific challenges. Latest research trends in cognitive sciences argue that our common view of intelligence is too narrow, ignoring a crucial range of abilities that matter immensely for how people do in life. This range of abilities is called social intelligence and includes the ability to express and recognise social signals produced during social interactions like agreement, politeness, empathy, friendliness, conflict, etc., coupled with the ability to manage them in order to get along well with others while winning their cooperation. Social Signal Processing (SSP) is the new research domain that aims at understanding and modelling social interactions (human-science goals), and at providing computers with similar abilities in human-computer interaction scenarios (technological goals). SSP is in its infancy, and the journey towards artificial social intelligence and socially-aware computing is still long. This research agenda is twofold: a discussion of how the field is understood by people who are currently active in it, and a discussion of the issues that researchers in this formative field face.


Cognitive Processing | 2012

A psycho-ethological approach to social signal processing

Marc Mehu; Klaus R. Scherer

The emerging field of social signal processing can benefit from a theoretical framework to guide future research activities. The present article aims at drawing attention to two areas of research that devoted considerable efforts to the understanding of social behaviour: ethology and social psychology. With a long tradition in the study of animal signals, ethology and evolutionary biology have developed theoretical concepts to account for the functional significance of signalling. For example, the consideration of divergent selective pressures responsible for the evolution of signalling and social cognition emphasized the importance of two classes of indicators: informative cues and communicative signals. Social psychology, on the other hand, investigates emotional expression and interpersonal relationships, with a focus on the mechanisms underlying the production and interpretation of social signals and cues. Based on the theoretical considerations developed in these two fields, we propose a model that integrates the processing of perceivable individual features (social signals and cues) with contextual information, and we suggest that the output of computer-based processing systems should be derived in terms of functional significance rather than in terms of absolute conceptual meaning.


Folia Primatologica | 2008

Relationship between Smiling and Laughter in Humans (Homo sapiens): Testing the Power Asymmetry Hypothesis

Marc Mehu; R. I. M. Dunbar

The power asymmetry hypothesis claims that individuals should have distinct signals of appeasement/affiliation and play when status difference is high, whereas these signals should overlap in egalitarian interactions. Naturalistic observations were conducted on humans interacting in groups that differed in terms of age composition (and presumably social status). Three affiliative behaviours were recorded by focal sampling: spontaneous smiles, deliberate smiles and laughter. Interestingly, young men showed significantly higher proportions of deliberate smiles in comparison to laughter when interacting with people of a different age class than when interacting in same-age groups. The pattern of affiliative behaviours in women remained unaffected by the age composition of groups. This partly supports the power asymmetry hypothesis and suggests that in men, deliberate smiles could play a role in the regulation of hierarchical relationships.


Emotion Review | 2013

Understanding the Mechanisms Underlying the Production of Facial Expression of Emotion: A Componential Perspective

Klaus R. Scherer; Marcello Mortillaro; Marc Mehu

We highlight the need to focus on the underlying determinants and production mechanisms to fully understand the nature of facial expression of emotion and to settle the theoretical debate about the meaning of motor expression. Although emotion theorists have generally remained rather vague about the details of the process, this has been a central concern of componential appraisal theories. We describe the fundamental assumptions and predictions of this approach regarding the patterning of facial expressions for different emotions. We also review recent evidence for the assumption that specific facial muscle movements may be reliable symptoms of certain appraisal outcomes and that facial expressions unfold over time on the basis of a sequence of appraisal check results.

Collaboration


Dive into Marc Mehu's collaborations.

Top Co-Authors

Maja Pantic
Imperial College London

Pia Nystrom
University of Sheffield