
Publications


Featured research published by Guillaume Chanel.


ACM Multimedia | 2006

Emotion assessment: arousal evaluation using EEG's and peripheral physiological signals

Guillaume Chanel; Julien Kronegg; Didier Maurice Grandjean; Thierry Pun

The arousal dimension of human emotions is assessed from two different physiological sources: peripheral signals and electroencephalographic (EEG) signals from the brain. A complete acquisition protocol is presented to build a physiological emotional database for real participants. Arousal assessment is then formulated as a classification problem, with classes corresponding to 2 or 3 degrees of arousal. The performance of 2 classifiers has been evaluated on peripheral signals, on EEGs, and on both. Results confirm the possibility of using EEGs to assess the arousal component of emotion, and the interest of multimodal fusion between EEGs and peripheral physiological signals.
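
To make the classification setup above concrete, here is a minimal sketch in Python comparing a classifier trained on peripheral features, on EEG features, and on their concatenation. The feature contents, classifier choice, and data shapes are illustrative assumptions, not details taken from the paper.

# Illustrative sketch only: the study's actual features and classifiers are
# not reproduced here. Shapes and names are assumptions.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials = 200
eeg = rng.normal(size=(n_trials, 32))         # e.g. band-power features per electrode
peripheral = rng.normal(size=(n_trials, 8))   # e.g. GSR, heart rate, respiration stats
y = rng.integers(0, 3, size=n_trials)         # 3 degrees of arousal

for name, X in [("peripheral", peripheral),
                ("EEG", eeg),
                ("fused (concatenated)", np.hstack([eeg, peripheral]))]:
    acc = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=5).mean()
    print(f"{name}: {acc:.2f} accuracy")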


International Journal of Human-Computer Studies / International Journal of Man-Machine Studies | 2009

Short-term emotion assessment in a recall paradigm

Guillaume Chanel; Joep Johannes Maria Kierkels; Mohammad Soleymani; Thierry Pun

The work presented in this paper aims at assessing human emotions using peripheral as well as electroencephalographic (EEG) physiological signals on short-time periods. Three specific areas of the valence-arousal emotional space are defined, corresponding to negatively excited, positively excited, and calm-neutral states. An acquisition protocol based on the recall of past emotional life episodes has been designed to acquire data from both peripheral and EEG signals. Pattern classification is used to distinguish between the three areas of the valence-arousal space. The performance of several classifiers has been evaluated on 10 participants and on different feature sets: peripheral features, EEG time-frequency features, and EEG pairwise mutual information (MI) features. Comparison of results obtained using either peripheral or EEG signals confirms the interest of using EEGs to assess valence and arousal in emotion recall conditions. The obtained accuracy for the three emotional classes is 63% using EEG time-frequency features, which is better than the results obtained in previous studies using EEG and similar classes. Fusion of the different feature sets at the decision level using a summation rule was also shown to improve accuracy, to 70%. Furthermore, the rejection of non-confident samples finally led to a classification accuracy of 80% for the three classes.
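
The decision-level fusion and confidence-based rejection described above can be sketched as follows. This is a simplified illustration; the paper's exact fusion rule, thresholds, and probabilities are not reproduced, and all numbers below are made up.

# Sketch of decision-level fusion with a summation rule and rejection of
# non-confident samples. Thresholds and probabilities are illustrative.
import numpy as np

def fuse_and_reject(prob_sets, threshold=0.5):
    """prob_sets: list of (n_samples, n_classes) class-probability arrays,
    one per feature set (e.g. peripheral, EEG time-frequency, EEG MI)."""
    fused = np.mean(prob_sets, axis=0)          # summation rule (normalized)
    confidence = fused.max(axis=1)
    labels = fused.argmax(axis=1)
    labels[confidence < threshold] = -1         # -1 marks rejected samples
    return labels

p_peripheral = np.array([[0.2, 0.5, 0.3], [0.4, 0.3, 0.3]])
p_eeg_tf     = np.array([[0.1, 0.7, 0.2], [0.3, 0.4, 0.3]])
print(fuse_and_reject([p_peripheral, p_eeg_tf]))  # prints [ 1 -1]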


Systems, Man and Cybernetics | 2011

Emotion Assessment From Physiological Signals for Adaptation of Game Difficulty

Guillaume Chanel; Cyril Rebetez; Mireille Bétrancourt; Thierry Pun

This paper proposes to maintain players' engagement by adapting game difficulty according to players' emotions assessed from physiological signals. The validity of this approach was first tested by analyzing the questionnaire responses, electroencephalogram (EEG) signals, and peripheral signals of players playing a Tetris game at three difficulty levels. This analysis confirms that the different difficulty levels correspond to distinguishable emotions, and that playing several times at the same difficulty level gives rise to boredom. The next step was to train several classifiers to automatically detect the three emotional classes from EEG and peripheral signals in a player-independent framework. Using either type of signal, the emotional classes were successfully recovered, with EEG yielding better accuracy than peripheral signals on short periods of time. After fusion of the two signal categories, the accuracy rose to 63%.
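
The adaptation logic that motivates this work can be summarized as a simple control loop. This is a hypothetical sketch: the paper evaluates emotion detection rather than publishing an adaptation algorithm, and the mapping below follows the general flow-theory intuition (boredom means under-challenged, anxiety means over-challenged), not the authors' exact rules.

# Hypothetical difficulty-adaptation loop driven by the detected emotional
# state; the mapping is a flow-theory-style assumption, not the paper's.
def adapt_difficulty(current_level, detected_state):
    if detected_state == "boredom":      # player under-challenged
        return current_level + 1
    if detected_state == "anxiety":      # player over-challenged
        return max(1, current_level - 1)
    return current_level                 # "engagement": keep the level

level = 3
for state in ["engagement", "boredom", "anxiety"]:
    level = adapt_difficulty(level, state)
    print(state, "->", level)            # 3, then 4, then 3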


Proceedings of the 12th International Conference on Entertainment and Media in the Ubiquitous Era | 2008

Boredom, engagement and anxiety as indicators for adaptation to difficulty in games

Guillaume Chanel; Cyril Rebetez; Mireille Bétrancourt; Thierry Pun

This paper proposes an approach based on emotion recognition to maintain the engagement of players in a game by modulating the game difficulty. Physiological and questionnaire data were gathered from 20 players during and after playing a Tetris game at different difficulty levels. Both physiological and self-report analyses led to the conclusion that playing at different levels gave rise to different emotional states and that playing at the same level of difficulty several times elicited boredom. Emotion assessment from physiological signals was performed using an SVM (support vector machine). An accuracy of 53.33% was obtained on the discrimination of three emotional classes, namely boredom, anxiety, and engagement.
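
A minimal sketch of the SVM classification step, assuming generic physiological feature vectors; the paper's actual features, kernel, and validation scheme are not specified here.

# Sketch: 3-class SVM on synthetic stand-ins for physiological features.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.normal(size=(120, 10))            # physiological feature vectors
y = rng.integers(0, 3, size=120)          # 0=boredom, 1=anxiety, 2=engagement

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
print(cross_val_score(clf, X, y, cv=5).mean())  # ~0.33 (chance) on random data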


Systems, Man and Cybernetics | 2007

Valence-arousal evaluation using physiological signals in an emotion recall paradigm

Guillaume Chanel; Karim Ansari-Asl; Thierry Pun

The work presented in this paper aims at assessing human emotions using peripheral as well as electroencephalographic (EEG) physiological signals. Three specific areas of the valence-arousal emotional space are defined, corresponding to negatively excited, positively excited, and calm-neutral states. An acquisition protocol based on the recall of past emotional events has been designed to acquire data from both peripheral and EEG signals. Pattern classification is used to distinguish between the three areas of the valence-arousal space. The performance of two classifiers has been evaluated on different feature sets: peripheral data, EEG data, and EEG data with prior feature selection. Comparison of results obtained using either peripheral or EEG signals confirms the interest of using EEGs to assess valence and arousal in emotion recall conditions.
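
The "EEG data with prior feature selection" condition could be realized with a standard filter-style selector inside a classification pipeline, as in the sketch below. The selection criterion, classifier, and feature counts are assumptions for illustration, not the paper's choices.

# Sketch: univariate feature selection before classification, one possible
# reading of "EEG data with prior feature selection". Criterion is assumed.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.naive_bayes import GaussianNB
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
X_eeg = rng.normal(size=(90, 256))        # many EEG features, few trials
y = rng.integers(0, 3, size=90)

clf = make_pipeline(SelectKBest(f_classif, k=20), GaussianNB())
print(cross_val_score(clf, X_eeg, y, cv=5).mean())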


International Symposium on Multimedia | 2008

Affective Characterization of Movie Scenes Based on Multimedia Content Analysis and User's Physiological Emotional Responses

Mohammad Soleymani; Guillaume Chanel; Joep Johannes Maria Kierkels; Thierry Pun

In this paper, we propose an approach for affective representation of movie scenes based on the emotions that are actually felt by spectators. Such a representation can be used for characterizing the emotional content of video clips for e.g. affective video indexing and retrieval, neuromarketing studies, etc. A dataset of 64 different scenes from eight movies was shown to eight participants. While watching these clips, their physiological responses were recorded. The participants were also asked to self-assess their felt emotional arousal and valence for each scene. In addition, content-based audio- and video-based features were extracted from the movie scenes in order to characterize each one. Degrees of arousal and valence were estimated by a linear combination of features from physiological signals, as well as by a linear combination of content-based features. We showed that a significant correlation exists between arousal/valence provided by the spectators self-assessments, and affective grades obtained automatically from either physiological responses or from audio-video features. This demonstrates the ability of using multimedia features and physiological responses to predict the expected affect of the user in response to the emotional video content.
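
The "linear combination of features" used to estimate arousal and valence amounts to ordinary linear regression; the sketch below illustrates the idea with synthetic stand-ins. Feature contents and dimensions are assumptions, not the paper's.

# Sketch: estimating self-reported arousal by a linear combination of
# features, once from physiological signals and once from audio-video
# content features. Data are synthetic stand-ins.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)
n_scenes = 64
physio = rng.normal(size=(n_scenes, 12))      # e.g. GSR/heart-rate statistics
content = rng.normal(size=(n_scenes, 20))     # e.g. audio energy, motion, cuts
arousal = rng.uniform(1, 9, size=n_scenes)    # self-assessed arousal ratings

for name, X in [("physiological", physio), ("content-based", content)]:
    model = LinearRegression().fit(X, arousal)
    print(name, "R^2 =", round(model.score(X, arousal), 2))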


Brain-Computer Interfaces | 2014

A survey of affective brain computer interfaces: principles, state-of-the-art, and challenges

Christian Mühl; Brendan Z. Allison; Anton Nijholt; Guillaume Chanel

Affective states, moods and emotions, are an integral part of human nature: they shape our thoughts, govern the behavior of the individual, and influence our interpersonal relationships. The last decades have seen a growing interest in the automatic detection of such states from voice, facial expression, and physiological signals, primarily with the goal of enhancing human-computer interaction with an affective component. With the advent of brain-computer interface research, the idea of affective brain-computer interfaces (aBCI), enabling affect detection from brain signals, arose. In this article, we set out to survey the field of neurophysiology-based affect detection. We outline possible applications of aBCI in a general taxonomy of brain-computer interface approaches and introduce the core concepts of affect and their neurophysiological fundamentals. We show that there is a growing body of literature that evidences the capabilities, but also the limitations and challenges of affect detection from neurophysiological activity.


Proceedings of the 2nd ACM Workshop on Multimedia Semantics | 2008

Affective ranking of movie scenes using physiological signals and content analysis

Mohammad Soleymani; Guillaume Chanel; Joep Johannes Maria Kierkels; Thierry Pun

In this paper, we propose an approach for affective ranking of movie scenes based on the emotions that are actually felt by spectators. Such a ranking can be used for characterizing the affective, or emotional, content of video clips. The ranking can for instance help determine which video clip from a database elicits, for a given user, the most joy. This in turn will permit video indexing and retrieval based on affective criteria corresponding to a personalized user affective profile. A dataset of 64 different scenes from 8 movies was shown to eight participants. While watching, their physiological responses were recorded; namely, five peripheral physiological signals (GSR - galvanic skin resistance, EMG - electromyograms, blood pressure, respiration pattern, skin temperature) were acquired. After watching each scene, the participants were asked to self-assess their felt arousal and valence for that scene. In addition, movie scenes were analyzed in order to characterize each with various audio- and video-based features capturing the key elements of the events occurring within that scene. Arousal and valence levels were estimated by a linear combination of features from physiological signals, as well as by a linear combination of content-based audio and video features. We show that a correlation exists between the arousal- and valence-based rankings provided by the spectators' self-assessments and the rankings obtained automatically from either physiological signals or audio-video features. This demonstrates the feasibility of using participants' physiological responses to characterize video scenes and to rank them according to their emotional content. It further shows that audio-visual features, either individually or combined, can fairly reliably be used to predict the spectator's felt emotion for a given scene. The results also confirm that participants exhibit different affective responses to movie scenes, which emphasizes the need for emotional profiles to be user-dependent.
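
Agreement between two rankings of this kind is typically measured with a rank correlation; the sketch below uses Spearman's rho, without asserting that this is the exact statistic used in the paper. The data are synthetic.

# Sketch: comparing an affective ranking derived from predictions with the
# ranking from self-assessments, via Spearman rank correlation.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(4)
self_reported_arousal = rng.uniform(1, 9, size=64)          # one value per scene
predicted_arousal = self_reported_arousal + rng.normal(scale=1.5, size=64)

rho, p_value = spearmanr(self_reported_arousal, predicted_arousal)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3g})")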


IEEE Transactions on Neural Systems and Rehabilitation Engineering | 2007

EEG-Based Synchronized Brain-Computer Interfaces: A Model for Optimizing the Number of Mental Tasks

Julien Kronegg; Guillaume Chanel; Sviatoslav Voloshynovskiy; Thierry Pun

The information-transfer rate (ITR) is commonly used to assess the performance of brain-computer interfaces (BCIs). Various studies have shown that the optimal number of mental tasks to use is fairly low, around 3 or 4. We propose an experimental validation as well as a formal approach to demonstrate and confirm that this optimum depends on the user and on the BCI design. Even if increasing the number of mental tasks to the optimum indeed leads to an increase of the ITR, the gain remains small. This might not justify the added complexity in terms of protocol design.
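
For context, the most widely used ITR definition (due to Wolpaw et al.) combines the number of classes N and the accuracy P; whether the paper uses this exact definition is not asserted here. A worked computation shows how the bits-per-decision figure can peak at a small number of tasks when accuracy drops as tasks are added (the accuracy values below are hypothetical):

# Wolpaw's information-transfer rate in bits per decision:
#   B = log2(N) + P*log2(P) + (1 - P)*log2((1 - P) / (N - 1))
# Multiplying by decisions per minute gives bits/minute.
from math import log2

def itr_bits_per_decision(n_classes, accuracy):
    if accuracy <= 0 or accuracy >= 1:
        raise ValueError("use 0 < accuracy < 1")
    return (log2(n_classes)
            + accuracy * log2(accuracy)
            + (1 - accuracy) * log2((1 - accuracy) / (n_classes - 1)))

# Hypothetical accuracies decreasing with task count; ITR peaks near 3 tasks.
for n, p in [(2, 0.90), (3, 0.80), (4, 0.70), (5, 0.60)]:
    print(n, "tasks:", round(itr_bits_per_decision(n, p), 3), "bits/decision")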


Affective Computing and Intelligent Interaction | 2009

A Bayesian framework for video affective representation

Mohammad Soleymani; Joep Johannes Maria Kierkels; Guillaume Chanel; Thierry Pun

Emotions that are elicited in response to a video scene contain valuable information for multimedia tagging and indexing. The novelty of this paper is to introduce a Bayesian classification framework for affective video tagging that allows taking contextual information into account. A set of 21 full-length movies was first segmented, and informative content-based features were extracted from each shot and scene. Shots were then emotionally annotated, providing ground-truth affect. The arousal of shots was computed using a linear regression on the content-based features. Bayesian classification based on the shots' arousal and content-based features allowed tagging these scenes into three affective classes, namely calm, positively excited, and negatively excited. To improve classification accuracy, two contextual priors were proposed: a movie genre prior, and a temporal prior consisting of the probability of transition between emotions in consecutive scenes. The F1 classification measure of 54.9% obtained on the three emotional classes with a naïve Bayes classifier improved to 63.4% after utilizing all the priors.
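
The two contextual priors can be folded into a naïve-Bayes-style decision as extra multiplicative factors, as the schematic sketch below shows. All probabilities here are made up for illustration; the paper's actual estimates and combination scheme are not reproduced.

# Schematic: combining a content likelihood with a genre prior and a
# transition prior between consecutive scenes. Numbers are illustrative.
import numpy as np

classes = ["calm", "positively excited", "negatively excited"]

likelihood = np.array([0.2, 0.5, 0.3])       # p(features | class) from naive Bayes
genre_prior = np.array([0.1, 0.6, 0.3])      # p(class | genre), e.g. comedy
transition_prior = np.array([                # p(class_t | class_{t-1})
    [0.6, 0.2, 0.2],
    [0.3, 0.5, 0.2],
    [0.3, 0.2, 0.5],
])
previous_class = 1                           # previous scene: positively excited

posterior = likelihood * genre_prior * transition_prior[previous_class]
posterior /= posterior.sum()                 # normalize to a distribution
print(classes[int(posterior.argmax())], posterior.round(3))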

Collaboration


Dive into Guillaume Chanel's collaborations.

Top Co-Authors

Gaëlle Molinari

École Polytechnique Fédérale de Lausanne
