Publication


Featured research published by Alexis Clay.


International Symposium on Mixed and Augmented Reality | 2012

Interactions and systems for augmenting a live dance performance

Alexis Clay; Nadine Couture; Laurence Nigay; Jean-Baptiste de la Rivière; Jean-Claude Martin; Matthieu Courgeon; Myriam Desainte-Catherine; Emmanuel Orvain; Vincent Girondel; Gaël Domenger

The context of this work is to develop, adapt and integrate augmented-reality-related tools to enhance the emotion involved in cultural performances. Part of the work was dedicated to augmenting a stage in a live performance, with dance as an application case. In this paper, we present a milestone of this work: an augmented dance show that brings together several tools and technologies developed over the project's lifetime. This is the result of mixing an artistic process with scientific research and development. This augmented show brings to the stage issues from the research fields of Human-Machine Interaction (HMI) and Augmented Reality (AR). Virtual elements (visual and audio) are added on stage, and the dancer is able to interact with them in real time using different interaction techniques. The originality of this work is threefold. Firstly, we propose a set of movement-based interaction techniques that can be used independently on stage or in another context. In this set, some techniques are direct, while others go through a high level of abstraction: we performed movement-based emotion recognition on the dancer, and used the recognized emotions to generate emotional music pieces and emotional poses for a humanoid robot. Secondly, those interaction techniques rely on various interconnected systems that can be reassembled. We hence propose an integrated, interactive system for augmenting a live performance, a context where system failure is not tolerated. The final system can be adapted to follow the artist's preferences. Finally, those systems were validated through a field experiment - the show itself - after which we gathered and analyzed feedback from both the audience and the choreographer.
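
The abstract above describes a pipeline in which movement-based emotion recognition drives several outputs at once (generated music, robot poses). As a rough illustration only, the sketch below shows one way such a pipeline could be wired; every name, feature and mapping in it is a hypothetical placeholder, not the system built for the show.

```python
# Minimal sketch (hypothetical names) of the emotion-driven pipeline described
# above: movement features from the dancer are classified into an emotion,
# which then drives both music generation and a humanoid robot pose.
from dataclasses import dataclass

@dataclass
class MovementFeatures:
    expansion: float    # how open / spread the posture is
    speed: float        # mean limb velocity
    verticality: float  # upward vs. downward orientation of the torso

def recognize_emotion(features: MovementFeatures) -> str:
    """Toy stand-in for the movement-based emotion classifier."""
    if features.speed > 0.7:
        return "anger" if features.expansion > 0.5 else "fear"
    return "joy" if features.verticality > 0.5 else "sadness"

def music_parameters(emotion: str) -> dict:
    """Map the recognized emotion to coarse music-generation parameters."""
    table = {
        "joy":     {"tempo": 140, "mode": "major"},
        "sadness": {"tempo": 60,  "mode": "minor"},
        "anger":   {"tempo": 160, "mode": "minor"},
        "fear":    {"tempo": 100, "mode": "minor"},
    }
    return table[emotion]

def robot_pose(emotion: str) -> str:
    """Map the recognized emotion to a named pose of the humanoid robot."""
    return {"joy": "arms_raised", "sadness": "head_down",
            "anger": "fists_forward", "fear": "crouch"}[emotion]

# One frame of the show: the same recognized emotion feeds every output.
frame = MovementFeatures(expansion=0.8, speed=0.9, verticality=0.6)
emotion = recognize_emotion(frame)
print(emotion, music_parameters(emotion), robot_pose(emotion))
```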


Proceedings of WinVR'09, the ASME/AFM 2009 World Conference on Innovative Virtual Reality | 2009

Towards an Architecture Model for Emotion Recognition in Interactive Systems: Application to a Ballet Dance Show

Alexis Clay; Nadine Couture; Laurence Nigay

In the context of the very dynamic and challenging domain of affective computing, we adopt a software-engineering point of view on emotion recognition in interactive systems. Our goal is threefold: first, to develop an architecture model for emotion recognition that emphasizes multimodality and reusability; second, to develop a prototype based on this architecture model, focusing on gesture-based emotion recognition; and third, to use this prototype to augment a ballet dance show.
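
As a loose illustration of an architecture model that emphasizes multimodality and reusability, the sketch below defines a common recognizer interface per modality so that channels can be added or swapped independently. The interfaces and scores are assumptions made for illustration, not the model proposed in the paper.

```python
# Illustrative sketch only (hypothetical interfaces): one abstract recognizer
# per modality, so gesture, voice, or other channels can be plugged in and
# reused across applications behind the same contract.
from abc import ABC, abstractmethod
from typing import Dict, List

class EmotionRecognizer(ABC):
    """Common contract every modality-specific recognizer implements."""
    @abstractmethod
    def recognize(self, sample) -> Dict[str, float]:
        """Return a score per emotion label for one input sample."""

class GestureEmotionRecognizer(EmotionRecognizer):
    def recognize(self, sample) -> Dict[str, float]:
        # Placeholder scores; a real implementation would analyze motion data.
        return {"joy": 0.6, "sadness": 0.1, "anger": 0.2, "fear": 0.1}

class InteractiveSystem:
    """Hosts any number of recognizers behind the same interface."""
    def __init__(self) -> None:
        self.recognizers: List[EmotionRecognizer] = []

    def add_modality(self, recognizer: EmotionRecognizer) -> None:
        self.recognizers.append(recognizer)

    def current_emotion(self, sample) -> str:
        # Sum the scores of all registered modalities, then pick the best label.
        totals: Dict[str, float] = {}
        for recognizer in self.recognizers:
            for label, score in recognizer.recognize(sample).items():
                totals[label] = totals.get(label, 0.0) + score
        return max(totals, key=totals.get)

system = InteractiveSystem()
system.add_modality(GestureEmotionRecognizer())
print(system.current_emotion(sample=None))
```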


Arts and Technology | 2009

Augmenting a Ballet Dance Show Using the Dancer's Emotion: Conducting Joint Research in Dance and Computer Science

Alexis Clay; Elric Delord; Nadine Couture; Gaël Domenger

We describe the joint research that we conduct in gesture-based emotion recognition and virtual augmentation of a stage, bridging the fields of computer science and dance. After establishing a common ground for dialogue, we were able to conduct a research process that benefits both fields equally. As computer scientists, we find dance to be a perfect application case, and dancers' artistic creativity orients our research choices. As dancers, we find that computer science provides new tools for creativity and, more importantly, a new point of view that forces us to reconsider dance from its fundamentals. In this paper we hence describe our scientific work and its implications for dance. We provide an overview of our system for augmenting a ballet stage, taking the dancer's emotion into account. To illustrate our work in both fields, we describe three events that mixed dance, emotion recognition and augmented reality.


Symposium on Spatial User Interaction | 2013

Towards bi-manual 3D painting: generating virtual shapes with hands

Alexis Clay; Jean-Christophe Lombardo; Julien Conan; Nadine Couture

We aim at combining surface generation by hands with 3D painting in a large space, from 10 to ~200 m² (for a stage setup). Our long-term goal is to bring 3D surface generation into choreography, in order to produce augmented dance shows where the dancer can draw 3D elements (characters, sets) while dancing. We present two systems: the first in a CAVE environment, the second better adapted to a stage setup. We compare the two systems and report an exploratory user experiment conducted with both laypersons and dancers.
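
A minimal sketch of one plausible reading of "generating virtual shapes with hands": sample both tracked hand positions over time and span a ruled surface between the two trajectories. This is an assumption about the technique made for illustration, not the algorithm of the systems described in the paper.

```python
# Hedged sketch: build a quad strip (as triangles) between the trajectories of
# the left and right hands, one quad per pair of consecutive frames.
import numpy as np

def ruled_surface(left_hand: np.ndarray, right_hand: np.ndarray) -> np.ndarray:
    """left_hand, right_hand: (T, 3) arrays of 3D positions over T frames.
    Returns the triangles of the strip spanned between the two trajectories,
    shaped (2*(T-1), 3, 3)."""
    triangles = []
    for t in range(len(left_hand) - 1):
        l0, l1 = left_hand[t], left_hand[t + 1]
        r0, r1 = right_hand[t], right_hand[t + 1]
        triangles.append([l0, r0, l1])  # first triangle of the quad
        triangles.append([r0, r1, l1])  # second triangle of the quad
    return np.array(triangles)

# Two short synthetic hand trajectories, one metre apart along the x axis.
ts = np.linspace(0.0, 1.0, 20)
left = np.stack([np.zeros_like(ts), ts, np.sin(ts * np.pi)], axis=1)
right = left + np.array([1.0, 0.0, 0.0])
print(ruled_surface(left, right).shape)  # (38, 3, 3)
```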


Interaction Homme-Machine | 2010

Reconnaissance d'Emotions: un point de vue interaction multimodale [Emotion Recognition: A Multimodal Interaction Point of View]

Alexis Clay; Nadine Couture; Laurence Nigay

Emotion recognition is a young but maturing research field, for which there is an emerging need for engineering models, and in particular design models. Addressing these engineering challenges, we reuse and adapt results from the research field of multimodal interaction, since the expression of an emotion is intrinsically multimodal. In this paper, we refine the definition of an interaction modality for the case of passive emotion recognition. We also study the combination of modalities by applying the CARE properties (Complementarity, Assignment, Redundancy, Equivalence). We highlight the benefits of our design model for emotion recognition.
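
Since the abstract applies the CARE properties to the combination of modalities, the sketch below illustrates how Equivalence, Redundancy and Complementarity could shape the fusion of two passive emotion-recognition modalities, e.g. body movement and facial expression. The functions and labels are illustrative assumptions, not the paper's design model.

```python
# CARE-style combination of two emotion-recognition modalities (illustrative).
from typing import Dict, Optional

def equivalence(body: Optional[str], face: Optional[str]) -> Optional[str]:
    """Equivalence: either modality alone suffices; use whichever is available."""
    return body if body is not None else face

def redundancy(body: Optional[str], face: Optional[str]) -> Optional[str]:
    """Redundancy: both modalities convey the emotion; accept only agreement."""
    return body if body is not None and body == face else None

def complementarity(scores_body: Dict[str, float],
                    scores_face: Dict[str, float]) -> str:
    """Complementarity: the modalities are combined to produce the result."""
    labels = set(scores_body) | set(scores_face)
    return max(labels, key=lambda l: scores_body.get(l, 0.0) + scores_face.get(l, 0.0))

print(equivalence("joy", None))                     # joy
print(redundancy("joy", "sadness"))                 # None (no agreement)
print(complementarity({"joy": 0.4, "anger": 0.3},
                      {"joy": 0.2, "anger": 0.5}))  # anger
```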


Proceedings of the Ergonomie et Informatique Avancée Conference | 2010

eMotion: un outil pour personnaliser la reconnaissance d'émotions [eMotion: a tool for personalizing emotion recognition]

Alexis Clay; Nadine Couture; Laurence Nigay

A subject's emotional expression is of course influenced by personality, but several other factors can also play a role: the subject's position, the constraints they undergo, or the layout of their working space can all deeply affect emotional expression through movement. When performing automatic emotion recognition, an evaluator must therefore be able to tune the recognition parameters to the evaluation conditions. Current recognition systems do not allow for such adaptability. In this paper we present the eMotion software for movement-based emotion recognition. eMotion's software architecture allows easy integration of graphical widgets for parameterizing emotional feature extraction to suit the task at hand.
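
A minimal sketch, assuming a plugin-style design in which each feature extractor exposes parameters that a graphical widget can edit; the names and the feature itself are hypothetical placeholders, not eMotion's actual API.

```python
# Hypothetical parameterizable feature extraction: a GUI widget would edit the
# `params` of each extractor so the evaluator can adapt recognition to the
# conditions (seated subject, constrained workspace, etc.).
from dataclasses import dataclass, field
from typing import Callable, Dict

@dataclass
class FeatureExtractor:
    name: str
    compute: Callable[[dict], float]         # maps raw movement data to a value
    params: Dict[str, float] = field(default_factory=dict)

def contraction_index(frame: dict) -> float:
    """Toy feature: how contracted the posture is, from bounding-box volume."""
    return 1.0 / (1.0 + frame["bbox_volume"])

extractor = FeatureExtractor(
    name="contraction",
    compute=contraction_index,
    params={"weight": 1.0, "enabled": 1.0},  # what a widget would expose
)

# An evaluator studying a seated subject might down-weight this feature:
extractor.params["weight"] = 0.4

frame = {"bbox_volume": 2.5}
value = extractor.params["enabled"] * extractor.params["weight"] * extractor.compute(frame)
print(round(value, 3))
```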


International Symposium on Mixed and Augmented Reality | 2014

Integrating augmented reality to enhance expression, interaction & collaboration in live performances: A ballet dance case study

Alexis Clay; Gaël Domenger; Julien Conan; Axel Domenger; Nadine Couture


Le travail humain | 2012

Développement d'une plate-forme d'évaluation personnalisable et adaptable pour l'étude du comportement émotionnel en situation de multisollicitation [Development of a customizable and adaptable evaluation platform for studying emotional behavior under multiple simultaneous demands]

Régis Mollard; Marion Wolff; Nadine Couture; Alexis Clay


New Interfaces for Musical Expression | 2012

Movement to emotions to music: using whole body emotional expression as an interaction for electronic music generation

Alexis Clay; Nadine Couture; Myriam Desainte-Catherine; Pierre-Henri Vulliard; Joseph Larralde; Elodie Decarsin


Affective Computing and Intelligent Interaction | 2009

Engineering affective computing: A unifying software architecture

Alexis Clay; Nadine Couture; Laurence Nigay

Collaboration


Dive into Alexis Clay's collaborations.

Top Co-Authors

Laurence Nigay, Joseph Fourier University
Régis Mollard, Paris Descartes University
Marion Wolff, Paris Descartes University
Matthieu Courgeon, Centre national de la recherche scientifique