Frédéric Bevilacqua
Pierre and Marie Curie University
Publication
Featured research published by Frédéric Bevilacqua.
KSII Transactions on Internet and Information Systems | 2015
Baptiste Caramiaux; Nicola Montecchio; Atau Tanaka; Frédéric Bevilacqua
This article presents a gesture recognition/adaptation system for human--computer interaction applications that goes beyond activity classification and that, as a complement to gesture labeling, characterizes the movement execution. We describe a template-based recognition method that simultaneously aligns the input gesture to the templates using a Sequential Monte Carlo inference technique. Contrary to standard template-based methods based on dynamic programming, such as Dynamic Time Warping, the algorithm has an adaptation process that tracks gesture variation in real time. The method continuously updates, during execution of the gesture, the estimated parameters and recognition results, which offers key advantages for continuous human--machine interaction. The technique is evaluated in several different ways: Recognition and early recognition are evaluated on 2D onscreen pen gestures; adaptation is assessed on synthetic data; and both early recognition and adaptation are evaluated in a user study involving 3D free-space gestures. The method is robust to noise, and successfully adapts to parameter variation. Moreover, it performs recognition as well as or better than nonadapting offline template-based methods.
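The core idea of the Sequential Monte Carlo alignment can be illustrated with a toy sketch (a hypothetical simplification, not the authors' implementation; the particle count, noise levels, and observation variance are invented). Each particle carries an alignment position and a relative speed within the template, is propagated with noise, weighted by how well the template point at its position matches the incoming sample, and then resampled:

```python
import numpy as np

def smc_gesture_follower(template, observations, n_particles=500, rng=None):
    """Toy sketch of SMC-based template alignment. Each particle holds a
    position within the template and a relative execution speed; weights
    come from how well the template point at that position matches the
    current observation."""
    if rng is None:
        rng = np.random.default_rng(0)
    T = len(template)
    pos = np.zeros(n_particles)      # alignment index into the template
    speed = np.ones(n_particles)     # relative execution speed
    estimates = []
    for obs in observations:
        # propagate: advance each particle by its (noisy) speed
        speed += rng.normal(0.0, 0.05, n_particles)
        pos = np.clip(pos + speed, 0, T - 1)
        # weight: Gaussian likelihood of the observation given the
        # template point at each particle's position (variance invented)
        pred = template[pos.astype(int)]
        w = np.exp(-0.5 * np.sum((pred - obs) ** 2, axis=1) / 0.01)
        w += 1e-12
        w /= w.sum()
        # resample proportionally to the weights
        idx = rng.choice(n_particles, n_particles, p=w)
        pos, speed = pos[idx], speed[idx]
        estimates.append(pos.mean())  # point estimate of alignment
    return np.array(estimates)
```

Running this on a template replayed at double speed, the averaged particle position advances through the template faster than real time, illustrating the kind of execution-speed variation the paper's method tracks continuously.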
Designing Interactive Systems | 2012
Sarah Fdili Alaoui; Baptiste Caramiaux; Marcos Serrano; Frédéric Bevilacqua
In this paper, we explore the use of movement qualities as an interaction modality. The notion of movement qualities is widely used in dance practice and can be understood as how a movement is performed, independently of its specific trajectory in space. We implemented our approach in the context of an artistic installation called A light touch, which invites participants to interact with a moving light spot that reacts to the movement qualities of the hand. We conducted a user experiment showing that such an interaction based on movement qualities tends to enhance the user experience, favouring explorative and expressive usage.
ACM Multimedia | 2013
Jules Françoise; Norbert Schnell; Frédéric Bevilacqua
In this paper, we propose a multimodal approach to creating the mapping between gesture and sound in interactive music systems. Specifically, we propose to use a multimodal HMM to jointly model the gesture and sound parameters. Our approach is compatible with a learning method that allows users to define the gesture–sound relationships interactively. We describe an implementation of this method for the control of physical-modeling sound synthesis. Our model shows promise for capturing expressive gesture variations while guaranteeing a consistent relationship between gesture and sound.
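A minimal sketch of the joint-model idea (the parameters and function names below are invented toy values, not the paper's system): each hidden state holds a Gaussian over gesture and sound features; at runtime the forward algorithm is run on the gesture stream alone, and the sound parameters are read out as the state-posterior-weighted mixture of the per-state sound means.

```python
import numpy as np

def multimodal_hmm_predict(gesture_seq, means_g, means_s, var_g, trans, prior):
    """Toy sketch of multimodal-HMM gesture-to-sound mapping: run the
    forward recursion on the gesture observations, then output the
    expected sound parameter under the current state posterior."""
    alpha = prior.copy()
    out = []
    for g in gesture_seq:
        # per-state Gaussian likelihood of the gesture observation
        lik = np.exp(-0.5 * (g - means_g) ** 2 / var_g) / np.sqrt(2 * np.pi * var_g)
        # forward update: transition, then weight by the likelihood
        alpha = lik * (trans.T @ alpha)
        alpha /= alpha.sum()
        # expected sound parameter given the state posterior
        out.append(alpha @ means_s)
    return np.array(out)
```

With a left-to-right transition matrix and gesture/sound means paired per state, a gesture ramp makes the predicted sound parameter follow the corresponding sound trajectory, which is the consistent gesture–sound relationship the abstract refers to.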
Computer Music Journal | 2014
Baptiste Caramiaux; Jules Françoise; Norbert Schnell; Frédéric Bevilacqua
Gesture-to-sound mapping is generally defined as the association between gestural and sound parameters. This article describes an approach that brings forward the perception–action loop as a fundamental design principle for gesture–sound mapping in digital musical instruments. Our approach considers the process of listening as the foundation—and the first step—in the design of action–sound relationships. In this design process, the relationship between action and sound is derived from actions that can be perceived in the sound. Building on previous work on listening modes and gestural descriptions, we propose to distinguish between three mapping strategies: instantaneous, temporal, and metaphorical. Our approach makes use of machine-learning techniques for building prototypes, from digital musical instruments to interactive installations. Four different examples of scenarios and prototypes are described and discussed.
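The first two strategies can be caricatured in a few lines. The strategy names come from the article; the implementations below are invented minimal examples, not the authors' prototypes:

```python
import numpy as np

def instantaneous_mapping(frame, gain=0.25):
    """Instantaneous: each motion frame maps directly, frame by frame,
    to sound parameters (here, motion energy drives amplitude)."""
    energy = float(np.sum(np.square(frame)))
    return {"amplitude": min(1.0, gain * energy)}

def temporal_mapping(progress, sound_curve):
    """Temporal: the gesture's progress (0..1) indexes a stored
    sound-parameter curve, so the timing of the gesture drives the
    timing of the sound."""
    i = int(np.clip(progress, 0.0, 1.0) * (len(sound_curve) - 1))
    return sound_curve[i]
```

The metaphorical strategy, by contrast, relies on a perceived analogy between action and sound and does not reduce to a one-line function.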
KSII Transactions on Internet and Information Systems | 2014
Bruno Zamborlin; Frédéric Bevilacqua; Marco Gillies; Mark d'Inverno
This article presents the Gesture Interaction DEsigner (GIDE), an innovative application for gesture recognition. Instead of recognizing gestures only after they have been entirely completed, as in classic gesture recognition systems, GIDE exploits the full potential of gestural interaction by tracking gestures continuously and synchronously, allowing users both to control the target application from moment to moment and to receive immediate, synchronous feedback about the system's recognition states. By this means, they quickly learn how to interact with the system and perform better. Furthermore, rather than learning the predefined gestures of others, GIDE allows users to design their own gestures, making interaction more natural and allowing applications to be tailored to users' specific needs. We describe our system, which demonstrates these new qualities—which combine to provide fluid gesture interaction design—through evaluations with a range of performers and artists.
Designing Interactive Systems | 2014
Jules Françoise; Sarah Fdili Alaoui; Thecla Schiphorst; Frédéric Bevilacqua
We investigate the use of interactive sound feedback for dance pedagogy based on the practice of vocalizing while moving. Our goal is to allow dancers to access a greater range of expressive movement qualities through vocalization. We propose a methodology for the sonification of Effort Factors, as defined in Laban Movement Analysis, based on vocalizations performed by movement experts. Based on the experiential outcomes of an exploratory workshop, we propose a set of design guidelines that can be applied to interactive sonification systems for learning to perform Laban Effort Factors in a dance pedagogy context.
Frontiers in Neuroscience | 2016
Frédéric Bevilacqua; Eric O. Boyer; Jules Françoise; Olivier Houix; Patrick Susini; Agnès Roby-Brami; Sylvain Hanneton
This article reports on an interdisciplinary research project on movement sonification for sensori-motor learning. First, we describe the different research fields that have contributed to movement sonification, from music technology (including gesture-controlled sound synthesis and sonic interaction design) to research on sensori-motor learning with auditory feedback. In particular, we propose to distinguish between sound-oriented tasks and movement-oriented tasks in experiments involving interactive sound feedback. We describe several research questions and recently published results on movement control, learning, and perception. Specifically, we studied the effect of auditory feedback on movement in several cases: from experiments on pointing and visuo-motor tracking to more complex tasks where interactive sound feedback can guide movements, as well as cases of sensory substitution where auditory feedback can inform about object shapes. We also developed specific methodologies and technologies for designing sonic feedback and movement sonification. We conclude with a discussion of key future research challenges in sensori-motor learning with movement sonification, and point toward promising applications such as rehabilitation, sports training, and product design.
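A common building block in this line of work is parameter-mapping sonification. As an illustration (all names, ranges, and the linear mapping here are made up, not taken from the project), instantaneous hand speed can be mapped to an oscillator frequency:

```python
import numpy as np

def sonify_speed(positions, dt, f_min=200.0, f_max=800.0, v_max=2.0):
    """Toy movement sonification: map instantaneous speed, computed
    from successive positions, linearly onto a frequency range.
    Speeds above v_max saturate at f_max."""
    vel = np.linalg.norm(np.diff(positions, axis=0), axis=1) / dt
    norm = np.clip(vel / v_max, 0.0, 1.0)
    return f_min + norm * (f_max - f_min)  # one frequency per frame
```

A stationary hand stays at the lowest frequency and fast movement saturates at the highest, giving the mover a continuous auditory image of their own speed; richer designs in this literature replace the linear map with perceptually motivated ones.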
Human Factors in Computing Systems | 2013
Sarah Fdili Alaoui; Christian Jacquemin; Frédéric Bevilacqua
Chiseling Bodies is an interactive augmented dance performance in which a dancer interacts with abstract visuals: large mass-spring systems whose dynamical behaviors echo the dancer's movement qualities.
Axmedis 2006 | 2006
Norbert Schnell; Frédéric Bevilacqua; Fabrice Guédy; Nicolas H. Rasamimanana; Diemo Schwarz
This article gives an overview of the support technology for learning musical instrument performance developed and assembled for the I-MAESTRO project and describes some of its components in further detail. It also outlines the underlying paradigms related to the process of music teaching and learning, as well as to the processing and representation of data captured from musical instrument performers.
IEEE MultiMedia | 2015
Ana Tajadura-Jiménez; Nadia Bianchi-Berthouze; Enrico Furfaro; Frédéric Bevilacqua
The audio feedback resulting from object interaction provides information about the material of the surface and about one's own motor behavior. With current developments in interactive sonification, it is now possible to digitally alter this audio feedback, making interactive sonification a compelling approach to shaping tactile surface interactions. Here, the authors present a prototype for a sonic interactive surface, capable of delivering surface-tapping sounds in real time when triggered by users' taps on a real surface or on an imagined virtual surface. In this system, the delivered audio feedback can be varied so that the tapping sounds correspond to different strengths applied during tapping. The authors also propose a multidimensional measurement approach to evaluating user experiences of multimodal interactive systems. They evaluated their system by examining the effect of the altered tapping sounds on emotional action-related responses, users' interactions with the surface, and perceived surface hardness. Results show the influence of the sonification of tapping at all levels: emotional, behavioral, and perceptual. These results have implications for the design of interactive sonification displays and tangible auditory interfaces that aim to change perceived and subsequent motor behavior as well as perceived material properties.