Sylvie Gibet
McGill University
Publications
Featured research published by Sylvie Gibet.
Acta Acustica United With Acustica | 2010
Alexandre Bouënard; Marcelo M. Wanderley; Sylvie Gibet
In recent years, the control of virtual instruments or sound-synthesis processes by natural gestures has become an important research field, both for building new audio-visual tools and for exploring gesture-sound relationships. Such multimodal and interactive tools typically offer two advantages: on the one hand, they provide realistic virtual instruments whose response can be compared to that of existing musical instruments; on the other hand, they make it possible to vary the characteristics of natural gestures while ensuring a certain coherence between gesture and sound parameters. In this paper, we present and evaluate a new framework for explicitly expressing the characteristics of natural percussion gestures, used for modeling, controlling, and finally synthesizing new percussion gestures. A preliminary analysis of pre-recorded gestures leads to the identification and evaluation of significant parameters using a classification approach. This analysis shows that a reduced-dimension representation of captured motion can be used to control a virtual character. Furthermore, the simulated gestures provide dynamical variables that can be used to control sound synthesis through a mapping-interaction process.
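The mapping-interaction idea described in this abstract can be illustrated with a minimal sketch: a dynamical variable produced by the simulated gesture (here, stick impact velocity) is mapped to sound-synthesis parameters. The function name, the stick mass, and the normalisation constants below are hypothetical, not taken from the paper.

```python
def map_gesture_to_sound(impact_velocity, stick_mass=0.03):
    """Hypothetical mapping from a simulated gesture variable to sound parameters.

    impact_velocity: stick tip speed at impact (m/s), from the physics simulation.
    Returns (amplitude, brightness), both in [0, 1].
    """
    # Kinetic energy of the stick at impact drives the excitation amplitude.
    energy = 0.5 * stick_mass * impact_velocity ** 2
    amplitude = min(1.0, energy / 0.05)   # 0.05 J: illustrative saturation energy
    # Harder hits excite higher partials, so brightness grows with amplitude.
    brightness = 0.3 + 0.7 * amplitude
    return amplitude, brightness

amp, bright = map_gesture_to_sound(impact_velocity=2.0)
```

Any real system would of course use a richer gesture representation; the point is only that a few scalar dynamical variables suffice to drive a perceptually meaningful sound-parameter space.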
The Visual Computer | 2012
Alexandre Bouënard; Sylvie Gibet; Marcelo M. Wanderley
The ever-growing use of virtual environments requires more and more engaging elements to enhance user experience. Regarding sounding virtual environments specifically, one promising way to meet such realism and interactivity requirements is the use of virtual characters interacting with sounding objects. In this paper, as a case study, we focus on virtual characters playing virtual music instruments. More specifically, we address the real-time motion control of virtual characters and their interaction with a sounding environment, in order to propose engaging and compelling virtual music performances. Combining physics-based simulation with motion data is a recent approach for finely representing and modulating this motion-sound interaction while keeping the realism and expressivity of the original captured motion. We propose a physically-enabled environment in which a virtual percussionist interacts with a physics-based sound synthesis algorithm. We introduce and extensively evaluate the Hybrid Inverse Motion Control (HIMC), a motion-driven hybrid control scheme dedicated to the synthesis of upper-body percussion movements. We also propose a physics-based sound synthesis model with which the virtual character can interact. Finally, we present an architecture offering an effective way to manage heterogeneous data (motion and sound parameters) and feedback (visual and sound) that influence the resulting virtual percussion performances.
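A common physics-based sound synthesis technique for struck objects, and a plausible stand-in for the kind of model this abstract describes, is modal synthesis: the sound is a sum of damped sinusoids excited by the simulated impact. The mode frequencies, dampings, and gains below are illustrative values, not parameters from the paper.

```python
import math

def modal_synthesis(impact_force, modes, sr=44100, dur=0.5):
    """Render a struck-object sound as a sum of exponentially damped sinusoids.

    impact_force: scalar excitation from the physics simulation.
    modes: list of (frequency_hz, damping, gain) triples.
    Returns a list of audio samples at sample rate sr.
    """
    n = int(sr * dur)
    out = [0.0] * n
    for freq, damp, gain in modes:
        for i in range(n):
            t = i / sr
            out[i] += impact_force * gain * math.exp(-damp * t) * math.sin(2 * math.pi * freq * t)
    return out

# Illustrative drum-like modes: (frequency in Hz, damping, gain).
drum_modes = [(180.0, 8.0, 0.6), (420.0, 12.0, 0.3), (990.0, 20.0, 0.1)]
samples = modal_synthesis(impact_force=1.0, modes=drum_modes)
```

In a coupled setup like the one described above, the physics engine would supply `impact_force` (and possibly contact position, selecting different mode gains) at each collision between the virtual percussionist's stick and the instrument.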
Computer Music Journal | 2011
Alexandre Bouënard; Marcelo M. Wanderley; Sylvie Gibet; Fabrice Marandola
The increasing availability of software for creating real-time simulations of musical instrument sounds allows for the design of new visual and sounding media. Over the past decades, research has especially focused on the control of real and virtual instruments by natural gestures. In this paper, we present and extensively evaluate a framework (Figure 1) for the control of virtual percussion instruments by modeling and simulating virtual percussionists' gestures. By placing the virtual performer at the center of the gesture-sound synthesis system, we aim to provide original tools for analyzing and synthesizing instrumental gesture performances. Our physics-based approach to gesture simulation brings some insight into the effect of biomechanical parameters of the gesture on the instrumental performance. Simulating both gesture and sound by physical models also leads to a coherent, human-centered interaction and provides new ways of exploring the mapping between gesture and sound. The use of motion capture data enables the realistic synthesis of both pre-recorded and novel percussion sequences from the specification of gesture scores. Such scores involve motion editing techniques applied to simple beat attacks. We therefore propose an original gesture language based on instrumental playing techniques. This language is characterized by expressivity, interactivity with the user, and the ability to take into account co-articulation between gesture units. Finally, providing 3D visual rendering synchronized with sound rendering allows us to compare virtual performances with real ones, and to qualitatively evaluate both the pedagogical and compositional capabilities of such a system.
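The co-articulation between gesture units mentioned above can be sketched, in a deliberately simplified form, as a cross-fade between the end of one gesture trajectory and the start of the next. The function below is a hypothetical illustration of that blending step (on one-dimensional trajectories), not the paper's actual motion-editing method.

```python
def blend_gesture_units(unit_a, unit_b, overlap=5):
    """Concatenate two gesture trajectories with a linear cross-fade over
    `overlap` frames, a crude stand-in for co-articulation handling.

    unit_a, unit_b: lists of floats (one joint-angle or position channel per frame).
    """
    blended = list(unit_a[:-overlap])            # keep unit_a up to the overlap
    for i in range(overlap):
        w = (i + 1) / (overlap + 1)              # weight ramps from a toward b
        blended.append((1 - w) * unit_a[len(unit_a) - overlap + i] + w * unit_b[i])
    blended.extend(unit_b[overlap:])             # rest of unit_b unchanged
    return blended

merged = blend_gesture_units([0.0] * 10, [1.0] * 10, overlap=5)
```

Real motion editing would blend full skeletal poses (e.g. joint quaternions) and align the units in time, but the principle, smoothing the junction so successive beat attacks flow into one another, is the same.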
new interfaces for musical expression | 2008
Alexandre Bouënard; Sylvie Gibet; Marcelo M. Wanderley
ENACTIVE | 2008
Alexandre Bouënard; Marcelo M. Wanderley; Sylvie Gibet
international computer music conference | 2009
Alexandre Bouënard; Marcelo M. Wanderley; Sylvie Gibet
GW | 2009
Alexandre Bouënard; Marcelo M. Wanderley; Sylvie Gibet
computer animation and social agents | 2009
Sylvie Gibet; Marcelo M. Wanderley
Archive | 2009
Alexandre Bouënard; Sylvie Gibet; Marcelo M. Wanderley
Archive | 2009
Miguel Sales Dias; Sylvie Gibet; Marcelo M. Wanderley; Bastos Rafael