
Publication


Featured research published by Nicolas H. Rasamimanana.


GW'09 Proceedings of the 8th international conference on Gesture in Embodied Communication and Human-Computer Interaction | 2009

Continuous realtime gesture following and recognition

Frédéric Bevilacqua; Bruno Zamborlin; Anthony Sypniewski; Norbert Schnell; Fabrice Guédy; Nicolas H. Rasamimanana

We present an HMM-based system for real-time gesture analysis. The system continuously outputs parameters describing the gesture's time progression and its likelihood, computed by comparing the performed gesture with stored reference gestures. The method relies on a detailed modelling of multidimensional temporal curves. Compared to standard HMM systems, the learning procedure is simplified by using prior knowledge, allowing the system to learn from a single example per class. Several applications have been developed with this system in the context of music education, music and dance performance, and interactive installations. Typically, the estimation of the time progression allows physical gestures to be synchronized to sound files by time-stretching or compressing audio buffers or videos.
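To make the idea concrete, the following Python sketch shows a single-template, left-to-right HMM forward computation of the kind the abstract describes: each sample of one recorded reference gesture becomes a state, and the forward pass yields a likelihood and a time-progression estimate for a live input stream. The topology, Gaussian emission model and transition weights here are our own assumptions, not the authors' implementation.

```python
import numpy as np

class GestureFollower:
    """Follow a live gesture against a single recorded reference gesture."""

    def __init__(self, template, sigma=0.1, p_stay=0.3, p_next=0.6, p_skip=0.1):
        self.template = np.asarray(template, dtype=float)  # (T, D) reference samples
        self.sigma = sigma                                  # assumed observation noise
        self.trans = (p_stay, p_next, p_skip)               # left-to-right transition weights
        self.alpha = None                                   # forward probabilities over states

    def _emission(self, x):
        # Gaussian likelihood of observation x under each template sample (state)
        d2 = np.sum((self.template - x) ** 2, axis=1)
        return np.exp(-0.5 * d2 / self.sigma ** 2)

    def step(self, x):
        """Process one observation; return (time progression in [0, 1], frame likelihood)."""
        T = len(self.template)
        p_stay, p_next, p_skip = self.trans
        if self.alpha is None:                              # start in the first state
            self.alpha = np.zeros(T)
            self.alpha[0] = 1.0
        pred = p_stay * self.alpha
        pred[1:] += p_next * self.alpha[:-1]
        pred[2:] += p_skip * self.alpha[:-2]
        self.alpha = pred * self._emission(np.asarray(x, dtype=float))
        likelihood = self.alpha.sum()                       # per-frame normalization factor
        if likelihood > 0:
            self.alpha /= likelihood
        progression = float(np.argmax(self.alpha)) / max(T - 1, 1)
        return progression, likelihood

# Usage with hypothetical data:
# follower = GestureFollower(reference_gesture)   # reference_gesture: (T, D) array
# for frame in live_stream:                       # frame: length-D observation
#     progression, lik = follower.step(frame)
```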


GW'05 Proceedings of the 6th international conference on Gesture in Human-Computer Interaction and Simulation | 2005

Gesture analysis of violin bow strokes

Nicolas H. Rasamimanana; Emmanuel Fléty; Frédéric Bevilacqua

We developed an augmented violin, i.e. an acoustic instrument with added gesture-capture capabilities for controlling electronic processes. We report here the gesture analysis we performed on three different bow strokes, Détaché, Martelé and Spiccato, using this augmented violin. Different features based on velocity and acceleration were considered. A linear discriminant analysis was performed to estimate the minimum number of pertinent features necessary to model these bow stroke classes. We found that the maximum and minimum accelerations of a given stroke were sufficient to parameterize the different bow stroke types, as well as differences in playing dynamics. Recognition rates were estimated using a kNN method with various training sets. We finally discuss how bow stroke recognition relates the gesture data to music notation, while a continuous bow stroke parameterization can be related to continuous sound characteristics.
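As an illustration of the kind of feature extraction and kNN classification the abstract describes, here is a minimal Python sketch. The feature choice (per-stroke acceleration maximum and minimum), the placeholder data and the value of k are assumptions for illustration, not the study's actual setup.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def stroke_features(accel):
    """Per-stroke features: maximum and minimum acceleration."""
    a = np.asarray(accel, dtype=float)
    return [a.max(), a.min()]

# Hypothetical placeholder data standing in for real per-stroke acceleration signals.
rng = np.random.default_rng(0)
train_strokes = [rng.normal(scale=s, size=200) for s in (1.0, 3.0, 5.0) for _ in range(10)]
train_labels = ["detache"] * 10 + ["martele"] * 10 + ["spiccato"] * 10

X = np.array([stroke_features(s) for s in train_strokes])
clf = KNeighborsClassifier(n_neighbors=3).fit(X, train_labels)

# Classifying a new stroke:
# label = clf.predict([stroke_features(new_stroke)])[0]
```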


Tangible and Embedded Interaction | 2011

Modular musical objects towards embodied control of digital music

Nicolas H. Rasamimanana; Frédéric Bevilacqua; Norbert Schnell; Fabrice Guédy; Emmanuel Fléty; Côme Maestracci; Bruno Zamborlin; Jean-Louis Frechin; Uros Petrevski

We present an ensemble of tangible objects and software modules designed for musical interaction and performance. The tangible interfaces form an ensemble of connected objects communicating wirelessly. A central concept is to let users determine the final musical function of the objects, favoring customization, assembly and repurposing. This might imply combining the wireless interfaces with existing everyday objects or musical instruments. Moreover, gesture analysis and recognition modules allow users to define their own actions and motions for the control of sound parameters. Various sound engines and interaction scenarios were built and experimented with. Some examples developed in a music pedagogy context are described.


Archive | 2011

Online Gesture Analysis and Control of Audio Processing

Frédéric Bevilacqua; Norbert Schnell; Nicolas H. Rasamimanana; Bruno Zamborlin; Fabrice Guédy

This chapter presents a general framework for gesture-controlled audio processing. The gesture parameters are assumed to be multi-dimensional temporal profiles obtained from movement or sound capture systems. The analysis is based on machine learning techniques that compare the incoming data stream with stored templates. The mapping procedures between gesture and audio processing include a specific method we call temporal mapping, in which the temporal evolution of the gesture input is taken into account in the mapping process. We describe an example use of the framework that we experimented with in various contexts, including music and dance performances, music pedagogy and installations.
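A minimal sketch of the temporal-mapping idea, under the assumption that a gesture follower supplies a time-progression estimate in [0, 1]; the function name and the placeholder audio buffer are hypothetical, not taken from the chapter.

```python
import numpy as np

def temporal_map(progression, audio):
    """Map a gesture time-progression estimate in [0, 1] to a frame index in an audio buffer."""
    return int(np.clip(progression, 0.0, 1.0) * (len(audio) - 1))

# Example: a gesture followed at 60% of its reference duration points 60% of the
# way into the associated (placeholder) one-second sound buffer at 44.1 kHz.
audio = np.zeros(44100)
print(temporal_map(0.6, audio))   # 26459
```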


Journal of New Music Research | 2008

Effort-based Analysis of Bowing Movements: Evidence of Anticipation Effects

Nicolas H. Rasamimanana; Frédéric Bevilacqua

Anticipatory behaviours are known to occur in music performance, notably in the control movements of instruments such as the piano or drums. We studied such effects in bowed-string movements, a case where the control of sound is continuous. Movements were measured with an optical motion capture system combined with sensors on the bow. Bowing movements were analysed and compared on the basis of underlying effort costs, determined from their velocity profiles. Specifically, we used movement models that assume that jerk or impulse is minimized. These models were synthesized from measurement data and then compared to velocity and acceleration profiles. Results on various musical cases involving separate strokes, scales, mixed bowing techniques and rhythms showed that this methodology can account, to some extent, for the different effort strategies used by the players. The presented modelling provides evidence of anticipatory behaviour during bowing movements.
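For reference, a common form of the minimum-jerk model predicts, for a stroke of amplitude D and duration T, the velocity profile v(t) = (30 D / T) * tau^2 * (1 - tau)^2 with tau = t / T. The short sketch below computes this profile for comparison with measured bow velocity; it illustrates the model class, not the authors' exact formulation.

```python
import numpy as np

def min_jerk_velocity(D, T, n=200):
    """Minimum-jerk velocity profile for a stroke of amplitude D and duration T."""
    tau = np.linspace(0.0, 1.0, n)
    return 30.0 * D / T * tau ** 2 * (1.0 - tau) ** 2

# Comparison with a measured bow-velocity profile (hypothetical array measured_v):
# model = min_jerk_velocity(D=0.5, T=0.4, n=len(measured_v))
# rms_error = np.sqrt(np.mean((measured_v - model) ** 2))
```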


Human Factors in Computing Systems | 2012

The urban musical game: using sport balls as musical interfaces

Nicolas H. Rasamimanana; Frédéric Bevilacqua; Julien Bloit; Norbert Schnell; Emmanuel Fléty; Andrea Cera; Uros Petrevski; Jean-Louis Frechin

We present Urban Musical Game, an installation using augmented sports balls to manipulate and transform an interactive music environment. The interaction is based on playing techniques, a concept borrowed from traditional musical instruments and applied here to non-musical objects.


Gesture-Based Human-Computer Interaction and Simulation | 2009

String Bowing Gestures at Varying Bow Stroke Frequencies: A Case Study

Nicolas H. Rasamimanana; Delphine Bernardin; Marcelo M. Wanderley; Frédéric Bevilacqua

The understanding of different bowing strategies can provide key concepts for the modelling of music performance. We report here an exploratory study of bowing gestures for a viola player and a violin player in the case of bow strokes performed at different frequencies. Bow and arm movements, as well as bow pressure on the strings, were measured respectively with a 3D optical motion capture system and a custom pressure sensor. As bow stroke frequency, defined as the inverse of the time between two strokes, increased, players used different bowing movements, as indicated by the measurements of bow velocity and arm joint angles. First, bow velocity profiles shift abruptly from a rectangular shape to a sinusoidal shape. Second, while bow velocity is sinusoidal, an additional change is observed: the wrist-elbow relative phase shifts from out-of-phase to in-phase at the highest frequencies, indicating a possible change in the players' coordinative pattern. We finally discuss the fact that only small differences are found in the sound while significant changes occur in the velocity and acceleration profiles.
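A small Python sketch of one common way to compute a continuous wrist-elbow relative phase of the kind mentioned above, using the Hilbert transform; this is an assumption about the analysis method, not the authors' code.

```python
import numpy as np
from scipy.signal import hilbert

def relative_phase(wrist_angle, elbow_angle):
    """Continuous wrist-elbow relative phase in radians, wrapped to (-pi, pi].
    Values near 0 indicate in-phase coordination; values near +/-pi, out-of-phase."""
    phi_w = np.unwrap(np.angle(hilbert(wrist_angle - np.mean(wrist_angle))))
    phi_e = np.unwrap(np.angle(hilbert(elbow_angle - np.mean(elbow_angle))))
    return np.angle(np.exp(1j * (phi_w - phi_e)))
```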


Godoy, Rolf Inge & Leman, Marc (Eds.) (2009). Musical Gestures: Sound, Movement, and Meaning. London: Routledge, pp. 36-68 | 2009

Gestures in performance

Sofia Dahl; Frédéric Bevilacqua; Roberto Bresin; Martin Clayton; Isabella Poggi; Nicolas H. Rasamimanana

We experience and understand the world, including music, through body movement: when we hear something, we are able to make sense of it by relating it to our body movements, or by forming an image of body movements in our minds. Musical Gestures is a collection of essays that explore the relationship between sound and movement. It takes an interdisciplinary approach to the fundamental issues of this subject, drawing on ideas, theories and methods from disciplines such as musicology, music perception, human movement science, cognitive psychology and computer science.


Organised Sound | 2009

Perspectives on gesture–sound relationships informed from acoustic instrument studies

Nicolas H. Rasamimanana; Florian Kaiser; Frédéric Bevilacqua

We present an experimental study on articulation in bowed strings that provides important elements for a discussion about sound synthesis control. The study focuses on bow acceleration profiles and transient noises, measured for different players for the bowing techniques détaché and martelé. We found that the maxima of these profiles are not synchronous and that the temporal shifts depend on the bowing technique. These results allow us to bring out important mechanisms in sound and gesture articulation. In particular, they reveal a potential shortcoming of mapping strategies based on simple frame-by-frame data-stream procedures. We propose instead to treat input control data as time functions and to take gesture co-articulation processes into account.
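As a minimal illustration of the kind of temporal-shift measurement described above (the signal names are hypothetical, not the study's code):

```python
import numpy as np

def max_time_shift(bow_accel, noise_envelope, sample_rate):
    """Time shift in seconds between the transient-noise-envelope maximum and the
    bow-acceleration maximum; positive values mean the noise peak follows the
    acceleration peak."""
    return (np.argmax(noise_envelope) - np.argmax(bow_accel)) / sample_rate
```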


AXMEDIS 2006 | 2006

Technology and Paradigms to Support the Learning of Music Performance

Norbert Schnell; Frédéric Bevilacqua; Fabrice Guédy; Nicolas H. Rasamimanana; Diemo Schwarz

This article gives an overview of the support technology for learning musical instrument performance developed and assembled for the I-MAESTRO project, and describes some of the developed components in further detail. The underlying paradigms, related both to the process of music teaching and learning and to the processing and representation of data captured from musical instrument performers, are outlined.
