Eric O. Boyer
IRCAM
Publications
Featured research published by Eric O. Boyer.
Frontiers in Computational Neuroscience | 2013
Eric O. Boyer; Bénédicte Maria Babayan; Frédéric Bevilacqua; Markus Noisternig; Olivier Warusfel; Agnès Roby-Brami; Sylvain Hanneton; Isabelle Viaud-Delmon
Studies of the neural mechanisms involved in goal-directed movements tend to concentrate on the role of vision. We present here an attempt to address the mechanisms whereby an auditory input is transformed into a motor command. The spatial and temporal organization of hand movements was studied in normal human subjects as they pointed toward unseen auditory targets located in a horizontal plane in front of them. Positions and movements of the hand were measured by a six-camera infrared tracking system. In one condition, we assessed the role of auditory information about target position in correcting the trajectory of the hand by varying the duration of the target presentation. In another condition, subjects received continuous auditory feedback of their hand movement while pointing to the auditory targets. Online auditory control of the direction of pointing movements was assessed by evaluating how subjects reacted to shifts in heard hand position. Localization errors were exacerbated by short target presentations but not modified by auditory feedback of hand position. Long target presentations gave rise to a higher level of accuracy and were accompanied by early automatic head-orienting movements consistently related to target direction. These results highlight the efficiency of auditory feedback processing in online motor control and suggest that the auditory system takes advantage of dynamic changes in acoustic cues caused by changes in head orientation. How to design informative acoustic feedback needs to be studied carefully in order to demonstrate that auditory feedback of hand position could assist the monitoring of movements directed at objects in auditory space.
Frontiers in Neuroscience | 2016
Frédéric Bevilacqua; Eric O. Boyer; Jules Françoise; Olivier Houix; Patrick Susini; Agnès Roby-Brami; Sylvain Hanneton
This article reports on an interdisciplinary research project on movement sonification for sensori-motor learning. First, we describe the different research fields that have contributed to movement sonification, from music technology, including gesture-controlled sound synthesis and sonic interaction design, to research on sensori-motor learning with auditory feedback. In particular, we propose to distinguish between sound-oriented tasks and movement-oriented tasks in experiments involving interactive sound feedback. We describe several research questions and recently published results on movement control, learning, and perception. Specifically, we studied the effect of auditory feedback on movements in several cases: from experiments on pointing and visuo-motor tracking to more complex tasks where interactive sound feedback can guide movements, and cases of sensory substitution where auditory feedback can convey information about object shapes. We also developed specific methodologies and technologies for designing sonic feedback and movement sonification. We conclude with a discussion of key future research challenges in sensori-motor learning with movement sonification, and point toward promising applications such as rehabilitation, sports training, and product design.
Computer Music Modeling and Retrieval | 2013
Eric O. Boyer; Quentin Pyanet; Sylvain Hanneton; Frédéric Bevilacqua
This study introduces an experiment designed to analyze sensorimotor adaptation to a motion-based sound synthesis system. We investigated a sound-oriented learning task, namely reproducing a targeted sound. The motion of a small handheld object was used to control a sound synthesizer. The object's angular velocity was measured by a gyroscope and transmitted wirelessly in real time to the sound system. The targeted sound was reached when the motion matched a given reference angular velocity profile with a given accuracy. An incorrect velocity profile produced either a noisier sound or a sound with a louder high harmonic, depending on the sign of the velocity error. The results showed that the participants were generally able to learn to reproduce sounds very close to the targeted sound. A corresponding motor adaptation was also found to occur, to various degrees, in most of the participants when the profile was altered.
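The error-to-sound mapping described in this abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the tolerance value, the parameter names, and the assignment of the error's sign to noise versus harmonic gain are all assumptions.

```python
def feedback_params(measured, reference, tolerance=0.1):
    """Map an angular-velocity error onto sound-degradation parameters.

    Hypothetical reconstruction of the mapping described in the abstract:
    an error within `tolerance` of the reference yields the clean target
    sound; motion slower than the reference adds noise, while motion
    faster than the reference boosts a high harmonic. Which sign maps to
    which degradation is an assumption, not stated in the paper.
    """
    error = measured - reference
    if abs(error) <= tolerance:
        # motion matches the reference profile: clean target sound
        return {"noise_level": 0.0, "harmonic_gain": 0.0}
    if error < 0:
        # too slow: noisier sound, scaled by error magnitude
        return {"noise_level": min(1.0, abs(error)), "harmonic_gain": 0.0}
    # too fast: louder high harmonic, scaled by error magnitude
    return {"noise_level": 0.0, "harmonic_gain": min(1.0, error)}
```

The two parameters would then drive a synthesizer in real time, one frame per gyroscope sample.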
Human Factors in Computing Systems | 2013
Frédéric Bevilacqua; Norbert Schnell; Nicolas H. Rasamimanana; Julien Bloit; Emmanuel Fléty; Baptiste Caramiaux; Jules Françoise; Eric O. Boyer
The Modular Musical Objects (MO) are an ensemble of tangible interfaces and software modules for creating novel musical instruments or for augmenting objects with sound. In particular, the MOs allow for designing action-sound relationships and behaviors based on the interaction with tangible objects or free body movements. Such interaction scenarios can be inspired by the affordances of particular objects (e.g. a ball, a table), by interaction metaphors based on the playing techniques of musical instruments, or by games. We describe specific examples of action-sound relationships that are made possible by the MO software modules and that take advantage of machine learning techniques.
Experimental Brain Research | 2017
Eric O. Boyer; Frédéric Bevilacqua; Patrick Susini; Sylvain Hanneton
The use of continuous auditory feedback for motor control and learning is still understudied and deserves more attention regarding both fundamental mechanisms and applications. This paper presents the results of three experiments studying the contribution of task-, error-, and user-related sonification to visuo-manual tracking and assessing its benefits for sensorimotor learning. The first results show that sonification can help decrease the tracking error, as well as increase the energy of participants' movements. In the second experiment, when feedback presence was alternated, the user-related sonification did not show the feedback-dependency effects observed with the error- and task-related feedback. In the third experiment, reducing exposure to 50% diminished the positive effect of sonification on performance, whereas the increase in average movement energy with sound remained significant. In a retention test performed the next day without auditory feedback, movement energy was still higher for the groups previously trained with the feedback. Although performance was not affected by sound, a learning effect was measurable in both sessions, and the user-related group also improved its performance in the retention test. These results confirm that continuous auditory feedback can be beneficial for movement training and also show an interesting effect of sonification on movement energy. User-related sonification can prevent feedback dependency and increase retention. Consequently, sonification of the user's own motion appears to be a promising solution to support movement learning with interactive feedback.
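An error-related sonification of the kind this abstract studies can be sketched in a few lines. This is an illustrative mapping only, with assumed names and constants; the paper's actual task-, error-, and user-related mappings are not specified here. The idea shown is simply that the instantaneous cursor-target distance modulates a feedback pitch.

```python
import math

def error_sonification(cursor, target, base_freq=220.0, gain=2.0):
    """Map the instantaneous tracking error to a feedback pitch.

    `cursor` and `target` are 2-D positions; the Euclidean distance
    between them scales a frequency above `base_freq`. All parameter
    values here are hypothetical, chosen only for illustration.
    """
    error = math.hypot(cursor[0] - target[0], cursor[1] - target[1])
    # zero error -> base pitch; larger error -> proportionally higher pitch
    return base_freq * (1.0 + gain * error)
```

A user-related variant would instead sonify a feature of the participant's own movement (e.g. hand speed), independent of the target, which is what the retention results above favor.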
IEEE Virtual Reality Conference | 2015
Eric O. Boyer; Lucyle Vandervoorde; Frédéric Bevilacqua; Sylvain Hanneton
In this study, we investigated the ability of blindfolded adults to discriminate between concave and convex auditory virtual surfaces. We used a Leap Motion™ device to measure the movements of the hand and fingers. Participants were asked to explore the space above the device with the palm of one hand, and auditory feedback was produced only when the palm moved within the boundaries of the surface. In order to demonstrate that curvature direction was correctly perceived by our participants, we estimated their discrimination thresholds with a psychophysical staircase procedure. Two groups of participants received two different sonifications of the surface. Results showed that most of the participants were able to learn the task. The best results were obtained with auditory feedback related to the component of the hand velocity tangential to the virtual surface. This work contributes to the introduction of auditory virtual objects into virtual reality.
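A psychophysical staircase of the kind mentioned above can be sketched as follows. This is a generic 1-up/1-down staircase with threshold taken as the mean stimulus level at the reversals; the actual rule, step size, and stopping criterion used in the study are not given here, so all values are assumptions.

```python
def staircase_threshold(respond, start=1.0, step=0.1, n_reversals=6):
    """Simple 1-up/1-down adaptive staircase (illustrative sketch).

    `respond` is a callable: given the current curvature magnitude, it
    returns True for a correct concave/convex judgement. A correct
    response lowers the stimulus level, an incorrect one raises it;
    the threshold estimate is the mean level at the reversal points.
    """
    level, going_down = start, None
    reversals = []
    while len(reversals) < n_reversals:
        correct = respond(level)
        # a change of direction marks a reversal; record its level
        if going_down is not None and correct != going_down:
            reversals.append(level)
        going_down = correct
        level = max(step, level - step if correct else level + step)
    return sum(reversals) / len(reversals)
```

Driven by a simulated observer who is correct whenever the curvature magnitude exceeds some criterion, the estimate converges near that criterion.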
IEEE Virtual Reality Conference | 2015
Eric O. Boyer; Lucyle Vandervoorde; Frédéric Bevilacqua; Sylvain Hanneton
This prospective study of the perception of audio virtual surfaces (AVSs) was inspired by two different research fields: sensory substitution, and haptic and touch perception. We define an audio virtual surface as a region of space that triggers sounds when the user touches it or moves into it. First, we describe an example of an interactive setup using an AVS to simulate a sonic interaction with a virtual water tank. Then, we present an experiment designed to investigate the ability of blindfolded adults to discriminate between concave and convex AVSs using only the gesture-sound interaction. Two groups received different sound feedback: a static one indicating presence within the AVS, and a static+dynamic one (related to the component of the hand velocity tangential to the surface). In order to demonstrate that curvature direction was correctly perceived, we estimated discrimination thresholds with a psychophysical staircase procedure. Results show that most of the participants were able to learn the task. The best results were obtained with the additional dynamic feedback. Gestural patterns emerged from the interaction, suggesting the use of auditory representations of the virtual object. This work contributes to the introduction of sonic interactions with auditory virtual objects into virtual reality. The setups we present raise new questions at both the experimental level (sensory substitution) and the application level (design of gesture-sound interactions for virtual reality).
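The static and static+dynamic feedback conditions described above can be sketched as two gain values. This is a hypothetical reconstruction: the function and parameter names are assumptions, and only the core idea comes from the abstract, namely that static feedback signals presence in the AVS while the dynamic component scales with the tangential part of the hand velocity.

```python
import math

def avs_feedback(inside, velocity, normal, dynamic=True):
    """Return (static_gain, dynamic_gain) for an audio virtual surface.

    `inside` flags whether the palm is within the AVS boundaries;
    `velocity` is the 3-D hand velocity and `normal` the local unit
    surface normal. The static gain simply signals presence; the
    dynamic gain is the magnitude of the velocity component tangential
    to the surface. Illustrative sketch, not the study's implementation.
    """
    if not inside:
        return 0.0, 0.0          # no sound outside the surface
    static_gain = 1.0
    if not dynamic:
        return static_gain, 0.0  # static-only condition
    # tangential component = velocity minus its projection on the normal
    dot = sum(v * n for v, n in zip(velocity, normal))
    tangential = [v - dot * n for v, n in zip(velocity, normal)]
    return static_gain, math.sqrt(sum(t * t for t in tangential))
```

Under this sketch, a hand sliding along the surface produces a strong dynamic signal, while a hand pushing straight into it produces none, which is what makes the dynamic cue informative about local surface orientation.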
International Journal of Table Tennis Sciences | 2012
Eric O. Boyer; Frédéric Bevilacqua; François Phal; Sylvain Hanneton; UFR STAPS
Archive | 2017
Frédéric Bevilacqua; Norbert Schnell; Jules Françoise; Eric O. Boyer; Diemo Schwarz; Baptiste Caramiaux
10th International Symposium on Computer Music Multidisciplinary Research (CMMR) Sound, Music and Motion | 2013
Eric O. Boyer; Quentin Pyanet; Sylvain Hanneton; Frédéric Bevilacqua