
Publication


Featured research published by Olivier Warusfel.


Frontiers in Computational Neuroscience | 2013

From ear to hand: the role of the auditory-motor loop in pointing to an auditory source.

Eric O. Boyer; Bénédicte Maria Babayan; Frédéric Bevilacqua; Markus Noisternig; Olivier Warusfel; Agnès Roby-Brami; Sylvain Hanneton; Isabelle Viaud-Delmon

Studies of the nature of the neural mechanisms involved in goal-directed movements tend to concentrate on the role of vision. We present here an attempt to address the mechanisms whereby an auditory input is transformed into a motor command. The spatial and temporal organization of hand movements was studied in normal human subjects as they pointed toward unseen auditory targets located in a horizontal plane in front of them. Positions and movements of the hand were measured by a six-camera infrared tracking system. In one condition, we assessed the role of auditory information about target position in correcting the trajectory of the hand; to accomplish this, the duration of the target presentation was varied. In another condition, subjects received continuous auditory feedback of their hand movement while pointing to the auditory targets. Online auditory control of the direction of pointing movements was assessed by evaluating how subjects reacted to shifts in the heard hand position. Localization errors were exacerbated by short target presentations but not modified by auditory feedback of hand position. Long target presentations gave rise to a higher level of accuracy and were accompanied by early, automatic head-orienting movements consistently related to target direction. These results highlight the efficiency of auditory feedback processing in online motor control and suggest that the auditory system takes advantage of dynamic changes in the acoustic cues, due to changes in head orientation, for online motor control. How to design an informative acoustic feedback needs to be carefully studied to demonstrate that auditory feedback of the hand could assist the monitoring of movements directed at objects in auditory space.


Neuroscience Letters | 2006

Ventriloquism aftereffects occur in the rear hemisphere

Ludivine Sarlat; Olivier Warusfel; Isabelle Viaud-Delmon

After exposure to a consistent spatial disparity of auditory and visual stimuli, subjective localization of sound sources is usually shifted in the direction of the visual stimuli. This study investigates whether such aftereffects can be observed in humans after exposure to a conflicting bimodal stimulation in virtual reality and whether these aftereffects are confined to the trained locations. Fourteen subjects participated in an adaptation experiment, in which auditory stimuli were convolved with non-individual head-related transfer functions, delivered via headphones. First, we assessed the auditory localization of subjects in darkness. They indicated the perceived direction of a sound using an angular pointer. We then immersed the subjects in a virtual environment by means of a head-mounted display. They were asked to reproduce sequences of movements of virtual objects with a mouse click on the objects. However, we introduced a spatial disparity of 15 degrees between the visual event and the concurrent auditory stimulation. After 20 min of exposure, we tested the subjects again in total darkness to determine whether their auditory localization system had been modified by the conflicting visual signals. We observed a shift of subjective localization towards the left in both dorsal and frontal hemifields of the subject, mainly for auditory stimuli located in the right hemispace. This result suggests that interaural difference cues and monaural spectral cues were not equally adapted, and that visual stimuli mainly influence the processing of binaural directional cues of sound localization.
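Headphone presentation with non-individual HRTFs, as used in this study, amounts to convolving the source signal with a left-ear and a right-ear impulse response. The sketch below is only illustrative: it substitutes a deliberately crude synthetic HRIR pair (a Woodworth-style interaural delay plus a fixed far-ear attenuation) for measured HRTFs, and every parameter value is an assumption, not the setup used in the experiment.

```python
import math

def toy_hrir_pair(azimuth_deg, fs=44100, taps=64):
    """Crude HRIR pair: interaural time difference as a pure sample delay
    (Woodworth approximation) and interaural level difference as a fixed
    attenuation of the far ear. A placeholder for measured HRTFs."""
    az = math.radians(azimuth_deg)
    # head radius 0.0875 m, speed of sound 343 m/s
    itd = 0.0875 / 343.0 * (abs(az) + math.sin(abs(az)))
    delay = min(taps - 1, round(itd * fs))
    near = [0.0] * taps; near[0] = 1.0        # near ear: no delay, full level
    far = [0.0] * taps; far[delay] = 0.6      # far ear: delayed and attenuated
    # positive azimuth = source to the right, so the right ear is the near ear
    return (far, near) if azimuth_deg >= 0 else (near, far)

def convolve(x, h):
    """Direct-form FIR convolution."""
    y = [0.0] * (len(x) + len(h) - 1)
    for i, xi in enumerate(x):
        for j, hj in enumerate(h):
            y[i + j] += xi * hj
    return y

def render_binaural(mono, azimuth_deg):
    hl, hr = toy_hrir_pair(azimuth_deg)
    return convolve(mono, hl), convolve(mono, hr)

click = [1.0] + [0.0] * 99
left, right = render_binaural(click, 60.0)    # source 60 degrees to the right
energy = lambda ch: sum(s * s for s in ch)
print(energy(right) > energy(left))           # True: the right (near) ear is louder
```

A real renderer would select (and interpolate) measured HRIR pairs per direction; the fixed 0.6 far-ear gain here simply stands in for the frequency-dependent head shadow.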


Virtual Reality | 2010

Auditory and visual 3D virtual reality therapy for chronic subjective tinnitus: theoretical framework

Alain Londero; Isabelle Viaud-Delmon; Alexis Baskind; Olivier Delerue; Stéphanie Bertet; Pierre Bonfils; Olivier Warusfel

It is estimated that ~10% of the adult population in developed countries is affected by subjective tinnitus. The physiopathology of subjective tinnitus remains incompletely explained. Nevertheless, subjective tinnitus is thought to result from hyperactivity and neuroplastic reorganization of cortical and subcortical networks following acoustic deafferentation induced by cochlear or auditory nerve damage. Involvement of both auditory and non-auditory central nervous pathways explains the conscious perception of tinnitus and also the potentially incapacitating discomfort experienced by some patients (sound hypersensitivity, sleep disorders, attention deficit, anxiety or depression). These clinical patterns are similar to those observed in chronic pain following amputation, where conditioning techniques using virtual reality have been shown to be both theoretically interesting and effectively useful. This analogy led us to develop an innovative setup with dedicated auditory and visual 3D virtual reality environments in which unilateral subjective tinnitus sufferers are given the possibility to voluntarily manipulate an auditory and visual image of their tinnitus (tinnitus avatar). By doing so, the patients will be able to transfer their subjective auditory perception to the tinnitus avatar and to gain agency over this multimodal virtual percept they hear, see and spatially control. Repeated sessions of such virtual reality immersions are then expected to contribute to tinnitus treatment by promoting cerebral plasticity. This paper describes the theoretical framework and the setup adjustments required by this first attempt to adapt virtual reality techniques to subjective tinnitus treatment. Therapeutic usefulness will be validated in a further controlled clinical trial.


Frontiers in Neuroscience | 2014

From ear to body: the auditory-motor loop in spatial cognition

Isabelle Viaud-Delmon; Olivier Warusfel

Spatial memory is mainly studied through the visual sensory modality: navigation tasks in humans rarely integrate dynamic and spatial auditory information. In order to study how a spatial scene can be memorized on the basis of auditory and idiothetic cues only, we constructed an auditory equivalent of the Morris water maze, a task widely used to assess spatial learning and memory in rodents. Participants were equipped with wireless headphones, which delivered a soundscape updated in real time according to their movements in 3D space. A wireless tracking system (video infrared with passive markers) was used to send the coordinates of the subject's head to the sound rendering system. The rendering system used advanced HRTF-based synthesis of directional cues and room acoustic simulation for the auralization of a realistic acoustic environment. Participants were guided blindfolded in an experimental room. Their task was to explore a delimited area in order to find a hidden auditory target, i.e., a sound that was only triggered when walking on a precise location of the area. The position of this target could be coded in relation to auditory landmarks constantly rendered during the exploration of the area. The task was composed of a practice trial, 6 acquisition trials during which participants had to memorize the location of the target, and 4 test trials in which some aspects of the auditory scene were modified. The task ended with a probe trial in which the auditory target was removed. The configuration of the search paths allowed us to observe how auditory information was coded to memorize the position of the target, and suggested that space can be efficiently coded without visual information in normally sighted subjects. In conclusion, space representation can be based on sensorimotor and auditory cues only, providing another argument in favor of the hypothesis that the brain has access to a modality-invariant representation of external space.
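The tracking-to-rendering chain described above needs, at each update, the source direction relative to the listener's head. The following sketch converts a tracked head position and yaw into the head-relative azimuth that an HRTF-based renderer would consume; the coordinate conventions (yaw counter-clockwise from the x axis, angles wrapped to [-180, 180)) are illustrative assumptions, not those of the actual system.

```python
import math

def head_relative_azimuth(head_xy, head_yaw_deg, source_xy):
    """Head-relative azimuth of a world-fixed source, given a tracked head
    position and yaw. Convention (assumed): yaw counter-clockwise from +x."""
    dx = source_xy[0] - head_xy[0]
    dy = source_xy[1] - head_xy[1]
    world_az = math.degrees(math.atan2(dy, dx))
    rel = world_az - head_yaw_deg
    return (rel + 180.0) % 360.0 - 180.0      # wrap to [-180, 180)

# Listener at origin facing +x; landmark 2 m to the left (+y)
print(head_relative_azimuth((0.0, 0.0), 0.0, (0.0, 2.0)))   # 90.0
# After the listener turns 90 degrees toward it, the source is straight ahead
print(head_relative_azimuth((0.0, 0.0), 90.0, (0.0, 2.0)))  # 0.0
```

Feeding this azimuth (plus elevation and distance, omitted here) into the HRTF stage is what makes the soundscape stable in world coordinates while the listener walks and turns.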


Cyberpsychology, Behavior, and Social Networking | 2013

Auditory-Visual Virtual Reality as a Diagnostic and Therapeutic Tool for Cynophobia

Clara Suied; George Drettakis; Olivier Warusfel; Isabelle Viaud-Delmon

Traditionally, virtual reality (VR) exposure-based treatment concentrates primarily on the presentation of a high-fidelity visual experience. However, adequately combining the visual and the auditory experience provides a powerful tool to enhance sensory processing and modulate attention. We present the design and usability testing of an auditory-visual interactive environment for investigating VR exposure-based treatment for cynophobia. The specificity of our application involves 3D sound, allowing the presentation and spatial manipulations of a fearful stimulus in the auditory modality and in the visual modality. We conducted an evaluation test with 10 participants who fear dogs to assess the capacity of our auditory-visual virtual environment (VE) to generate fear reactions. The specific perceptual characteristics of the dog model that were implemented in the VE were highly arousing, suggesting that VR is a promising tool to treat cynophobia.


Acta Acustica United With Acustica | 2013

Investigation on localisation accuracy for first and higher order ambisonics reproduced sound sources

Stéphanie Bertet; Jérôme Daniel; Etienne Parizet; Olivier Warusfel

Ambisonics and higher order ambisonics (HOA) technologies aim at reproducing a sound field, either synthesised or previously recorded with dedicated microphones. Based on a spherical harmonic decomposition, the sound field is described more precisely when higher-order components are used. The present study evaluated the perceptual and objective localisation accuracy of the sound field encoded with four microphones of order one to four and decoded over a ring of loudspeakers. A perceptual test showed an improvement of localisation with higher order ambisonic microphones. Reproduced localisation indices were estimated for the four microphones and for the respective synthetic systems of order one to four. The perceptual and objective analyses led to the same conclusions. Localisation accuracy depends on the ambisonic order as well as on the source incidence. Furthermore, impairments linked to the microphones were highlighted.


Journal of the Acoustical Society of America | 2008

Investigation on the restitution system influence over perceived Higher Order Ambisonics sound field: a subjective evaluation involving from first to fourth order systems

Stéphanie Bertet; Jérôme Daniel; Etienne Parizet; Olivier Warusfel

Among the spatial audio reproduction techniques over loudspeakers, the Higher Order Ambisonics (HOA) approach is based on a spherical harmonic decomposition of the sound field. Truncating the decomposition at the Mth order leaves a finite number of components that form the spatial HOA format. The more components are used to encode the sound field, the finer the spatial resolution. Similarly, the size of the area where the sound field is accurately recreated is proportional to the order. For an Mth-order encoding, N=2M+2 equally distributed loudspeakers are recommended for a homogeneous reproduction in the horizontal plane. Adding loudspeakers does not change the spatial resolution. However, what is the influence of the restitution system on the perceived sound field? An experiment was designed to compare four systems (from first to fourth order) and a reference one, using similarity ratings obtained from pairwise comparisons. Two sound scenes were used, simulating an audio conference and a s...
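As a minimal illustration of the horizontal (circular-harmonic) case described in these two abstracts, the sketch below encodes a plane wave into the 2M+1 components of an Mth-order stream and decodes it to a regular ring of N=2M+2 loudspeakers with a basic sampling decoder. This is a generic textbook construction under assumed conventions (unnormalized components, mode-matching by sampling), not the specific decoders evaluated in the experiment.

```python
import math

def encode(azimuth, order):
    """Circular-harmonic (horizontal HOA) encoding of a plane wave:
    2*order + 1 components [W, cos t, sin t, cos 2t, sin 2t, ...]."""
    comps = [1.0]
    for m in range(1, order + 1):
        comps += [math.cos(m * azimuth), math.sin(m * azimuth)]
    return comps

def decode(comps, n_speakers):
    """Basic 'sampling' decoder for a regular ring of loudspeakers."""
    order = (len(comps) - 1) // 2
    gains = []
    for i in range(n_speakers):
        phi = 2.0 * math.pi * i / n_speakers
        g = comps[0]
        for m in range(1, order + 1):
            g += 2.0 * (comps[2 * m - 1] * math.cos(m * phi)
                        + comps[2 * m] * math.sin(m * phi))
        gains.append(g / n_speakers)
    return gains

order = 3
theta = math.radians(40.0)
gains = decode(encode(theta, order), 2 * order + 2)   # N = 2M + 2 loudspeakers

# The velocity (Makita) vector of the loudspeaker gains points back at the source
vx = sum(g * math.cos(2 * math.pi * i / len(gains)) for i, g in enumerate(gains))
vy = sum(g * math.sin(2 * math.pi * i / len(gains)) for i, g in enumerate(gains))
print(round(math.degrees(math.atan2(vy, vx)), 1))     # 40.0
```

With a regular array and N >= 2M+2, the decoder recovers the encoded direction exactly at the sweet spot; increasing the order narrows the spatial spread of the gains rather than changing this direction, matching the abstract's point that extra loudspeakers alone do not add resolution.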


Journal of the Acoustical Society of America | 1993

Relationships between objective measurements and perceptual interpretation: The need for considering spatial emission of sound sources

Olivier Warusfel; E. Kahle; J. P. Jullien

The room acoustics laboratory at IRCAM has undertaken a series of objective and perceptual measurements in different concert halls and opera houses throughout Europe. The measurements consisted of recording impulse responses at different seats, using a directive loudspeaker (at different locations on stage) oriented in various directions. The listening tests were undertaken during concerts with subjects seated at the measured locations. The subjects were asked to describe the room acoustic quality with the help of a structured questionnaire, based on the results of former psychoacoustic tests. The links between perception and the criteria based on the time distribution of sound energy, established in laboratory experiments, were confirmed and/or refined. Furthermore, the results of this campaign emphasize the dependence of the perceptual factors on the spatial distribution of energy. Subjective description is strongly affected by orchestra size as well as by the location and directivity of the instruments. Thus a ...
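One widely used criterion based on the time distribution of sound energy is the clarity index C80, the early-to-late (80 ms) energy ratio of a room impulse response. The sketch below computes it for a synthetic exponential decay; the sampling rate and decay constants are arbitrary illustration values, unrelated to the halls measured in this campaign.

```python
import math

def clarity_c80(ir, fs):
    """Clarity index C80: ratio (in dB) of the energy in the first 80 ms
    of an impulse response to the energy arriving afterwards."""
    split = int(0.080 * fs)
    early = sum(s * s for s in ir[:split])
    late = sum(s * s for s in ir[split:])
    return 10.0 * math.log10(early / late)

# Synthetic exponentially decaying impulse response, 1 s long
fs = 1000
ir = [math.exp(-5.0 * t / fs) for t in range(fs)]
print(round(clarity_c80(ir, fs), 2))   # positive: early energy dominates
```

A faster decay (drier room) concentrates more energy before the 80 ms split and therefore yields a higher C80, which is the intuition behind using it as a transparency/clarity predictor.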


Journal of the Acoustical Society of America | 2004

Radiation control applied to sound synthesis: An attempt for ‘‘spatial additive synthesis.’’

Olivier Warusfel; Nicolas Misdariis; Terence Caulkins; Etienne Corteel

Sound synthesis is generally focused on the reproduction of the spectral characteristics of the source or on the simulation of its physical behavior. Less attention is paid to the sound playback step, which generally results in simple diffusion over a conventional loudspeaker setup. Stating the perceptual importance of a faithful reproduction of the source radiation properties, the paper presents a method combining a synthesis engine, based on physical modeling, with a rendering system allowing accurate control of the produced sound field. Two sound-field synthesis models are considered. In the first, a local 3D array of transducers is controlled by signal processing to create elementary directivity patterns that can be further combined in order to shape a more complex radiation. The dual approach consists in surrounding the audience with transducer arrays driven by wave field synthesis in order to simulate the sound field associated with these elementary directivity patterns. In both cases, the di...


Journal of the Acoustical Society of America | 1999

Influence of sensory interactions between vision and audition on the perceptual characterization of room acoustics

Chrysanthy Nathanail; Catherine Lavandier; Jean-Dominique Polack; Olivier Warusfel

Broad evidence on sensory interactions between vision and audition suggests that the visual information available to listener-spectators in concert halls is likely to interfere with the evaluation of acoustical quality. The influence of the visual distance to the stage on the apparent auditory distance was studied in four magnitude-estimation paradigms. Sound stimuli providing successive impressions of auditory distance were generated by a virtual room acoustics processor ("Spatialisateur") in binaural format and presented separately or coupled with 3-D concert hall pictures. Subjects were asked to judge the auditory distance. Pilot experiments I and II obtain a first qualitative observation of the effects and study a progressive change observed in the responses. Main experiments III and IV adopt two different procedures of stimulus presentation and improved sound reproduction, and attempt a first objective evaluation of the results. The analysis of variance performed on the responses reveals a small of...
