
Publication


Featured research published by Gaëtan Parseihian.


Technology and Disability | 2012

NAVIG: Guidance system for the visually impaired using virtual augmented reality

Brian F. G. Katz; Florian Dramas; Gaëtan Parseihian; Olivier Gutierrez; Slim Kammoun; Adrien Brilhault; Lucie Brunet; Mathieu Gallay; Bernard Oriola; Malika Auvray; Philippe Truillet; Michel Denis; Simon J. Thorpe; Christophe Jouffrais

Finding one's way to an unknown destination, navigating complex routes, and locating inanimate objects are all tasks that can be challenging for the visually impaired. The project NAVIG (Navigation Assisted by artificial VIsion and GNSS) is directed towards increasing the autonomy of visually impaired users in known and unknown environments, exterior and interior, large scale and small scale, through a combination of a Global Navigation Satellite System (GNSS) and rapid visual recognition with which the precise position of the user can be determined. Relying on geographical databases and visually identified objects, the user is guided to his or her desired destination through spatialized semantic audio rendering, always maintained in the head-centered reference frame. This paper presents the overall project design and architecture of the NAVIG system. In addition, details of the new type of detection and localization device are presented in relation to guidance directives developed through participative design with potential users and educators for the visually impaired. A fundamental concept in this project is the belief that this type of assistive device is able to solve one of the major problems faced by the visually impaired: their difficulty in localizing specific objects.
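The guidance audio described above is rendered in a head-centered reference frame, so a world-frame bearing to the target must be re-expressed relative to the user's current head orientation before spatialization. A minimal sketch of that coordinate change, assuming compass bearings in degrees and a hypothetical helper name (the abstract does not give NAVIG's actual conventions):

```python
def head_centered_azimuth(target_bearing, head_yaw):
    """Convert a world-frame bearing to the target (degrees, clockwise
    from north) into a head-centered azimuth in (-180, 180], given the
    user's head yaw in the same convention. 0 means straight ahead,
    negative values are to the user's left.
    """
    az = (target_bearing - head_yaw) % 360.0
    if az > 180.0:
        az -= 360.0                  # fold into (-180, 180]
    return az
```

The spatialized audio renderer would then place the guidance sound at this azimuth, updating it continuously as the head tracker reports new yaw values.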


IEEE Transactions on Multimedia | 2016

Comparison and Evaluation of Sonification Strategies for Guidance Tasks

Gaëtan Parseihian; Charles Gondre; Mitsuko Aramaki; Sølvi Ystad; Richard Kronland-Martinet

This paper aims to reveal the efficiency of sonification strategies in terms of rapidity, precision, and overshooting in the case of a one-dimensional guidance task. The sonification strategies are based on the four main perceptual attributes of a sound (pitch, loudness, duration/tempo, and timbre) and classified with respect to the presence or absence of one or more auditory references. Perceptual evaluations are used to display the strategies in a precision/rapidity space and enable prediction of user behavior for a chosen sonification strategy. The evaluation of sonification strategies constitutes a first step toward general guidelines for sound design in interactive multimedia systems that involve guidance issues.
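Each of the four perceptual attributes above can serve as the display dimension of a parameter-mapping strategy. As an illustrative sketch only (the paper's actual mapping functions are not reproduced here), a pitch-based strategy without an auditory reference might map the normalized distance to the target onto frequency on a logarithmic, pitch-linear scale:

```python
def pitch_sonification(distance, f_min=200.0, f_max=1000.0):
    """Map a normalized distance-to-target (1.0 = far, 0.0 = on target)
    to a frequency in Hz on a logarithmic scale, so equal steps toward
    the target sound like equal pitch intervals. Hypothetical mapping:
    pitch rises as the user approaches, reaching f_max on target.
    """
    d = min(max(distance, 0.0), 1.0)            # clamp to [0, 1]
    return f_min * (f_max / f_min) ** (1.0 - d)
```

A reference-based variant of the same strategy would additionally play a steady anchor tone at `f_max`, giving the user an explicit target pitch to match rather than only a direction of change.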


Frontiers in Psychology | 2018

Does proprioception influence human spatial cognition? A study on individuals with massive deafferentation

Alix G. Renault; Malika Auvray; Gaëtan Parseihian; R. Chris Miall; Jonathan Cole; Fabrice R. Sarlegna

When navigating in a spatial environment or when hearing its description, we can develop a mental model which may be represented in the central nervous system in different coordinate systems such as an egocentric or allocentric reference frame. The way in which sensory experience influences the preferred reference frame has been studied with a particular interest for the role of vision. The present study investigated the influence of proprioception on human spatial cognition. To do so, we compared the abilities to form spatial models of two rare participants chronically deprived of proprioception (GL and IW) and healthy control participants. Participants listened to verbal descriptions of a spatial environment, and their ability to form and use a mental model was assessed with a distance-comparison task and a free-recall task. Given that the loss of proprioception has been suggested to specifically impair the egocentric reference frame, the deafferented individuals were expected to perform worse than controls when the spatial environment was described in an egocentric reference frame. Results revealed that in both tasks, one deafferented individual (GL) made more errors than controls while the other (IW) made fewer errors. On average, both GL and IW were slower to respond than controls, and reaction time was more variable for IW. Additionally, we found that GL but not IW was impaired compared to controls in visuo-spatial imagery, which was assessed with the Minnesota Paper Form Board Test. Overall, the main finding of this study is that proprioception can influence the time necessary to use spatial representations while other factors such as visuo-spatial abilities can influence the capacity to form accurate spatial representations.


Acta Acustica United With Acustica | 2018

Perception of Surrounding Sound Source Trajectories in the Horizontal Plane: A Comparison of VBAP and Basic-Decoded HOA

Lennie Gandemer; Gaëtan Parseihian; Christophe Bourdin; Richard Kronland-Martinet

Despite the fundamental role played by sound in multiple virtual reality contexts, few studies have explored the perception of virtual sound source motion in the acoustic space. The goal of this study was to compare the localization of virtual moving sound sources rendered with two different spatialization techniques: Vector Base Amplitude Panning (VBAP) and fifth-order Ambisonics (HOA), both implemented in a soundproofed room and in their most basic form (basic decoding of HOA, VBAP without spread parameter). The perception of virtual sound trajectories surrounding untrained subjects (n=23) was evaluated using a new method based on a drawing-augmented multiple-choice questionnaire. In the spherical loudspeaker array used in this study, VBAP proved to be a robust spatialization technique for sound trajectory rendering in terms of trajectory recognition and height perception. In basic-decoded HOA, subjects exhibited far more disparate trajectory recognition and height perception performances but performed better in perceiving sound source movement homogeneity.
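The VBAP technique evaluated above computes, for each virtual source direction, amplitude gains for the loudspeakers enclosing that direction. A minimal horizontal-plane (2-D, pairwise) sketch of the standard VBAP gain computation, assuming azimuths in radians and power-normalized gains; the study itself used a full spherical (triplet-wise) array, which this simplified 2-D version does not capture:

```python
import math

def vbap_pair_gains(src, spk1, spk2):
    """2-D pairwise VBAP: solve g1*u(spk1) + g2*u(spk2) = u(src)
    for a source azimuth lying between two loudspeaker azimuths
    (all angles in radians), then normalize the gains to unit power.
    """
    det = math.sin(spk2 - spk1)      # Cramer's-rule determinant
    g1 = math.sin(spk2 - src) / det
    g2 = math.sin(src - spk1) / det
    norm = math.hypot(g1, g2)        # sqrt(g1^2 + g2^2)
    return g1 / norm, g2 / norm
```

For a source midway between speakers at 0 and 90 degrees, both gains come out equal (1/sqrt(2) each); a source exactly at one speaker puts all the energy there, which is why VBAP localization is sharpest on the speaker directions themselves.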


i-Perception | 2011

Rapid Auditory System Adaptation Using a Virtual Auditory Environment

Gaëtan Parseihian; Brian F. G. Katz

Various studies have highlighted plasticity of the auditory system induced by visual stimuli, an approach that limits training to the visual field of perception. The aim of the present study is to investigate auditory system adaptation using an audio-kinesthetic platform. Participants were placed in a Virtual Auditory Environment allowing the association of the physical position of a virtual sound source with an alternate set of acoustic spectral cues or Head-Related Transfer Function (HRTF) through the use of a tracked ball manipulated by the subject. This set-up has the advantage of not being limited to the visual field while also offering a natural perception-action coupling through the constant awareness of one's hand position. The adaptation process to a non-individualized HRTF was realized through a spatial search game application. A total of 25 subjects participated: one group presented with modified cues using a non-individualized HRTF and a control group using individually measured HRTFs to account for any learning effect due to the game itself. The training game lasted 12 minutes and was repeated over 3 consecutive days. Adaptation effects were measured with repeated localization tests. Results showed a significant performance improvement for vertical localization and a significant reduction in the front/back confusion rate after 3 sessions.


Journal of the Acoustical Society of America | 2012

Rapid head-related transfer function adaptation using a virtual auditory environment

Gaëtan Parseihian; Brian F. G. Katz


Journal of the Acoustical Society of America | 2012

Perceptually based head-related transfer function database optimization

Brian F. G. Katz; Gaëtan Parseihian


Journal of the Audio Engineering Society | 2012

Morphocons: A New Sonification Concept Based on Morphological Earcons

Gaëtan Parseihian; Brian F. G. Katz


International Conference on Auditory Display | 2012

Sound effect metaphors for near field distance sonification

Gaëtan Parseihian; Brian F. G. Katz; Simon Conan


Wi: Journal of Mobile Media | 2014

The process of sonification design for guidance tasks

Gaëtan Parseihian; Sølvi Ystad; Mitsuko Aramaki; Richard Kronland-Martinet

Collaboration


Dive into Gaëtan Parseihian's collaborations.

Top Co-Authors

Brian F. G. Katz
Centre national de la recherche scientifique

Mitsuko Aramaki
Centre national de la recherche scientifique

Sølvi Ystad
Centre national de la recherche scientifique

Charles Gondre
Centre national de la recherche scientifique

Philippe Truillet
Centre national de la recherche scientifique

Richard Kronland-Martinet
Centre national de la recherche scientifique