Publication


Featured research published by Christophe Jouffrais.


Behavioural Brain Research | 2000

Hand kinematics during reaching and grasping in the macaque monkey

Alice C. Roy; Yves Paulignan; Alessandro Farnè; Christophe Jouffrais; Driss Boussaoud

In this paper, we develop an animal model of prehension movements by examining the kinematics of reaching and grasping in monkeys and by comparing the results to published data on humans. Hand movements were recorded in three dimensions in monkeys who were trained to either point at visual targets under unperturbed and perturbed conditions, or to reach and grasp 3-D objects. The results revealed the following three similarities in the hand kinematics of monkey and man. (1) Pointing movements showed an asymmetry depending on target location relative to the hand used; in particular, movements to an ipsilateral target took longer than those to a contralateral one. (2) Perturbation of target location decreased the magnitude of the velocity peak and increased the duration of pointing movements. (3) Reaching to grasp movements displayed a bell-shaped wrist velocity profile and the maximum grip aperture was correlated with object size. These similarities indicate that the macaque monkey can be a useful model for understanding human motor control.
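The kinematic hallmarks reported above (a bell-shaped wrist velocity profile, its peak magnitude, and movement duration) can be computed from any sampled 3-D trajectory. The following is a minimal sketch, not the authors' analysis pipeline, using a synthetic minimum-jerk reach as stand-in data:

```python
import numpy as np

def movement_kinematics(positions, dt):
    """Peak tangential velocity, duration, and time-to-peak from sampled 3-D positions."""
    # Finite-difference tangential speed between consecutive samples.
    velocity = np.linalg.norm(np.diff(positions, axis=0), axis=1) / dt
    return velocity.max(), len(velocity) * dt, (np.argmax(velocity) + 1) * dt

# Synthetic 1 s, 10 cm minimum-jerk reach along x: a classic bell-shaped profile.
t = np.linspace(0.0, 1.0, 101)
x = 10.0 * (10 * t**3 - 15 * t**4 + 6 * t**5)
positions = np.column_stack([x, np.zeros_like(x), np.zeros_like(x)])
peak_v, duration, time_to_peak = movement_kinematics(positions, dt=0.01)
```

On real recordings the trajectory would first be low-pass filtered; the minimum-jerk profile here simply guarantees the symmetric bell shape (velocity peaks mid-movement) that the paper describes.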


Experimental Brain Research | 1999

Neuronal activity related to eye-hand coordination in the primate premotor cortex.

Christophe Jouffrais; Driss Boussaoud

To test the functional implications of gaze signals that we previously reported in the dorsal premotor cortex (PMd), we trained two rhesus monkeys to point to visual targets presented on a touch screen while controlling their gaze orientation. Each monkey had to perform four different tasks. To initiate a trial, the monkey had to put his hand on a starting position at the center of the touch screen and fixate a fixation point. In one task, the animal had to make a reaching movement to a peripheral target randomly presented at one of eight possible locations on a circle while maintaining fixation at the center of this virtual circle (central fixation + reaching). In the second task, the monkey maintained fixation at the location of the upcoming peripheral target and, later, reached to that location. After a delay, the target was turned on and the monkey made a reaching arm movement (target fixation + reaching). In the third task, the monkey made a saccade to the target without any arm movement (saccade). Finally, in the fourth task, the monkey first made a saccade to the target, then reached to it after a delay (saccade + reaching). This design allowed us to examine the contribution of the oculomotor context to arm-related neuronal activity in PMd. We analyzed the effects of the task type on neuronal activity and found that many cells showed a task effect during the signal (26/60; 43%), set (16/49; 33%) and/or movement (15/54; 28%) epochs, depending on the oculomotor history. These findings, together with previously published data, suggest that PMd codes limb-movement direction in a gaze-dependent manner and may, thus, play an important role in the brain mechanisms of eye-hand coordination during visually guided reaching.


Human-Computer Interaction | 2015

Interactivity Improves Usability of Geographic Maps for Visually Impaired People

Anke M. Brock; Philippe Truillet; Bernard Oriola; Delphine Picard; Christophe Jouffrais

Tactile relief maps are used by visually impaired people to acquire a mental representation of space, but they have important limitations (limited amount of information, braille text, etc.). Interactive maps may overcome these limitations. However, the usability of these two types of maps has never been compared, so it is unknown whether interactive maps are an equivalent or even better solution than traditional raised-line maps. This study presents a usability comparison of a classical raised-line map versus an interactive map composed of a multi-touch screen, a raised-line overlay, and audio output. Both maps were tested by 24 blind participants. We measured usability as efficiency, effectiveness, and satisfaction. Our results show that replacing braille with simple audio-tactile interaction significantly improved efficiency and user satisfaction. Effectiveness was not related to map type but depended on users' characteristics as well as the category of assessed spatial knowledge. Long-term evaluation of the acquired spatial information revealed that maps, whether interactive or not, help blind users build robust survey-type mental representations. Altogether, these results are encouraging, as they show that interactive maps are a good solution for improving map exploration and cognitive mapping in visually impaired people.


New Technologies, Mobility and Security | 2011

Fusion of Artificial Vision and GPS to Improve Blind Pedestrian Positioning

Adrien Brilhault; Slim Kammoun; Olivier Gutierrez; Philippe Truillet; Christophe Jouffrais

Orientation and mobility are tremendous problems for Blind people. Assistive technologies based on the Global Positioning System (GPS) could provide them with remarkable autonomy. Unfortunately, GPS accuracy, Geographical Information System (GIS) data and map-matching techniques are adapted to vehicle navigation only, and fail to assist pedestrian navigation, especially for the Blind. In this paper, we designed an assistive device for the Blind based on an adapted GIS and on the fusion of GPS- and vision-based positioning. The proposed assistive device may improve user positioning, even in urban environments where GPS signals are degraded. The estimated position would then be compatible with assisted navigation for the Blind. Interestingly, the vision module may also meet Blind users' needs by providing situational awareness (localizing objects of interest) along the path. Note that the proposed positioning solution could also enhance localization for autonomous robots or vehicles.
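The paper's fusion algorithm is not reproduced here; a common baseline for combining two noisy, independent position estimates is inverse-variance weighting, sketched below with hypothetical accuracy figures (a degraded urban GPS fix versus a sharper vision-based fix):

```python
import numpy as np

def fuse_positions(gps_pos, gps_var, vis_pos, vis_var):
    """Inverse-variance weighted fusion of two independent position estimates.

    Each estimate contributes in proportion to its confidence; the fused
    variance is always lower than either input variance.
    """
    w_gps, w_vis = 1.0 / gps_var, 1.0 / vis_var
    fused = (w_gps * np.asarray(gps_pos, float)
             + w_vis * np.asarray(vis_pos, float)) / (w_gps + w_vis)
    return fused, 1.0 / (w_gps + w_vis)

# Hypothetical fixes: degraded GPS (variance 25 m^2) vs. vision (variance 1 m^2).
fused, fused_var = fuse_positions((10.0, 0.0), 25.0, (0.0, 0.0), 1.0)
```

Because the vision fix is far more confident, the fused position lands close to it; a full system would run this inside a filter (e.g., a Kalman filter) that also propagates the pedestrian's motion between fixes.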


Experimental Brain Research | 2008

Natural textures classification in area V4 of the macaque monkey

Fabrice Arcizet; Christophe Jouffrais; Pascal Girard

The natural texture of an object is an important cue for recognition. In real conditions, the incidence angle of light on natural textures leads to a complex pattern of micro-shading that modifies the 3D rendering of surfaces. Little is known about the visual processing of material properties. The present work aims to study the coding of natural textures by neurons of area V4 of the awake macaque monkey. We used patches of natural textures drawn from the CUReT database, illuminated from two or three different angles, with their corresponding controls (scrambled Fourier phase). We recorded the responses of V4 neurons to stimuli flashed in their receptive fields (RFs) while the macaques performed a simple fixation task. We show that a large majority of V4 neurons responded to texture patches with strong modulation across stimuli. The analysis of those responses indicates that V4 neurons integrate first- and second-order parameters of the image (mean luminance, SNR, and energy), which may be used to achieve texture clustering in a multidimensional space. This clustering was comparable to that of a pyramid of Gabor filters and was not affected by illumination angle. Altogether, these results suggest that the V4 neuronal population acts as a set of filters able to classify textures independently of illumination angle. We conclude that area V4 contains mechanisms that are sensitive to the appearance of textured surfaces, even in an environment where illumination changes continuously.
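The Gabor-filter reference against which the neuronal clustering was compared can be approximated in a few lines: oriented cosine gratings under a Gaussian envelope, with a descriptor built from mean filter energy. The sketch below (parameters are illustrative, not those of the study) shows the orientation selectivity that lets such a bank separate textures:

```python
import numpy as np

def gabor_kernel(size, wavelength, theta):
    """Real Gabor filter: an oriented cosine grating under a Gaussian envelope."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    along = x * np.cos(theta) + y * np.sin(theta)   # axis of luminance variation
    envelope = np.exp(-(x**2 + y**2) / (2.0 * (size / 3.0)**2))
    return envelope * np.cos(2.0 * np.pi * along / wavelength)

def filter_energy(patch, kernel, step=3):
    """Mean absolute filter response over windows tiled across the patch."""
    k = kernel.shape[0]
    responses = [np.sum(patch[i:i + k, j:j + k] * kernel)
                 for i in range(0, patch.shape[0] - k + 1, step)
                 for j in range(0, patch.shape[1] - k + 1, step)]
    return float(np.mean(np.abs(responses)))

# Two synthetic "textures": vertical vs. horizontal gratings, wavelength 8 px.
jj, ii = np.meshgrid(np.arange(32), np.arange(32))
vertical = np.sin(2 * np.pi * jj / 8.0)
horizontal = np.sin(2 * np.pi * ii / 8.0)
k0 = gabor_kernel(9, 8.0, theta=0.0)   # tuned to variation along x
e_vert = filter_energy(vertical, k0)
e_horiz = filter_energy(horizontal, k0)
```

A full pyramid would stack several wavelengths and orientations into one descriptor vector per patch and cluster those vectors; here a single filter already responds far more strongly to the grating matching its orientation.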


Interactive Tabletops and Surfaces | 2010

Usage of multimodal maps for blind people: why and how

Anke M. Brock; Philippe Truillet; Bernard Oriola; Christophe Jouffrais

Multimodal interactive maps are one solution for providing blind people with access to geographic information. Current projects use a tactile map placed on a single-touch display with additional sound output. In our current project we investigated the use of multi-touch displays for this purpose. In this paper, we outline our requirements for an appropriate multi-touch tactile device and present a first prototype. We conclude with proposals for future work.


Neuroreport | 2002

Neuronal activity in primate striatum and pallidum related to bimanual motor actions.

Thierry Wannier; Jian Liu; Anne Morel; Christophe Jouffrais; Em Rouiller

To assess whether striatal and pallidal neurones may contribute to bimanual co-ordination, two macaque monkeys were trained to perform a delayed conditional sequence of co-ordinated pull and grasp movements, executed either bimanually or unimanually. Most of the 58 task-related neurones, recorded from the caudate nucleus, putamen, external and internal divisions of the globus pallidus, exhibited an activity related to the execution of the movements. Only a quarter of neurones displayed preparatory activity. The majority of units exhibited a significant modulation of activity in unimanual trials irrespective of the hand used to perform the task. In bimanual trials, one-third of units exhibited discharge patterns reflecting a bimanual synergy, suggesting a possible role for basal ganglia in inter-limb co-operation.


Human Factors in Computing Systems | 2016

MapSense: Multi-Sensory Interactive Maps for Children Living with Visual Impairments

Emeline Brulé; Gilles Bailly; Anke M. Brock; Frédéric Valentin; Grégoire Denis; Christophe Jouffrais

We report on the design process leading to the creation of MapSense, a multi-sensory interactive map for visually impaired children. We conducted a formative study in a specialized institute to understand children's educational needs, their context of care and their preferences regarding interactive technologies. The findings (1) outline the need for tools and methods that help children acquire spatial skills and (2) provide four design guidelines for educational assistive technologies. Based on these findings and an iterative process, we designed MapSense and deployed it in the institute for two days. It enables collaboration between children with a broad range of impairments, proposes reflective and playful scenarios, and allows caretakers to customize it as they wish. A field experiment revealed that both children and caretakers considered the system successful and empowering.


Technology and Disability | 2012

NAVIG: Guidance system for the visually impaired using virtual augmented reality

Brian F. G. Katz; Florian Dramas; Gaëtan Parseihian; Olivier Gutierrez; Slim Kammoun; Adrien Brilhault; Lucie Brunet; Mathieu Gallay; Bernard Oriola; Malika Auvray; Philippe Truillet; Michel Denis; Simon J. Thorpe; Christophe Jouffrais

Finding one's way to an unknown destination, navigating complex routes, finding inanimate objects; these are all tasks that can be challenging for the visually impaired. The project NAVIG (Navigation Assisted by artificial VIsion and GNSS) is directed towards increasing the autonomy of visually impaired users in known and unknown environments, exterior and interior, large scale and small scale, through a combination of a Global Navigation Satellite System (GNSS) and rapid visual recognition with which the precise position of the user can be determined. Relying on geographical databases and visually identified objects, the user is guided to his or her desired destination through spatialized semantic audio rendering, always maintained in the head-centered reference frame. This paper presents the overall project design and architecture of the NAVIG system. In addition, details of the new type of detection and localization device are presented in relation to guidance directives developed through participatory design with potential users and educators for the visually impaired. A fundamental concept in this project is the belief that this type of assistive device is able to solve one of the major problems faced by the visually impaired: their difficulty in localizing specific objects.
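Head-centered audio rendering of the kind described reduces, at its core, to recomputing each target's bearing relative to the current head orientation before spatializing the sound. A minimal 2-D sketch, under a hypothetical convention (heading measured clockwise from +y, positive azimuth to the listener's right), not the NAVIG implementation:

```python
import math

def relative_azimuth(listener_xy, heading_deg, target_xy):
    """Bearing of a target relative to the listener's heading, in [-180, 180).

    0 = straight ahead, positive = to the listener's right.
    """
    dx = target_xy[0] - listener_xy[0]
    dy = target_xy[1] - listener_xy[1]
    bearing = math.degrees(math.atan2(dx, dy))  # 0 deg = +y axis ("north")
    # Wrap the difference into [-180, 180) so "slightly left" stays small.
    return (bearing - heading_deg + 180.0) % 360.0 - 180.0
```

A spatializer would feed this azimuth (and, in 3-D, an elevation) to an HRTF-based renderer each time the head tracker updates, so the target appears to stay fixed in the world as the head turns.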


International Conference on Computers Helping People with Special Needs | 2012

Design and user satisfaction of interactive maps for visually impaired people

Anke M. Brock; Philippe Truillet; Bernard Oriola; Delphine Picard; Christophe Jouffrais

Multimodal interactive maps are a solution for presenting spatial information to visually impaired people. In this paper, we present an interactive multimodal map prototype based on a tactile paper map, a multi-touch screen and audio output. We first describe the steps in designing an interactive map: drawing and printing the tactile paper map, choosing the multi-touch technology, and designing the interaction techniques and software architecture. Then we describe the method used to assess user satisfaction. We provide data showing that an interactive map, although based on a single, elementary double-tap interaction, has been met with a high level of user satisfaction. Interestingly, satisfaction is independent of a user's age, previous visual experience or braille experience. This prototype will be used as a platform to design advanced interactions for spatial learning.
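The prototype's double-tap interaction is not published as code; its underlying lookup can be sketched as a hit test of the tap coordinates against named map regions, with the matched label handed to a speech synthesizer (here simply returned). Region names and coordinates below are hypothetical:

```python
# Hypothetical map elements as named axis-aligned rectangles,
# (x_min, y_min, x_max, y_max) in screen pixels.
REGIONS = {
    "town hall": (100, 100, 180, 160),
    "railway station": (300, 220, 420, 300),
}

def label_at(x, y, regions=REGIONS):
    """Return the label under a double tap, or None if no element is hit."""
    for label, (x0, y0, x1, y1) in regions.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return label
    return None
```

In a real system the returned label would be sent to text-to-speech, and the regions would be generated from the same geographic data used to print the raised-line overlay so that tactile and interactive layers stay aligned.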

Collaboration


Dive into Christophe Jouffrais's collaborations.

Top Co-Authors

Philippe Truillet

Centre national de la recherche scientifique
