Publication


Featured research published by Gabriel M. Gauthier.


Journal of Experimental Psychology: Human Perception and Performance | 1989

Mechanisms of short-term saccadic adaptation.

John L. Semmlow; Gabriel M. Gauthier; Jean-Louis Vercher

A number of processes have been identified that adaptively modify oculomotor control components. The adaptive process studied here can be reliably produced over a short period of time by a visual stimulus that forces postsaccadic error. This short-term adaptive process, usually termed parametric adaptation, consists of a change in response amplitude that develops progressively over 50 to 100 training stimuli. The resulting compensation is proportional to, but substantially less than, the error induced by the training stimuli. Both increases and decreases in response amplitude can be evoked by an appropriately timed and directed movement of the stimulus target, which forces postsaccadic error. Results show that a single type of training stimulus can influence movements over a broad spatial region, provided these movements are in the same direction as the training stimulus. Experiments that map the range of modification suggest that the increasing adaptive modification operates by remapping final position, whereas the decreasing adaptive modification is achieved through an overall reduction of gain. Training stimuli that attempt to evoke both increases and decreases in the same region show a net modification equivalent to the algebraic addition of individual adaptive processes.
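
To make the error-driven gain change described above concrete, here is a minimal Python sketch of trial-by-trial adaptation in which the saccade gain is nudged by a fraction of the post-saccadic error. The target eccentricity, intra-saccadic backstep, learning rate, and trial count are illustrative assumptions rather than values from the paper, and the sketch covers only the gain-reduction side of the adaptation, not the position-remapping component.

# Illustrative sketch only: error-driven saccadic gain adaptation.
# Parameter values are assumptions, not figures reported by
# Semmlow, Gauthier & Vercher (1989).

def simulate_gain_adaptation(target=10.0, backstep=2.0, n_trials=100,
                             learning_rate=0.02, gain=1.0):
    """On each trial the saccade lands at gain*target while the target has
    been stepped back by `backstep` deg; the post-saccadic error then nudges
    the gain. Returns the gain after each trial."""
    gains = []
    for _ in range(n_trials):
        landing = gain * target                 # saccade amplitude (deg)
        error = (target - backstep) - landing   # post-saccadic error (deg)
        gain += learning_rate * error / target  # proportional gain update
        gains.append(gain)
    return gains

gains = simulate_gain_adaptation()
print(f"gain after 100 trials: {gains[-1]:.2f}")  # drifts toward 0.8 but stays above it

As in the abstract, the simulated compensation grows progressively over tens of trials and remains smaller than the error imposed by the backstep.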


Neuroscience Letters | 2002

Galvanic vestibular stimulation in humans produces online arm movement deviations when reaching towards memorized visual targets.

Jean-Pierre Bresciani; Jean Blouin; K. E. Popov; Christophe Bourdin; Fabrice R. Sarlegna; Jean-Louis Vercher; Gabriel M. Gauthier

Using galvanic vestibular stimulation (GVS), we tested whether a change in vestibular input at the onset of goal-directed arm movements induces deviations in arm trajectory. Eight head-fixed standing subjects were instructed to reach for memorized visual targets in complete darkness. In a randomly selected half of the trials, a 3 mA bipolar binaural galvanic stimulation of randomly alternating polarity was triggered by movement onset. Results revealed significant GVS-induced directional shifts of reaching movements towards the anode side. The earliest significant deviations of hand path occurred 240 ms after stimulation onset. The likely goal of these online deviations of arm trajectory was to compensate for a vestibular-evoked apparent change in the spatial relationship between the target and the hand.


Progress in Brain Research | 2003

Role of sensory information in updating internal models of the effector during arm tracking

Jean-Louis Vercher; Frédéric Sarès; Jean Blouin; Christophe Bourdin; Gabriel M. Gauthier

This chapter is divided into three main parts. Firstly, on the basis of the literature, we will briefly discuss how the recent introduction of the concept of internal models by Daniel Wolpert and Mitsuo Kawato contributes to a better understanding of what motor learning and motor adaptation are. Then, we will present a model of eye-hand co-ordination during self-moved target tracking, which we used as a way to specifically address these topics. Finally, we will show some evidence about the use of proprioceptive information for updating the internal models, in the context of eye-hand co-ordination. Motor and afferent information appears to contribute to the parametric adjustment (adaptation) between the arm motor command and visual information about arm motion. The study reported here was aimed at assessing the contribution of arm proprioception in building (learning) and updating (adaptation) these representations. The subjects (including a deafferented subject) had to make back and forth movements with their forearm in the horizontal plane, over a learned amplitude and at a constant frequency, and to track an arm-driven target with their eyes. The dynamical conditions of arm movement were altered (unexpectedly or systematically) during the movement by changing the mechanical properties of the manipulandum. The results showed a significant change in the latency and gain of the smooth pursuit system from before to after the perturbation in the control subjects, but not in the deafferented subject. Moreover, in control subjects, vibration of the arm muscles prevented adaptation to the mechanical perturbation. These results suggest that, in a self-moved target tracking task, the arm motor system shares with the smooth pursuit system an internal representation of the arm's dynamical properties, and that arm proprioception is necessary to build this internal model. As suggested by Ghez et al. (1990) (Cold Spring Harbor Symp. Quant. Biol., 55: 837-847), proprioception would allow control subjects to learn the inertial properties of the limb.
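
The latency and gain measures mentioned above can be estimated from recorded target and eye velocity traces; the short Python sketch below shows one conventional way of doing so (cross-correlation for latency, amplitude ratio for gain). It is an assumed analysis outline, not the authors' actual pipeline.

import numpy as np

# Assumed analysis sketch, not the authors' pipeline: estimate smooth-pursuit
# latency and gain from equally long target and eye velocity traces.

def pursuit_latency_and_gain(target_vel, eye_vel, dt):
    """target_vel, eye_vel: 1-D velocity traces sampled every dt seconds.
    Latency: lag (s) maximising the cross-correlation (positive = eye lags).
    Gain: eye/target velocity amplitude ratio after realignment."""
    t = np.array(target_vel, dtype=float)
    e = np.array(eye_vel, dtype=float)
    t -= t.mean()
    e -= e.mean()
    xcorr = np.correlate(e, t, mode="full")
    lag = int(np.argmax(xcorr)) - (len(t) - 1)  # samples by which the eye lags
    if lag > 0:                                 # realign the two traces
        t, e = t[:-lag], e[lag:]
    elif lag < 0:
        t, e = t[-lag:], e[:lag]
    gain = e.std() / t.std()
    return lag * dt, gain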


Experimental Brain Research | 1998

Updating visual space during passive and voluntary head-in-space movements

Jean Blouin; Loris Labrousse; Martin Simoneau; Jean-Louis Vercher; Gabriel M. Gauthier

The accuracy of our spatially oriented behaviors largely depends on the precision with which changes in body position with respect to space are monitored during self-motion. We investigated observers’ capacity to determine, before and after head rotations about the yaw axis, the position of a memorized earth-fixed visual target positioned 21° laterally. The subjects (n=6) showed small errors (mean=–0.6°) and little variability (mean=0.9°) in determining the position of an extinguished visual target when the head (and gaze) remained in a straight-ahead position. This accuracy was preserved when subjects voluntarily rotated the head by various magnitudes in the direction of the memorized visual target (head rotations ranged between 5° and 60°). However, when the chair on which the subjects were seated was unexpectedly rotated about the yaw axis in the direction of the target (chair rotations ranged between 6° and 36°) during the head-on-trunk rotations, performance was markedly degraded, both in terms of spatial precision (mean error=5.6°) and variability (mean=5.7°). A control experiment showed that prior knowledge of chair rotation occurrence had no effect on the perceived target position after head-trunk movements. Updating an earth-fixed target position during head-on-trunk rotations could be achieved through the processing of both cervical and vestibular signals, but, in the present experiment, the vestibular output was the only signal that had the potential to contribute to accurate coding of the target position after simultaneous head and trunk movements. Our results therefore suggest that the vestibular output is a noisy signal for the central nervous system to use when updating visual space during head-in-space motion.


Experimental Brain Research | 2005

On the nature of the vestibular control of arm-reaching movements during whole-body rotations

Jean-Pierre Bresciani; Gabriel M. Gauthier; Jean-Louis Vercher; Jean Blouin

Recent studies report efficient vestibular control of goal-directed arm movements during body motion. This contribution tested whether this control relies (a) on an updating process in which vestibular signals are used to update the perceived egocentric position of surrounding objects when body orientation changes, or (b) on a sensorimotor process, i.e. a transfer function between vestibular input and arm motor output that preserves hand trajectory in space despite body rotation. Both processes were separately and specifically adapted. We then compared the respective influences of the adapted processes on the vestibular control of arm-reaching movements. The rationale was that if a given process underlies a given behavior, any adaptive modification of this process should give rise to an observable modification of the behavior. The updating adaptation altered the matching between vestibular input and perceived body displacement in the surrounding world. The sensorimotor adaptation altered the matching between vestibular input and the arm motor output necessary to keep the hand fixed in space during body rotation. Only the sensorimotor adaptation significantly altered the vestibular control of arm-reaching movements. Our results therefore suggest that during passive self-motion, the vestibular control of arm-reaching movements essentially derives from a sensorimotor process by which arm motor output is modified on-line to preserve hand trajectory in space despite body displacement. In contrast, the updating process that keeps the egocentric representation of visual space up to date seems to contribute little to generating the required arm compensation during body rotations.
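
One minimal way to picture the sensorimotor account favoured here is as a counter-rotation of the planned hand displacement: if vestibular signals indicate a body rotation of θ, the displacement expressed in body coordinates must be rotated by -θ for the hand path to stay fixed in space. The Python sketch below illustrates only this geometric idea, under assumed conventions (2-D, counter-clockwise-positive angles); it is not the authors' model.

import math

# Geometric illustration only (assumed conventions, not the authors' model):
# counter-rotate a planned hand displacement so that the hand path stays
# fixed in external space despite a whole-body rotation.

def compensate_for_body_rotation(dx, dy, body_rotation_deg):
    """dx, dy: planned hand displacement in body-centred coordinates (m).
    body_rotation_deg: body rotation in space, counter-clockwise positive.
    Returns the displacement to command so that the hand path is space-fixed."""
    a = math.radians(-body_rotation_deg)
    return (dx * math.cos(a) - dy * math.sin(a),
            dx * math.sin(a) + dy * math.cos(a))

# A 20 cm forward reach planned before a 15° body rotation:
print(compensate_for_body_rotation(0.0, 0.20, 15.0))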


Neuroreport | 1995

Encoding the position of a flashed visual target after passive body rotations.

Jean Blouin; Gabriel M. Gauthier; Paul van Donkelaar; Jean-Louis Vercher

The capacity of the central nervous system (CNS) for processing vestibular signals during passive whole-body rotations to update the internal representation of a visual target position in relation to the body was assessed. Results showed that subjects mislocalized previously presented visual targets after body rotations in complete darkness. Detailed analysis of the results suggested that the large target mislocalization stemmed not only from a systematic underestimation of rotation magnitude but also from the incapacity of the CNS to use the vestibular signals to accurately update the internal representation of the target position in relation to the body after passive rotations.
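
The underestimation account in this abstract reduces to simple arithmetic: if the internally sensed rotation is only a fraction of the actual rotation, the remembered body-centred target direction is updated by too little, leaving a systematic error proportional to the rotation. The Python sketch below assumes an arbitrary vestibular gain of 0.8 purely for illustration; no such value is reported in the paper.

# Illustrative arithmetic only; the 0.8 "vestibular gain" is an assumption,
# not a value from Blouin et al. (1995).

def mislocalization_error(target_dir_deg, rotation_deg, vestibular_gain=0.8):
    """Systematic error (deg) in the remembered body-centred target direction
    after a whole-body rotation whose magnitude is underestimated."""
    true_dir = target_dir_deg - rotation_deg                      # correct update
    updated_dir = target_dir_deg - vestibular_gain * rotation_deg
    return updated_dir - true_dir                                 # = (1 - gain) * rotation

print(mislocalization_error(target_dir_deg=20.0, rotation_deg=30.0))  # 6.0°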


Experimental Brain Research | 1996

The relative contribution of retinal and extraretinal signals in determining the accuracy of reaching movements in normal subjects and a deafferented patient

Jean Blouin; Gabriel M. Gauthier; Jean Louis Vercher; Jonathan Cole

This experiment investigated the relative extent to which different signals from the visuo-oculomotor system are used to improve the accuracy of arm movements. Different visuo-oculomotor conditions were used to produce various retinal and extraretinal signals leading to a similar target amplitude: (a) fixating a central target while pointing to a peripheral visual target, (b) tracking a target through a smooth pursuit movement and then pointing to the target when its excursion ceased, and (c) pointing to a target reached previously by a saccadic eye movement. The experiment was performed with a deafferented subject and control subjects. For the deafferented patient, the absence of proprioception prevented any comparison between the internal representations of the target position and of the limb position (sensed through proprioception) during the arm movement. The deafferented patient's endpoint therefore provided a good estimate of the accuracy of the target coordinates used by the arm motor system. The deafferented subject showed relatively good accuracy when producing a saccade prior to pointing, but large overshooting in the fixation condition and undershooting in the pursuit condition. The results suggest that the deafferented subject does use oculomotor signals to program arm movement and that signals associated with fast movements of the eyes are better for pointing accuracy than slow ramp movements. The inaccuracy of the deafferented subject when no eye movement is allowed (the condition in which the controls were the most accurate) suggests that, in this condition, a proprioceptive map is involved in which both the target and the arm are represented.


Electroencephalography and Clinical Neurophysiology | 1975

Two-dimensional eye movement monitor for clinical and laboratory recordings

Gabriel M. Gauthier; Michel Volle

A photo-electric device designed to monitor vertical and horizontal eye movements simultaneously within a 20° range is presented, with illustrative experimental data. Four small infrared-detecting cells are mounted on a light spectacle-like frame together with a miniature true infrared (9000 Å) emitting diode. This original design, which eliminates separate-source illumination artifacts, is extremely light, preserves maximum visual field size, and is particularly straightforward to operate. The instrument resolution is better than 1 minute of arc, with a 1000 c/sec bandwidth and 5% linearity over the maximum operating range.


Neuroreport | 2002

On-line versus off-line vestibular-evoked control of goal-directed arm movements.

Jean-Pierre Bresciani; Jean Blouin; Fabrice R. Sarlegna; Christophe Bourdin; Jean-Louis Vercher; Gabriel M. Gauthier

The present study tested whether vestibular input can be processed on-line to control goal-directed arm movements towards memorized visual targets when the whole body is passively rotated during movement execution. Subjects succeeded in compensating for current body rotation by regulating ongoing arm movements. This performance was compared to the accuracy with which subjects reached for the target when the rotation occurred before the movement. Subjects were less accurate in updating the internal representation of visual space through vestibular signals than in monitoring on-line body orientation to control arm movement. These results demonstrate that vestibular signals contribute to motor control of voluntary arm movements and suggest that the processes underlying on-line regulation of goal-directed movements are different from those underlying navigation-like behaviors.


Experimental Brain Research | 2002

Visual signals contribute to the coding of gaze direction

Jean Blouin; Nicolas Amade; Jean-Louis Vercher; Normand Teasdale; Gabriel M. Gauthier

Accurate information about gaze direction is required to direct the hand towards visual objects in the environment. In the present experiments, we tested whether retinal inputs affect the accuracy with which healthy subjects indicate their gaze direction with the unseen index finger after voluntary saccadic eye movements. In experiment 1, subjects produced a series of back and forth saccades (about eight) of self-selected magnitudes before positioning the eyes in a self-chosen direction to the right. The saccades were produced while facing one of four possible visual scenes: (1) complete darkness, (2) a scene composed of a single light-emitting diode (LED) located at 18° to the right, (3) a visually enriched scene made up of three LEDs located at 0°, 18° and 36° to the right, or (4) a normally illuminated scene where the lights in the experimental room were turned on. Subjects were then asked to indicate their gaze direction with their unseen index finger. In the conditions where the visual scenes were composed of LEDs, subjects were instructed to foveate or not foveate one of the LEDs with their last saccade. It was therefore possible to compare subjects’ accuracy when pointing in the direction of their gaze in conditions with and without foveal stimulation. The results showed that the accuracy of the pointing movements decreased when subjects produced their saccades in a dark environment or in the presence of a single LED compared to when the saccades were generated in richer visual environments. Visual stimulation of the fovea did not increase subjects’ accuracy when pointing in the direction of their gaze compared to conditions where there was only stimulation of the peripheral retina. Experiment 2 tested how the retinal signals could contribute to the coding of eye position after saccadic eye movements. More specifically, we tested whether the shift in the retinal image of the environment during the saccades provided information about the reached position of the eyes. Subjects produced their series of saccades while facing a visual environment made up of three LEDs. In some trials, the whole visual scene was displaced either 4.5° to the left or 3° to the right during the primary saccade. These displacements created mismatches between the shift of the retinal image of the environment and the extent of gaze deviation. The displacements of the visual scene were not perceived by the subjects because they occurred near the peak velocity of the saccade (saccadic suppression phenomenon). Pointing accuracy was not affected by the unperceived shifts of the visual scene. The results of these experiments suggest that the arm motor system receives more precise information about gaze direction when there is retinal stimulation than when there is none. They also suggest that the most relevant factor in defining gaze direction is not the retinal locus of the visual stimulation (that is peripheral or foveal) but rather the amount of visual information. Finally, the results suggest an enhanced egocentric encoding of gaze direction by the retinal inputs and do not support a retinotopic model for encoding gaze direction.

Collaboration


Dive into Gabriel M. Gauthier's collaboration.

Top Co-Authors

Jean Blouin

Aix-Marseille University


Frédéric Sarès

Centre national de la recherche scientifique
