
Publication


Featured research published by Jean-Pierre Bresciani.


Experimental Brain Research | 2005

Feeling what you hear: auditory signals can modulate tactile tap perception

Jean-Pierre Bresciani; Marc O. Ernst; Knut Drewing; Guillaume Bouyer; Vincent Maury; Abderrahmane Kheddar

We tested whether auditory sequences of beeps can modulate the tactile perception of sequences of taps (two to four taps per sequence) delivered to the index fingertip. In the first experiment, the auditory and tactile sequences were presented simultaneously. The number of beeps delivered in the auditory sequence was either the same as, less than, or more than the number of taps of the simultaneously presented tactile sequence. Though task-irrelevant (subjects were instructed to focus on the tactile stimuli), the auditory stimuli systematically modulated subjects’ tactile perception; in other words, subjects’ responses depended significantly on the number of delivered beeps. Such modulation only occurred when the auditory and tactile stimuli were similar enough. In the second experiment, we tested whether the automatic auditory-tactile integration depends on simultaneity or whether a bias can be evoked when the auditory and tactile sequences are presented in temporal asynchrony. Audition significantly modulated tactile perception when the stimuli were presented simultaneously, but this effect gradually disappeared when a temporal asynchrony was introduced between the auditory and tactile stimuli. These results show that when provided with auditory and tactile sensory signals that are likely to be generated by the same stimulus, the central nervous system (CNS) tends to automatically integrate these signals.


Journal of Vision | 2006

Vision and touch are automatically integrated for the perception of sequences of events

Jean-Pierre Bresciani; Franziska Dammeier; Marc O. Ernst

The purpose of the present experiment was to investigate the integration of sequences of visual and tactile events. Subjects were presented with sequences of visual flashes and tactile taps simultaneously and instructed to count either the flashes (Session 1) or the taps (Session 2). The number of flashes could differ from the number of taps by +/-1. For both sessions, the perceived number of events was significantly influenced by the number of events presented in the task-irrelevant modality. Touch had a stronger influence on vision than vision on touch. Interestingly, touch was the more reliable of the two modalities, yielding less variable estimates when presented alone. For both sessions, the perceptual estimates were less variable when stimuli were presented in both modalities than when the task-relevant modality was presented alone. These results indicate that even when one signal is explicitly task irrelevant, sensory information tends to be automatically integrated across modalities. They also suggest that the relative weight of each sensory channel in the integration process depends on its relative reliability. The results are described using a Bayesian probabilistic model for multimodal integration that accounts for the coupling between the sensory estimates.
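The reliability-weighted integration invoked here is commonly modeled as inverse-variance (maximum-likelihood) cue fusion. A minimal numerical sketch follows; the means and variances are illustrative values, not data from the study.

```python
def fuse(mu_v, var_v, mu_t, var_t):
    """Inverse-variance (maximum-likelihood) fusion of two sensory cues.

    Each cue is weighted by its reliability (1/variance), so the more
    reliable cue dominates the combined estimate, and the fused variance
    is always smaller than either single-cue variance.
    """
    w_v = (1 / var_v) / (1 / var_v + 1 / var_t)  # weight of the visual cue
    w_t = 1 - w_v                                # weight of the tactile cue
    mu = w_v * mu_v + w_t * mu_t
    var = 1 / (1 / var_v + 1 / var_t)
    return mu, var

# Touch more reliable (smaller variance) than vision, as reported above:
mu, var = fuse(mu_v=3.0, var_v=1.0, mu_t=2.0, var_t=0.25)
# The combined estimate lies closer to the tactile value, and the fused
# variance falls below both single-cue variances.
```

Under this model, the bias exerted by a task-irrelevant modality follows directly from its weight: the more reliable the irrelevant cue, the more it pulls the combined estimate.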


Neuroscience Letters | 2002

Galvanic vestibular stimulation in humans produces online arm movement deviations when reaching towards memorized visual targets.

Jean-Pierre Bresciani; Jean Blouin; K. E. Popov; Christophe Bourdin; Fabrice R. Sarlegna; Jean-Louis Vercher; Gabriel M. Gauthier

Using galvanic vestibular stimulation (GVS), we tested whether a change in vestibular input at the onset of goal-directed arm movements induces deviations in arm trajectory. Eight head-fixed standing subjects were instructed to reach for memorized visual targets in complete darkness. In a randomly selected half of the trials, a 3 mA bipolar binaural galvanic stimulation of randomly alternating polarity was triggered by movement onset. Results revealed significant GVS-induced directional shifts of reaching movements towards the anode side. The earliest significant deviations of hand path occurred 240 ms after stimulation onset. The likely goal of these online deviations of arm trajectory was to compensate for a vestibular-evoked apparent change in the spatial relationship between the target and the hand.


Cerebral Cortex | 2011

Contributions of the PPC to online control of visually guided reaching movements assessed with fMRI-guided TMS

A Reichenbach; Jean-Pierre Bresciani; Angelika Peer; Hh Bülthoff; Axel Thielscher

The posterior parietal cortex (PPC) plays an important role in controlling voluntary movements by continuously integrating sensory information about body state and the environment. We tested which subregions of the PPC contribute to the processing of target- and body-related visual information while reaching for an object, using a reaching paradigm with 2 types of visual perturbation: displacement of the visual target and displacement of the visual feedback about the hand position. Initially, functional magnetic resonance imaging (fMRI) was used to localize putative target areas involved in online corrections of movements in response to perturbations. The causal contribution of these areas to online correction was tested in subsequent neuronavigated transcranial magnetic stimulation (TMS) experiments. Robust TMS effects occurred at distinct anatomical sites along the anterior intraparietal sulcus (aIPS) and the anterior part of the supramarginal gyrus for both perturbations. TMS over neighboring sites did not affect online control. Our results support the hypothesis that the aIPS is more generally involved in visually guided control of movements, independent of body effectors and nature of the visual information. Furthermore, they suggest that the human network of PPC subregions controlling goal-directed visuomotor processes extends more inferiorly than previously thought. Our results also point toward a good spatial specificity of the TMS effects.


eLife | 2012

Foggy perception slows us down

P Pretto; Jean-Pierre Bresciani; Gregor Rainer; Hh Bülthoff

Visual speed is believed to be underestimated at low contrast, which has been proposed as an explanation of excessive driving speed in fog. Combining psychophysical measurements and driving simulation, we confirm that speed is underestimated when contrast is reduced uniformly for all objects of the visual scene independently of their distance from the viewer. However, we show that when contrast is reduced more for distant objects, as is the case in real fog, visual speed is actually overestimated, prompting drivers to decelerate. Using an artificial anti-fog (that is, fog characterized by better visibility for distant than for close objects), we demonstrate for the first time that perceived speed depends on the spatial distribution of contrast over the visual scene rather than the global level of contrast per se. Our results cast new light on how reduced visibility conditions affect perceived speed, providing important insight into the human visual system. DOI: http://dx.doi.org/10.7554/eLife.00031.001


Experimental Brain Research | 2005

On the nature of the vestibular control of arm-reaching movements during whole-body rotations

Jean-Pierre Bresciani; Gabriel M. Gauthier; Jean-Louis Vercher; Jean Blouin

Recent studies report efficient vestibular control of goal-directed arm movements during body motion. This contribution tested whether this control relies (a) on an updating process in which vestibular signals are used to update the perceived egocentric position of surrounding objects when body orientation changes, or (b) on a sensorimotor process, i.e. a transfer function between vestibular input and the arm motor output that preserves hand trajectory in space despite body rotation. Both processes were separately and specifically adapted. We then compared the respective influences of the adapted processes on the vestibular control of arm-reaching movements. The rationale was that if a given process underlies a given behavior, any adaptive modification of this process should give rise to observable modification of the behavior. The updating adaptation adapted the matching between vestibular input and perceived body displacement in the surrounding world. The sensorimotor adaptation adapted the matching between vestibular input and the arm motor output necessary to keep the hand fixed in space during body rotation. Only the sensorimotor adaptation significantly altered the vestibular control of arm-reaching movements. Our results therefore suggest that during passive self-motion, the vestibular control of arm-reaching movements essentially derives from a sensorimotor process by which arm motor output is modified on-line to preserve hand trajectory in space despite body displacement. In contrast, the updating process that keeps the egocentric representation of visual space up to date seems to contribute little to generating the required arm compensation during body rotations.


Neuroreport | 2007

Signal reliability modulates auditory-tactile integration for event counting

Jean-Pierre Bresciani; Marc O. Ernst

Sequences of auditory beeps and tactile taps were simultaneously presented and participants were instructed to focus on one of these modalities and to ignore the other. We tested whether (i) the two sensory channels bias one another and (ii) the interaction depends on the relative reliability of the channels. Audition biased tactile perception and touch biased auditory perception. Lowering the reliability of the auditory channel (i.e. the intensity of the beeps) decreased the effect of audition on touch and increased the effect of touch on audition. These results show that simultaneous auditory and tactile stimuli tend to be automatically integrated in a reliability-dependent manner.
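The reliability dependence reported here can be illustrated with inverse-variance weights (the numbers below are illustrative, not the study's data): lowering the reliability of the auditory cue shifts weight away from audition and toward touch, mirroring the observed decrease of audition's bias on touch and increase of touch's bias on audition.

```python
def auditory_weight(rel_a, rel_t):
    """Weight of the auditory cue under inverse-variance weighting,
    with reliability defined as 1/variance of each cue."""
    return rel_a / (rel_a + rel_t)

# Equal reliabilities: each modality contributes half the weight.
w_equal = auditory_weight(4.0, 4.0)
# Lowering auditory reliability (e.g. quieter beeps) reduces audition's
# influence on the combined percept and raises touch's influence.
w_quiet = auditory_weight(1.0, 4.0)
```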


The Journal of Physiology | 2009

Seeing the hand while reaching speeds up on‐line responses to a sudden change in target position

A Reichenbach; Axel Thielscher; Angelika Peer; Hh Bülthoff; Jean-Pierre Bresciani

Goal-directed movements are executed under the permanent supervision of the central nervous system, which continuously processes sensory afferents and triggers on-line corrections if movement accuracy seems to be compromised. For arm reaching movements, visual information about the hand plays an important role in this supervision, notably improving reaching accuracy. Here, we tested whether visual feedback of the hand affects the latency of on-line responses to an external perturbation when reaching for a visual target. Two types of perturbation were used: the visual perturbation consisted of changing the spatial location of the target, and the kinesthetic perturbation of applying a force step to the reaching arm. For both types of perturbation, the hand trajectory and the electromyographic (EMG) activity of shoulder muscles were analysed to assess whether visual feedback of the hand speeds up on-line corrections. Without visual feedback of the hand, on-line responses to the visual perturbation exhibited the longest latency. This latency was reduced by about 10% when visual feedback of the hand was provided. On the other hand, the latency of on-line responses to the kinesthetic perturbation was independent of the availability of visual feedback of the hand. In a control experiment, we tested the effect of visual feedback of the hand on visual and kinesthetic two-choice reaction times, for which coordinate transformation is not critical. Two-choice reaction times were never facilitated by visual feedback of the hand. Taken together, our results suggest that visual feedback of the hand speeds up on-line corrections when the position of the visual target with respect to the body must be re-computed during movement execution. This facilitation probably results from the possibility of mapping hand- and target-related information in a common visual reference frame.


Neuroreport | 2002

On-line versus off-line vestibular-evoked control of goal-directed arm movements.

Jean-Pierre Bresciani; Jean Blouin; Fabrice R. Sarlegna; Christophe Bourdin; Jean-Louis Vercher; Gabriel M. Gauthier

The present study tested whether vestibular input can be processed on-line to control goal-directed arm movements towards memorized visual targets when the whole body is passively rotated during movement execution. Subjects succeeded in compensating for current body rotation by regulating ongoing arm movements. This performance was compared to the accuracy with which subjects reached for the target when the rotation occurred before the movement. Subjects were less accurate in updating the internal representation of visual space through vestibular signals than in monitoring on-line body orientation to control arm movement. These results demonstrate that vestibular signals contribute to motor control of voluntary arm movements and suggest that the processes underlying on-line regulation of goal-directed movements are different from those underlying navigation-like behaviors.


NeuroImage | 2014

A key region in the human parietal cortex for processing proprioceptive hand feedback during reaching movements

A Reichenbach; Axel Thielscher; Angelika Peer; Hh Bülthoff; Jean-Pierre Bresciani

Seemingly effortlessly, we adjust our movements to continuously changing environments. After initiation of a goal-directed movement, the motor command is under constant control of sensory feedback loops. The main sensory signals contributing to movement control are vision and proprioception. Recent neuroimaging studies have focused mainly on identifying the parts of the posterior parietal cortex (PPC) that contribute to visually guided movements. We used event-related TMS and force perturbations of the reaching hand to test whether the same sub-regions of the left PPC contribute to the processing of proprioceptive-only and of multi-sensory information about hand position when reaching for a visual target. TMS over two distinct stimulation sites elicited differential effects: TMS applied over the posterior part of the medial intraparietal sulcus (mIPS) compromised reaching accuracy when proprioception was the only sensory information available for correcting the reaching error. When visual feedback of the hand was available, TMS over the anterior intraparietal sulcus (aIPS) prolonged reaching time. Our results show for the first time the causal involvement of the posterior mIPS in processing proprioceptive feedback for online reaching control, and demonstrate that distinct cortical areas process proprioceptive-only and multi-sensory information for fast feedback corrections.

Collaboration


Dive into Jean-Pierre Bresciani's collaborations.

Top Co-Authors

Jean Blouin

Aix-Marseille University

Gabriel M. Gauthier

Centre national de la recherche scientifique

Michel Guerraz

Centre national de la recherche scientifique

Axel Thielscher

Technical University of Denmark