
Publications


Featured research published by Pierre Baraduc.


Experimental Brain Research | 1999

Parieto-frontal coding of reaching: an integrated framework

Yves Burnod; Pierre Baraduc; Alexandra Battaglia-Mayer; Emmanuel Guigon; Etienne Koechlin; Stefano Ferraina; Francesco Lacquaniti; Roberto Caminiti

In the last few years, anatomical and physiological studies have provided new insights into the organization of the parieto-frontal network underlying visually guided arm-reaching movements in at least three domains. (1) Network architecture. It has been shown that the different classes of neurons encoding information relevant to reaching are not confined within individual cortical areas, but are common to different areas, which are generally linked by reciprocal association connections. (2) Representation of information. There is evidence suggesting that reach-related populations of neurons do not encode relevant parameters within pure sensory or motor “reference frames”, but rather combine them within hybrid dimensions. (3) Visuomotor transformation. It has been proposed that the computation of motor commands for reaching occurs as a simultaneous recruitment of discrete populations of neurons sharing similar properties in different cortical areas, rather than as a serial process from vision to movement, engaging different areas at different times. The goal of this paper was to link experimental (neurophysiological and neuroanatomical) and computational aspects within an integrated framework to illustrate how different neuronal populations in the parieto-frontal network operate a collective and distributed computation for reaching. In this framework, all dynamic (tuning, combinatorial, computational) properties of units are determined by their location relative to three main functional axes of the network: the visual-to-somatic, position-direction, and sensory-motor axes. The visual-to-somatic axis is defined by gradients of activity symmetrical to the central sulcus and distributed over both frontal and parietal cortices. At least four sets of reach-related signals (retinal, gaze, arm position/movement direction, muscle output) are represented along this axis. This architecture defines informational domains where neurons combine different inputs.
The position-direction axis is identified by the regular distribution of information over large populations of neurons processing both positional and directional signals (concerning the arm, gaze, visual stimuli, etc.). Therefore, the activity of gaze- and arm-related neurons can represent virtual three-dimensional (3D) pathways for gaze shifts or hand movement. Virtual 3D pathways are thus defined by a combination of directional and positional information. The sensory-motor axis is defined by neurons displaying different temporal relationships with the different reach-related signals, such as target presentation, preparation for intended arm movement, onset of movements, etc. These properties reflect the computation performed by local networks, which are formed by two types of processing units: matching and condition units. Matching units relate different neural representations of virtual 3D pathways for gaze or hand, and can predict motor commands and their sensory consequences. Depending on the units involved, different matching operations can be learned in the network, resulting in the acquisition of different visuo-motor transformations, such as those underlying reaching to foveated targets, reaching to extrafoveal targets, and visual tracking of hand movement trajectory. Condition units link these matching operations to reinforcement contingencies and therefore can shape the collective neural recruitment along the three axes of the network. This will result in a progressive match of retinal, gaze, arm, and muscle signals suitable for moving the hand toward the target.


Current Biology | 2004

Consolidation of Dynamic Motor Learning Is Not Disrupted by rTMS of Primary Motor Cortex

Pierre Baraduc; Nicolas Lang; John C. Rothwell; Daniel M. Wolpert

Motor skills, once learned, are often retained over a long period of time. However, such learning first undergoes a period of consolidation after practice. During this time, the motor memory is susceptible to being disrupted by the performance of another motor-learning task. Recently, it was shown that repetitive transcranial magnetic stimulation (rTMS) over the primary motor cortex could disrupt the retention of a newly learned ballistic task in which subjects had to oppose their index finger and thumb as rapidly as possible. Here we investigate whether the motor cortex is similarly involved during the consolidation that follows learning novel dynamics. We applied rTMS to primary motor cortex shortly after subjects had either learned to compensate for a dynamic force field applied to their index finger or learned a ballistic finger abduction task. rTMS severely degraded the retention of the learning for the ballistic task but had no effect on retention of the dynamic force-field learning. This suggests that, unlike learning of simple ballistic skills, learning of dynamics may be stored in a more distributed manner, possibly outside the primary motor cortex.


Movement Disorders | 2006

“Paradoxical Kinesis” is not a Hallmark of Parkinson's disease but a general property of the motor system

Bénédicte Ballanger; Stéphane Thobois; Pierre Baraduc; Robert S. Turner; Emmanuel Broussolle; Michel Desmurget

Although slowness of movement is a typical feature of Parkinson's disease (PD), it has been suggested that severely disabled patients remain able to produce normal motor responses in the context of urgent or externally driven situations. To investigate this phenomenon (often designated “paradoxical kinesis”), we required PD patients and healthy subjects to press a large switch under three main conditions: Self Generated, produce the fastest possible movement; External Cue, same as Self Generated but in response to an acoustic cue; Urgent External Cue, same as External Cue with the switch controlling an electromagnet that prevented a ball from falling at the bottom of a tilted ramp. Task difficulty was equalized for the two experimental groups. Results showed that external cues and urgent conditions decreased movement duration (Urgent External Cue < External Cue < Self Generated) and reaction time (Urgent External Cue < External Cue). The amount of reduction was identical in PD patients and healthy subjects. These observations show that paradoxical kineses are not a hallmark of PD or a byproduct of basal ganglia dysfunctions, but a general property of the motor system.


Proceedings of the National Academy of Sciences of the United States of America | 2014

Neural representations of ethologically relevant hand/mouth synergies in the human precentral gyrus

Michel Desmurget; Nathalie Richard; Sylvain Harquel; Pierre Baraduc; A. Szathmari; C. Mottolese; Angela Sirigu

Significance: The motor repertoire of infants is narrow. Yet newborns can accurately bring their hands toward their mouth for self-feeding, thumb-sucking, or perioral exploration, thus showing fine coordinated movement synergies between the hand and mouth. Here, we show that these gestures of high ethological value are selectively encoded in the human brain and represented as integrated primitives within the precentral gyrus, a key region for sensorimotor processing. These findings have major implications for our understanding of the organization and phylogenesis of motor functions in primates.

Complex motor responses are often thought to result from the combination of elemental movements represented at different neural sites. However, in monkeys, evidence indicates that some behaviors with critical ethological value, such as self-feeding, are represented as motor primitives in the precentral gyrus (PrG). In humans, such primitives have not yet been described. This could reflect well-known interspecies differences in the organization of sensorimotor regions (including PrG) or the difficulty of identifying complex neural representations in peroperative settings. To settle this alternative, we focused on the neural bases of hand/mouth synergies, a prominent example of human behavior with high ethological value. By recording motor- and somatosensory-evoked potentials in the PrG of patients undergoing brain surgery (2–60 y), we show that two complex nested neural representations can mediate hand/mouth actions within this structure: (i) a motor representation, resembling self-feeding, where electrical stimulation causes the closing hand to approach the opening mouth, and (ii) a motor–sensory representation, likely associated with perioral exploration, where cross-signal integration is accomplished at a cortical site that generates hand/arm actions while receiving mouth sensory inputs. The first finding extends to humans previous observations made in monkeys.
The second provides evidence that complex neural representations also exist for perioral exploration, a finely tuned skill requiring the combination of motor and sensory signals within a common control loop. These representations likely underlie the ability of human children and newborns to accurately produce coordinated hand/mouth movements, in an otherwise general context of motor immaturity.


Experimental Brain Research | 1998

Early motor influences on visuomotor transformations for reaching: a positive image of optic ataxia

Alexandra Battaglia-Mayer; Stefano Ferraina; Barbara Marconi; James B. Bullis; Francesco Lacquaniti; Yves Burnod; Pierre Baraduc; Roberto Caminiti

Coding of reaching in the cerebral cortex is based on the operation of distributed populations of parietal and frontal neurons, whose main functional characteristics reside in their combinatorial power, i.e., in the capacity for combining different information related to the spatial aspects of reaching. The tangential distribution of reach-related neurons endowed with different functional properties changes gradually in the cortex and defines, in the parieto-frontal network, trends of functional properties. These visual-to-somatic gradients imply the existence of cortical regions of functional overlap, i.e., of combinatorial domains, where the integration of different reach-related signals occurs. Studies of early coding of reaching in the mesial parietal areas show how somatomotor information, such as that related to arm posture and movement, influences neuronal activity in the very early stages of the visuomotor transformation underlying the composition of the motor command and is not added “downstream” in the frontal cortex. This influence is probably due to re-entrant signals traveling through fronto-parietal association connections. Together with the gradient architecture of the network and the reciprocity of cortico-cortical connections, this implies that coding of reaching cannot be regarded as a top-down, serial sequence of coordinate transformations, each performed by a given cortical area, but as a recursive process, where different signals are progressively matched and further elaborated locally, due to intrinsic cortical connections. This model of reaching is also supported by psychophysical studies stressing the parallel processing of the different relevant parameters and the “hybrid” nature of the reference frame where they are combined. The theoretical frame presented here can also offer a background for a new interpretation of a well-known visuomotor disorder, due to superior parietal lesions, i.e., optic ataxia.
More than a disconnection syndrome, this can now be interpreted as the consequence of the breakdown of the operations occurring in the combinatorial domains of the superior parietal segment of the parieto-frontal network.


Brain Research | 2000

Hand orientation for grasping depends on the direction of the reaching movement

Agnès Roby-Brami; Nezha Bennis; Mounir Mokhtari; Pierre Baraduc

The 3D orientation of the hand for grasping was studied while subjects reached for objects placed at several locations on a horizontal board, with movements starting from three initial hand positions. The hand movements were recorded with electromagnetic sensors giving 3D position and orientation information. The study focused on the azimuth, which is the projection of the hand orientation in a horizontal plane. The hand azimuth for grasping was linearly correlated with the direction of the reaching movement and not with the object direction in head- or shoulder-centered coordinates. This relationship was valid regardless of the initial hand position. A control experiment with constant movement direction showed a weaker, probably postural, effect of object direction in shoulder-centered coordinates. We suggest that hand orientation for grasping is mainly controlled in relation to the reaching movement direction.


PLOS ONE | 2011

What are they up to? The role of sensory evidence and prior knowledge in action understanding

Valerian Chambon; Philippe Domenech; Elisabeth Pacherie; Etienne Koechlin; Pierre Baraduc; Chlöé Farrer

Explaining or predicting the behaviour of our conspecifics requires the ability to infer the intentions that motivate it. Such inferences are assumed to rely on two types of information: (1) the sensory information conveyed by movement kinematics and (2) the observer's prior expectations, acquired from past experience or derived from prior knowledge. However, the respective contribution of these two sources of information is still controversial. This controversy stems in part from the fact that “intention” is an umbrella term that may embrace various sub-types, each being assigned different scopes and targets. We hypothesized that variations in the scope and target of intentions may account for variations in the contribution of visual kinematics and prior knowledge to the intention inference process. To test this hypothesis, we conducted four behavioural experiments in which participants were instructed to identify different types of intention: basic intentions (i.e., the simple goal of a motor act), superordinate intentions (i.e., the general goal of a sequence of motor acts), or social intentions (i.e., intentions accomplished in a context of reciprocal interaction). For each of the above-mentioned intentions, we varied (1) the amount of visual information available from the action scene and (2) participants' prior expectations concerning the intention that was more likely to be accomplished. First, we showed that intentional judgments depend on a consistent interaction between visual information and participants' prior expectations. Moreover, we demonstrated that this interaction varied according to the type of intention to be inferred, with participants' priors rather than perceptual evidence exerting a greater effect on the inference of social and superordinate intentions. The results are discussed by appealing to the specific properties of each type of intention considered and further interpreted in the light of a hierarchical model of action representation.


Journal of Computational Neuroscience | 2008

Optimality, stochasticity, and variability in motor behavior

Emmanuel Guigon; Pierre Baraduc; Michel Desmurget

Recent theories of motor control have proposed that the nervous system acts as a stochastically optimal controller, i.e. it plans and executes motor behaviors taking into account the nature and statistics of noise. Detrimental effects of noise are converted into a principled way of controlling movements. Attractive aspects of such theories are their ability to explain not only characteristic features of single motor acts, but also statistical properties of repeated actions. Here, we present a critical analysis of stochastic optimality in motor control which reveals several difficulties with this hypothesis. We show that stochastic control may not be necessary to explain the stochastic nature of motor behavior, and we propose an alternative framework, based on the action of a deterministic controller coupled with an optimal state estimator, which relieves drawbacks of stochastic optimality and appropriately explains movement variability.
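The alternative framework described above, a deterministic controller coupled with an optimal state estimator, can be illustrated with a minimal sketch (not the authors' model): a 1-D reach in which a fixed proportional control law acts on a Kalman estimate of hand position, so trial-to-trial endpoint variability arises only from execution and measurement noise, never from a stochastic policy. All gains and noise levels below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# 1-D reach toward a target. The control law is deterministic (a fixed
# proportional gain on the estimated error); a scalar Kalman filter
# provides the state estimate. Endpoint variability then comes only
# from execution and measurement noise, not from the controller.
target, n_steps, n_trials = 1.0, 50, 200
q, r = 0.001, 0.01       # execution (process) and measurement noise variances
k_ctrl = 0.2             # deterministic proportional gain (illustrative)

finals = []
for _ in range(n_trials):
    x = 0.0              # true hand position
    xhat, p = 0.0, 1.0   # Kalman mean and variance
    for _ in range(n_steps):
        u = k_ctrl * (target - xhat)              # deterministic controller
        x = x + u + rng.normal(0.0, np.sqrt(q))   # noisy execution
        xhat, p = xhat + u, p + q                 # Kalman predict
        y = x + rng.normal(0.0, np.sqrt(r))       # noisy observation
        k_gain = p / (p + r)                      # Kalman update
        xhat += k_gain * (y - xhat)
        p *= 1.0 - k_gain
    finals.append(x)

finals = np.array(finals)
print(f"endpoint mean {finals.mean():.3f}, endpoint sd {finals.std():.3f}")
```

Repeated runs of this loop reproduce the qualitative point of the abstract: a fully deterministic policy still yields a distribution of endpoints once noise enters sensing and execution.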


PLOS ONE | 2014

Comparison of Classifiers for Decoding Sensory and Cognitive Information from Prefrontal Neuronal Populations

Elaine Astrand; Pierre Enel; Guilhem Ibos; Peter Ford Dominey; Pierre Baraduc; Suliann Ben Hamed

Decoding neuronal information is important in neuroscience, both as a basic means to understand how neuronal activity is related to cerebral function and as a processing stage in driving neuroprosthetic effectors. Here, we compare the readout performance of six commonly used classifiers at decoding two different variables encoded by the spiking activity of the non-human primate frontal eye fields (FEF): the spatial position of a visual cue, and the instructed orientation of the animal's attention. While the first variable is exogenously driven by the environment, the second variable corresponds to the interpretation of the instruction conveyed by the cue; it is endogenously driven and corresponds to the output of internal cognitive operations performed on the visual attributes of the cue. These two variables were decoded using either a regularized optimal linear estimator in its explicit formulation, an optimal linear artificial neural network estimator, a non-linear artificial neural network estimator, a non-linear naïve Bayesian estimator, a non-linear Reservoir recurrent network classifier, or a non-linear Support Vector Machine classifier. Our results suggest that endogenous information such as the orientation of attention can be decoded from the FEF with the same accuracy as exogenous visual information. However, the classifiers did not all behave equally in the face of population size and heterogeneity, the available training and testing trials, the subjects' behavior, and the temporal structure of the variable of interest. In most situations, the regularized optimal linear estimator and the non-linear Support Vector Machine classifier outperformed the other tested decoders.
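As a toy illustration of this kind of decoder comparison (not the authors' code or data), the sketch below pits two of the listed approaches, a ridge-regularized linear estimator and an independent-Poisson naïve Bayes decoder, against synthetic spike counts from a hypothetical tuned population. Population size, class count, and tuning ranges are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical population: 40 neurons whose Poisson firing rates are
# tuned to one of 4 cue positions (all sizes are illustrative).
n_neurons, n_classes, n_trials = 40, 4, 200
tuning = rng.uniform(2.0, 10.0, size=(n_classes, n_neurons))
labels = rng.integers(0, n_classes, size=n_trials)
counts = rng.poisson(tuning[labels])          # trials x neurons spike counts

train, test = slice(0, 150), slice(150, None)

# 1) Regularized (ridge) linear estimator with one-hot targets,
#    solved in closed form from the normal equations.
X = counts[train].astype(float)
Y = np.eye(n_classes)[labels[train]]
lam = 1.0
W = np.linalg.solve(X.T @ X + lam * np.eye(n_neurons), X.T @ Y)
lin_pred = np.argmax(counts[test] @ W, axis=1)

# 2) Naive Bayesian decoder assuming independent Poisson neurons:
#    log P(x | c) = sum_i (x_i * log r_ci - r_ci) + const.
rates = np.stack([counts[train][labels[train] == c].mean(axis=0) + 1e-6
                  for c in range(n_classes)])
loglik = counts[test] @ np.log(rates).T - rates.sum(axis=1)
bayes_pred = np.argmax(loglik, axis=1)

lin_acc = (lin_pred == labels[test]).mean()
bayes_acc = (bayes_pred == labels[test]).mean()
print(f"ridge linear: {lin_acc:.2f}  Poisson naive Bayes: {bayes_acc:.2f}")
```

Varying the number of neurons, the trial budget, or the tuning overlap in this sketch reproduces in miniature the abstract's point that decoder rankings depend on population size and on the available training and testing trials.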


NeuroImage | 2007

Functional anatomy of motor urgency

Stéphane Thobois; Bénédicte Ballanger; Pierre Baraduc; Didier Le Bars; Franck Lavenne; Emmanuel Broussolle; Michel Desmurget

This PET H(2)(15)O study uses a reaching task to determine the neural basis of the unconscious motor speed up observed in the context of urgency in healthy subjects. Three conditions were considered: self-initiated (produce the fastest possible movement toward a large plate, when ready), externally-cued (same as self-initiated but in response to an acoustic cue) and temporally-pressing (same as externally-cued with the plate controlling an electromagnet that prevented a rolling ball from falling at the bottom of a tilted ramp). Results show that: (1) Urgent responses (Temporally-pressing versus Externally-cued) engage the left parasagittal and lateral cerebellar hemisphere and the sensorimotor cortex (SMC) bilaterally; (2) Externally-driven responses (Externally-cued versus Self-initiated) recruit executive areas within the contralateral SMC; (3) Volitional responses (Self-initiated versus Externally-cued) involve prefrontal cortical areas. These observations are discussed with respect to the idea that neuromuscular energy is set to a submaximal threshold in self-determined situations. In more challenging tasks, this threshold is raised and the first answer of the nervous system is to optimize the response of the lateral (i.e. crossed) corticospinal tract (contralateral SMC) and ipsilateral cerebellum. In a second step, the anterior (i.e. uncrossed) corticospinal tract (ipsilateral SMC) and the contralateral cerebellum are recruited. This recruitment is akin to the strategy observed during recovery in patients with brain lesions.

Collaboration


Pierre Baraduc's top co-authors:

Michel Desmurget (Centre national de la recherche scientifique)
Emmanuel Broussolle (Centre national de la recherche scientifique)
Bénédicte Ballanger (Centre national de la recherche scientifique)
Stéphane Thobois (Centre national de la recherche scientifique)
Jean-René Duhamel (Centre national de la recherche scientifique)
Sylvia Wirth (Centre national de la recherche scientifique)
Francesco Lacquaniti (University of Rome Tor Vergata)
Roberto Caminiti (Sapienza University of Rome)