Publication


Featured research published by Christopher A. Buneo.


Neuroreport | 2003

Neural prosthetic control signals from plan activity

Krishna V. Shenoy; Daniella Meeker; Shiyan Cao; Sohaib A. Kureshi; Bijan Pesaran; Christopher A. Buneo; Aaron P. Batista; Partha P. Mitra; Joel W. Burdick; Richard A. Andersen

The prospect of assisting disabled patients by translating neural activity from the brain into control signals for prosthetic devices has flourished in recent years. Current systems rely on neural activity present during natural arm movements. We propose here that neural activity present before or even without natural arm movements can provide an important, and potentially advantageous, source of control signals. To demonstrate how control signals can be derived from such plan activity, we performed a computational study with neural activity previously recorded from the posterior parietal cortex of rhesus monkeys planning arm movements. We employed maximum likelihood decoders to estimate movement direction and to drive finite state machines governing when to move. Performance exceeded 90% with as few as 40 neurons.
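The maximum-likelihood direction-decoding step summarized in this abstract can be sketched in a few lines. The following is a hypothetical illustration, not the study's code: it assumes Poisson-spiking neurons with cosine tuning, eight candidate reach directions, and a 0.5 s decoding window, all invented parameters; the finite state machine governing when to move is omitted.

```python
import math
import random

# Hypothetical sketch of maximum-likelihood decoding of reach direction
# from delay-period "plan" activity; all parameters are illustrative.

DIRECTIONS = [i * math.pi / 4 for i in range(8)]  # candidate reach directions
WINDOW = 0.5  # decoding window in seconds

def mean_rate(pref, direction):
    """Cosine-tuned, half-wave-rectified mean firing rate (spikes/s)."""
    return 5.0 + 15.0 * max(0.0, math.cos(direction - pref))

def poisson(lam):
    """Draw one Poisson-distributed spike count (Knuth's method)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while p > limit:
        k += 1
        p *= random.random()
    return k - 1

def log_likelihood(counts, prefs, direction):
    """Poisson log-likelihood of the observed counts given a direction."""
    total = 0.0
    for n, pref in zip(counts, prefs):
        lam = mean_rate(pref, direction) * WINDOW
        total += n * math.log(lam) - lam - math.lgamma(n + 1)
    return total

def ml_decode(counts, prefs):
    """Return the candidate direction that maximizes the likelihood."""
    return max(DIRECTIONS, key=lambda d: log_likelihood(counts, prefs, d))

# Simulate 40 neurons whose plan activity encodes a reach toward pi/2,
# then decode the planned direction from their delay-period spike counts.
random.seed(0)
prefs = [random.uniform(0.0, 2.0 * math.pi) for _ in range(40)]
true_dir = math.pi / 2
counts = [poisson(mean_rate(p, true_dir) * WINDOW) for p in prefs]
decoded = ml_decode(counts, prefs)
```

With on the order of 40 tuned neurons, the likelihood over the eight candidates concentrates near the planned direction, consistent with the high decoding performance the abstract reports.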


Experimental Brain Research | 1995

On the form of the internal model for reaching

Christopher A. Buneo; Jyl Boline; John F. Soechting; Richard E. Poppele

Using simulations, we investigated possible mechanisms responsible for the errors in the direction of arm movements exhibited by deafferented patients. Two aspects of altered feedforward control were evaluated: the inability to sense initial conditions and the degradation of an internal model. A simulation that assumed no compensation for variations in initial arm configuration failed to reproduce the characteristic pattern of errors. In contrast, a simulation that assumed random variability in the generation of joint torque resulted in a distribution of hand paths that resembled some aspects of the pattern of errors exhibited by deafferented patients.


Experimental Brain Research | 2008

Time-invariant reference frames for parietal reach activity

Christopher A. Buneo; Aaron P. Batista; Murray R. Jarvis; Richard A. Andersen

Neurophysiological studies suggest that the transformation of visual signals into arm movement commands does not involve a sequential recruitment of the various reach-related regions of the cerebral cortex but a largely simultaneous activation of these areas, which form a distributed and recurrent visuomotor network. However, little is known about how the reference frames used to encode reach-related variables in a given “node” of this network vary with the time taken to generate a behavioral response. Here we show that in an instructed delay reaching task, the reference frames used to encode target location in the parietal reach region (PRR) and area 5 of the posterior parietal cortex (PPC) do not evolve dynamically in time; rather the same spatial representation exists within each area from the time target-related information is first instantiated in the network until the moment of movement execution. As previously reported, target location was encoded predominantly in eye coordinates in PRR and in both eye and hand coordinates in area 5. Thus, the different computational stages of the visuomotor transformation for reaching appear to coexist simultaneously in the parietal cortex, which may facilitate the rapid adjustment of trajectories that are a hallmark of skilled reaching behavior.


PLOS ONE | 2011

The proprioceptive map of the arm is systematic and stable, but idiosyncratic

Liliana Rincon-Gonzalez; Christopher A. Buneo; Stephen I. Helms Tillery

Visual and somatosensory signals participate together in providing an estimate of the hand's spatial location. While the ability of subjects to identify the spatial location of their hand based on visual and proprioceptive signals has previously been characterized, relatively few studies have examined in detail the spatial structure of the proprioceptive map of the arm. Here, we reconstructed and analyzed the spatial structure of the estimation errors that resulted when subjects reported the location of their unseen hand across a 2D horizontal workspace. Hand position estimation was mapped under four conditions: with and without tactile feedback, and with the right and left hands. In the task, we moved each subject's hand to one of 100 targets in the workspace while their eyes were closed. Then, we either a) applied tactile stimulation to the fingertip by allowing the index finger to touch the target or b) as a control, hovered the fingertip 2 cm above the target. After returning the hand to a neutral position, subjects opened their eyes to verbally report where their fingertip had been. We measured and analyzed both the direction and magnitude of the resulting estimation errors. Tactile feedback reduced the magnitude of these estimation errors, but did not change their overall structure. In addition, the spatial structure of these errors was idiosyncratic: each subject had a unique pattern of errors that was stable between hands and over time. Finally, we found that at the population level the magnitude of the estimation errors had a characteristic distribution over the workspace: errors were smallest closer to the body. The stability of estimation errors across conditions and time suggests the brain constructs a proprioceptive map that is reliable, even if it is not necessarily accurate. The idiosyncrasy across subjects emphasizes that each individual constructs a map that is unique to their own experiences.


Journal of Neurophysiology | 2012

Integration of target and hand position signals in the posterior parietal cortex: effects of workspace and hand vision

Christopher A. Buneo; Richard A. Andersen

Previous findings suggest the posterior parietal cortex (PPC) contributes to arm movement planning by transforming target and limb position signals into a desired reach vector. However, the neural mechanisms underlying this transformation remain unclear. In the present study we examined the responses of 109 PPC neurons as movements were planned and executed to visual targets presented over a large portion of the reaching workspace. In contrast to previous studies, movements were made without concurrent visual and somatic cues about the starting position of the hand. For comparison, a subset of neurons was also examined with concurrent visual and somatic hand position cues. We found that single cells integrated target and limb position information in a very consistent manner across the reaching workspace. Approximately two-thirds of the neurons with significantly tuned activity (42/61 and 30/46 for left and right workspaces, respectively) coded targets and initial hand positions separably, indicating no hand-centered encoding, whereas the remaining one-third coded targets and hand positions inseparably, in a manner more consistent with the influence of hand-centered coordinates. The responses of both types of neurons were largely invariant with respect to the presence or absence of visual hand position cues, suggesting their corresponding coordinate frames and gain effects were unaffected by cue integration. The results suggest that the PPC uses a consistent scheme for computing reach vectors in different parts of the workspace that is robust to changes in the availability of somatic and visual cues about hand position.


Frontiers in Integrative Neuroscience | 2013

Neural correlates of learning and trajectory planning in the posterior parietal cortex

Elizabeth B. Torres; Rodrigo Quian Quiroga; He Cui; Christopher A. Buneo

The posterior parietal cortex (PPC) is thought to play an important role in the planning of visually-guided reaching movements. However, the relative roles of the various subdivisions of the PPC in this function are still poorly understood. For example, studies of dorsal area 5 point to a representation of reaches in both extrinsic (endpoint) and intrinsic (joint or muscle) coordinates, as evidenced by partial changes in preferred directions and positional discharge with changes in arm posture. In contrast, recent findings suggest that the adjacent medial intraparietal area (MIP) is involved in more abstract representations, e.g., encoding reach target in visual coordinates. Such a representation is suitable for planning reach trajectories involving shortest distance paths to targets straight ahead. However, it is currently unclear how MIP contributes to the planning of other types of trajectories, including those with various degrees of curvature. Such curved trajectories recruit different joint excursions and might help us address whether their representation in the PPC is purely in extrinsic coordinates or in intrinsic ones as well. Here we investigated the role of the PPC in these processes during an obstacle avoidance task for which the animals had not been explicitly trained. We found that PPC planning activity was predictive of both the spatial and temporal aspects of upcoming trajectories. The same PPC neurons predicted the upcoming trajectory in both endpoint and joint coordinates. The predictive power of these neurons remained stable and accurate despite concomitant motor learning across task conditions. These findings suggest the role of the PPC can be extended from specifying abstract movement goals to expressing these plans as corresponding trajectories in both endpoint and joint coordinates. Thus, the PPC appears to contribute to reach planning and approach-avoidance arm motions at multiple levels of representation.


Journal of Neurophysiology | 2013

Multimodal representation of limb endpoint position in the posterior parietal cortex

Ying Shi; Gregory Apker; Christopher A. Buneo

Understanding the neural representation of limb position is important for comprehending the control of limb movements and the maintenance of body schema, as well as for the development of neuroprosthetic systems designed to replace lost limb function. Multiple subcortical and cortical areas contribute to this representation, but its multimodal basis has largely been ignored. Regarding the parietal cortex, previous results suggest that visual information about arm position is not strongly represented in area 5, although these results were obtained under conditions in which animals were not using their arms to interact with objects in their environment, which could have affected the relative weighting of relevant sensory signals. Here we examined the multimodal basis of limb position in the superior parietal lobule (SPL) as monkeys reached to and actively maintained their arm position at multiple locations in a frontal plane. On half of the trials both visual and nonvisual feedback of the endpoint of the arm were available, while on the other trials visual feedback was withheld. Many neurons were tuned to arm position, while a smaller number were modulated by the presence/absence of visual feedback. Visual modulation generally took the form of a decrease in both firing rate and variability with limb vision and was associated with more accurate decoding of position at the population level under these conditions. These findings support a multimodal representation of limb endpoint position in the SPL but suggest that visual signals are relatively weakly represented in this area, and only at the population level.
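The population-level effect this abstract describes, in which lower response variability with limb vision yields more accurate decoding, can be illustrated with a minimal sketch. This is not the paper's analysis: the Gaussian positional tuning, neuron counts, and noise levels are all assumed, and "vision" trials are modeled simply as trials with lower additive response noise.

```python
import math
import random

# Illustrative sketch: reduced response variability ("limb vision")
# improves population decoding of a 1D endpoint position. All
# parameters are assumptions for demonstration only.

random.seed(2)
POSITIONS = [i * 2.0 for i in range(11)]  # candidate endpoint positions (cm)
PREFS = [random.uniform(0.0, 20.0) for _ in range(60)]  # preferred positions

def tuning(pref, pos, width=5.0):
    """Gaussian positional tuning curve (spikes/s)."""
    return 10.0 + 20.0 * math.exp(-((pos - pref) ** 2) / (2.0 * width ** 2))

def decode(responses):
    """Template matching: candidate position minimizing squared error."""
    def sse(pos):
        return sum((r - tuning(p, pos)) ** 2 for r, p in zip(responses, PREFS))
    return min(POSITIONS, key=sse)

def rmse(noise_sd, trials=300):
    """Root-mean-square decoding error at a given response noise level."""
    total = 0.0
    for _ in range(trials):
        true_pos = random.choice(POSITIONS)
        responses = [tuning(p, true_pos) + random.gauss(0.0, noise_sd)
                     for p in PREFS]
        total += (decode(responses) - true_pos) ** 2
    return math.sqrt(total / trials)

err_vision = rmse(noise_sd=4.0)     # lower variability with limb vision
err_no_vision = rmse(noise_sd=8.0)  # higher variability without vision
```

Because the single-neuron tuning curves themselves are unchanged between conditions, the improvement appears only when responses are pooled across the population, mirroring the abstract's conclusion that visual signals are expressed weakly and mainly at the population level.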


Journal of Neurophysiology | 2010

Interacting Noise Sources Shape Patterns of Arm Movement Variability in Three-Dimensional Space

Gregory Apker; Timothy K. Darling; Christopher A. Buneo

Reaching movements are subject to noise in both the planning and execution phases of movement production. The interaction of these noise sources during natural movements is not well understood, despite its importance for understanding movement variability in neurologically intact and impaired individuals. Here we examined the interaction of planning and execution related noise during the production of unconstrained reaching movements. Subjects performed sequences of two movements to targets arranged in three vertical planes separated in depth. The starting position for each sequence was also varied in depth with the target plane; thus required movement sequences were largely contained within the vertical plane of the targets. Each final target in a sequence was approached from two different directions, and these movements were made with or without visual feedback of the moving hand. These combined aspects of the design allowed us to probe the interaction of execution and planning related noise with respect to reach endpoint variability. In agreement with previous studies, we found that reach endpoint distributions were highly anisotropic. The principal axes of movement variability were largely aligned with the depth axis, i.e., the axis along which visual planning related noise would be expected to dominate, and were not generally well aligned with the direction of the movement vector. Our results suggest that visual planning-related noise plays a dominant role in determining anisotropic patterns of endpoint variability in three-dimensional space, with execution noise adding to this variability in a movement direction-dependent manner.


International Conference of the IEEE Engineering in Medicine and Biology Society | 2009

Exploring the role of sensor noise in movement variability

Ying Shi; Christopher A. Buneo

Numerical simulations were used to explore the consequences of a spatially non-uniform sense of hand position on arm movements in the horizontal plane. Isotropic or anisotropic position errors were introduced into several starting hand positions and the resulting errors in movement direction were quantified. Two separate simulations were performed. In one simulation planned movement directions were defined relative to the starting position of the hand. Movement errors generated in this simulation resulted from a failure to compensate for differing initial conditions. In a second simulation planned movement directions were defined by the vector joining the sensed starting position with a fixed target position. Movement errors in this simulation resulted from both uncompensated changes in initial conditions as well as errors in movement planning. In both simulations, directional error variability generally increased for starting positions closer to the body. These effects were most pronounced for the anisotropic distribution of starting positions, particularly under conditions where movements were directed toward a fixed spatial location.
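The second simulation described above, in which the planned direction is the vector from the sensed starting position to a fixed target, can be sketched as follows. The geometry, noise magnitudes, and trial counts here are assumed values, not the paper's parameters; the sketch shows only the basic mechanism by which noise in sensed hand position converts to directional error.

```python
import math
import random

# Minimal sketch: directional errors arising when the planned reach
# direction is computed from a noisily sensed starting hand position.
# Geometry and noise levels are assumptions for illustration.

random.seed(1)
TARGET = (0.0, 40.0)  # fixed target location (cm)

def direction(frm, to):
    """Direction (radians) of the vector from one point to another."""
    return math.atan2(to[1] - frm[1], to[0] - frm[0])

def directional_error(true_start, sigma_x, sigma_y):
    """Angle (degrees) between planned and ideal movement directions."""
    sensed = (true_start[0] + random.gauss(0.0, sigma_x),
              true_start[1] + random.gauss(0.0, sigma_y))
    planned = direction(sensed, TARGET)    # plan uses the sensed start
    ideal = direction(true_start, TARGET)  # ideal uses the true start
    err = math.degrees(planned - ideal)
    return (err + 180.0) % 360.0 - 180.0   # wrap to [-180, 180)

def error_sd(true_start, sigma=1.0, trials=2000):
    """Standard deviation of directional error across simulated trials."""
    errs = [directional_error(true_start, sigma, sigma) for _ in range(trials)]
    mean = sum(errs) / len(errs)
    return math.sqrt(sum((e - mean) ** 2 for e in errs) / len(errs))

# The same positional noise produces larger directional variability for
# starting positions closer to the fixed target (shorter movement vectors).
sd_far = error_sd((0.0, 0.0))    # start 40 cm from the target
sd_near = error_sd((0.0, 30.0))  # start 10 cm from the target
```

Note that this sketch captures only the geometric conversion of positional noise into directional noise; the position-dependent effects reported in the abstract additionally depend on the isotropic versus anisotropic structure of the sensed-position errors, which is not modeled here.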


International Conference of the IEEE Engineering in Medicine and Biology Society | 2011

Neural mechanisms of limb position estimation in the primate brain

Ying Shi; Christopher A. Buneo

Understanding the neural mechanisms of limb position estimation is important both for comprehending the neural control of goal-directed arm movements and for developing neuroprosthetic systems designed to replace lost limb function. Here we examined the role of area 5 of the posterior parietal cortex in estimating limb position based on visual and somatic (proprioceptive, efference copy) signals. Single unit recordings were obtained as monkeys reached to visual targets presented in a semi-immersive virtual reality environment. On half of the trials animals were required to maintain their limb position at these targets while receiving both visual and non-visual feedback of their arm position, while on the other trials visual feedback was withheld. When examined individually, many area 5 neurons were tuned to the position of the limb in the workspace but very few neurons modulated their firing rates based on the presence/absence of visual feedback. At the population level, however, decoding of limb position was somewhat more accurate when visual feedback was provided. These findings support a role for area 5 in limb position estimation but also suggest that visual signals regarding limb position are only weakly represented in this area, and only at the population level.

Collaboration


Dive into Christopher A. Buneo's collaboration.

Top Co-Authors

Richard A. Andersen
California Institute of Technology

Gregory Apker
Arizona State University

Ying Shi
California Institute of Technology

Murray R. Jarvis
California Institute of Technology

Bijan Pesaran
Center for Neural Science

Daniella Meeker
University of Southern California