
Publications


Featured research published by Bijan Pesaran.


Nature Neuroscience | 2002

Temporal structure in neuronal activity during working memory in macaque parietal cortex.

Bijan Pesaran; John S. Pezaris; Maneesh Sahani; Partha P. Mitra; Richard A. Andersen

Many cortical structures have elevated firing rates during working memory, but it is not known how the activity is maintained. To investigate whether reverberating activity is important, we studied the temporal structure of local field potential (LFP) activity and spiking from area LIP in two awake macaques during a memory-saccade task. Using spectral analysis, we found spatially tuned elevated power in the gamma band (25–90 Hz) in LFP and spiking activity during the memory period. Spiking and LFP activity were also coherent in the gamma band but not at lower frequencies. Finally, we decoded LFP activity on a single-trial basis and found that LFP activity in parietal cortex discriminated between preferred and anti-preferred direction with approximately the same accuracy as the spike rate and predicted the time of a planned movement with better accuracy than the spike rate. This finding could accelerate the development of a cortical neural prosthesis.
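For readers who want a concrete sense of this kind of analysis, the sketch below computes an LFP power spectrum and the spike-field coherence between a spike train and the LFP on synthetic data. It is an illustration only: the published analysis used multitaper spectral estimates of recorded LIP activity, whereas Welch's method stands in here, and the sampling rate, trial length and signals are all assumed.

```python
# Minimal sketch on synthetic data (not the authors' code): gamma-band LFP
# power via Welch's method and spike-field coherence. Sampling rate, trial
# length and the signals themselves are assumed for illustration.
import numpy as np
from scipy import signal

fs = 1000.0                               # assumed sampling rate, Hz
t = np.arange(0, 1.0, 1 / fs)             # one 1 s "memory period" trial
rng = np.random.default_rng(0)

lfp = np.sin(2 * np.pi * 50 * t) + rng.normal(0, 1, t.size)   # toy LFP with a 50 Hz rhythm
p_spike = 0.02 * (1 + np.sin(2 * np.pi * 50 * t))             # spiking locked to that rhythm
spikes = (rng.random(t.size) < p_spike).astype(float)         # binary spike train

# LFP power spectrum: look for elevated power in the 25-90 Hz gamma band.
f, pxx = signal.welch(lfp, fs=fs, nperseg=256)
gamma = (f >= 25) & (f <= 90)
print("mean gamma-band LFP power:", pxx[gamma].mean())

# Spike-field coherence between the spike train and the LFP.
fc, cxy = signal.coherence(spikes, lfp, fs=fs, nperseg=256)
print("peak gamma-band spike-field coherence:", cxy[(fc >= 25) & (fc <= 90)].max())
```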


Animal Behaviour | 2000

A procedure for an automated measurement of song similarity

Ofer Tchernichovski; Fernando Nottebohm; Ching Elizabeth Ho; Bijan Pesaran; Partha P. Mitra

Assessment of vocal imitation requires a widely accepted way of describing and measuring any similarities between the song of a tutor and that of its pupil. Quantifying the similarity between two songs, however, can be difficult and fraught with subjective bias. We present a fully automated procedure that measures parametrically the similarity between songs. We tested its performance on a large database of zebra finch, Taeniopygia guttata, songs. The procedure uses an analytical framework of modern spectral analysis to characterize the acoustic structure of a song. This analysis provides a superior sound spectrogram that is then reduced to a set of simple acoustic features. Based on these features, the procedure detects similar sections between songs automatically. In addition, the procedure can be used to examine: (1) imitation accuracy across acoustic features; (2) song development; (3) the effect of brain lesions on specific song features; and (4) variability across different renditions of a song or a call produced by the same individual, across individuals and across populations. By making the procedure available we hope to promote the adoption of a standard, automated method for measuring similarity between songs or calls.
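As a rough illustration of the idea (not the published procedure, which uses a multitaper spectrogram and a richer set of acoustic features), the sketch below reduces two waveforms to a single feature trajectory, the spectral centroid, and scores their similarity by correlation. The placeholder waveforms and sampling rate are assumptions.

```python
# Illustrative sketch only: one simple acoustic feature (spectral centroid
# over time) per song, with similarity scored by correlation of the two
# feature trajectories.
import numpy as np
from scipy import signal

def spectral_centroid(song, fs):
    """Per-frame spectral centroid of a waveform."""
    f, t, sxx = signal.spectrogram(song, fs=fs, nperseg=512, noverlap=256)
    power = sxx + 1e-12
    return (f[:, None] * power).sum(axis=0) / power.sum(axis=0)

def similarity(song_a, song_b, fs=44100):
    """Correlation of centroid trajectories, truncated to the shorter song."""
    ca, cb = spectral_centroid(song_a, fs), spectral_centroid(song_b, fs)
    n = min(ca.size, cb.size)
    return np.corrcoef(ca[:n], cb[:n])[0, 1]

rng = np.random.default_rng(1)
tutor = rng.normal(size=44100)                    # placeholder waveform, 1 s
pupil = tutor + 0.3 * rng.normal(size=44100)      # noisy "imitation"
print("similarity score:", similarity(tutor, pupil))
```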


Nature | 2008

Free choice activates a decision circuit between frontal and parietal cortex

Bijan Pesaran; Matthew J. Nelson; Richard A. Andersen

We often face alternatives that we are free to choose between. Planning movements to select an alternative involves several areas in frontal and parietal cortex that are anatomically connected into long-range circuits. These areas must coordinate their activity to select a common movement goal, but how neural circuits make decisions remains poorly understood. Here we simultaneously record from the dorsal premotor area (PMd) in frontal cortex and the parietal reach region (PRR) in parietal cortex to investigate neural circuit mechanisms for decision making. We find that correlations in spike and local field potential (LFP) activity between these areas are greater when monkeys are freely making choices than when they are following instructions. We propose that a decision circuit featuring a sub-population of cells in frontal and parietal cortex may exchange information to coordinate activity between these areas. Cells participating in this decision circuit may influence movement choices by providing a common bias to the selection of movement goals.


Current Opinion in Neurobiology | 2004

Selecting the signals for a brain-machine interface

Richard A. Andersen; Sam Musallam; Bijan Pesaran

Brain-machine interfaces are being developed to assist paralyzed patients by enabling them to operate machines with recordings of their own neural activity. Recent studies show that motor parameters, such as hand trajectory, and cognitive parameters, such as the goal and predicted value of an action, can be decoded from the recorded activity to provide control signals. Neural prosthetics that simultaneously use a variety of cognitive and motor signals can maximize the ability of patients to communicate and interact with the outside world. Although most studies have recorded electroencephalograms or spike activity, recent research shows that local field potentials (LFPs) offer a promising additional signal. The decoding performance of LFPs and spike signals is comparable and, because LFP recordings last longer, they might help to increase the lifetime of the prosthetics.


Neuron | 2006

Dorsal Premotor Neurons Encode the Relative Position of the Hand, Eye, and Goal during Reach Planning

Bijan Pesaran; Matthew J. Nelson; Richard A. Andersen

When reaching to grasp an object, we often move our arm and orient our gaze together. How are these movements coordinated? To investigate this question, we studied neuronal activity in the dorsal premotor area (PMd) and the medial intraparietal area (area MIP) of two monkeys while systematically varying the starting position of the hand and eye during reaching. PMd neurons encoded the relative position of the target, hand, and eye. MIP neurons encoded target location with respect to the eye only. These results indicate that whereas MIP encodes target locations in an eye-centered reference frame, PMd uses a relative position code that specifies the differences in locations between all three variables. Such a relative position code may play an important role in coordinating hand and eye movements by computing their relative position.
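As a toy illustration of what a relative position code means (this is not the paper's analysis), the sketch below simulates a PMd-like neuron whose rate depends on the differences between target, hand and eye positions, and recovers those dependencies with a linear regression. Positions are simplified to one dimension and all tuning parameters are invented.

```python
# Toy sketch of a relative position code (illustrative assumptions throughout).
import numpy as np

rng = np.random.default_rng(2)
n_trials = 200
target = rng.uniform(-20, 20, n_trials)   # horizontal positions in degrees, 1-D for simplicity
hand = rng.uniform(-20, 20, n_trials)
eye = rng.uniform(-20, 20, n_trials)

# Simulated PMd-like neuron: rate depends on the target-hand and target-eye
# differences (the hand-eye difference is a linear combination of these two,
# so it is omitted from the design matrix).
rate = 10 + 0.5 * (target - hand) + 0.3 * (target - eye) + rng.normal(0, 1, n_trials)

X = np.column_stack([target - hand, target - eye, np.ones(n_trials)])
coef, *_ = np.linalg.lstsq(X, rate, rcond=None)
print("fitted weights (target-hand, target-eye, intercept):", coef)

# An MIP-like, eye-centered neuron would load only on the target-eye column.
```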


Nature | 1998

The role of nonlinear dynamics of the syrinx in the vocalizations of a songbird

Michale S. Fee; Boris I. Shraiman; Bijan Pesaran; Partha P. Mitra

Birdsong is characterized by the modulation of sound properties over a wide range of timescales. Understanding the mechanisms by which the brain organizes this complex temporal behaviour is a central motivation in the study of the song control and learning system. Here we present evidence that, in addition to central neural control, a further level of temporal organization is provided by nonlinear oscillatory dynamics that are intrinsic to the avian vocal organ. A detailed temporal and spectral examination of song of the zebra finch (Taeniopygia guttata) reveals a class of rapid song modulations that are consistent with transitions in the dynamical state of the syrinx. Furthermore, in vitro experiments show that the syrinx can produce a sequence of oscillatory states that are both spectrally and temporally complex in response to the slow variation of respiratory or syringeal parameters. As a consequence, simple variations in a small number of neural signals can result in a complex acoustic sequence.
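The dynamical argument can be illustrated with a generic nonlinear system (this is not a model of the syrinx): slowly ramping a single control parameter of the logistic map carries it through period-1, period-2 and chaotic regimes, showing how smooth changes in a few parameters can produce abrupt transitions in oscillatory behaviour.

```python
# Generic illustration (not a syrinx model): a slow ramp of one control
# parameter moves a simple nonlinear map through qualitatively different
# oscillatory regimes.
import numpy as np

def count_distinct_states(r, n_transient=500, n_sample=64):
    """Iterate the logistic map at parameter r and count distinct late-time values."""
    x = 0.5
    for _ in range(n_transient):          # discard the transient
        x = r * x * (1 - x)
    samples = set()
    for _ in range(n_sample):              # sample the attractor
        x = r * x * (1 - x)
        samples.add(round(x, 4))
    return len(samples)

for r in np.linspace(2.8, 3.9, 12):        # slow 'control parameter' ramp
    print(f"r = {r:.2f}: {count_distinct_states(r)} distinct state(s)")
```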


Neuroreport | 2003

Neural prosthetic control signals from plan activity

Krishna V. Shenoy; Daniella Meeker; Shiyan Cao; Sohaib A. Kureshi; Bijan Pesaran; Christopher A. Buneo; Aaron P. Batista; Partha P. Mitra; Joel W. Burdick; Richard A. Andersen

The prospect of assisting disabled patients by translating neural activity from the brain into control signals for prosthetic devices has flourished in recent years. Current systems rely on neural activity present during natural arm movements. We propose here that neural activity present before or even without natural arm movements can provide an important, and potentially advantageous, source of control signals. To demonstrate how control signals can be derived from such plan activity, we performed a computational study with neural activity previously recorded from the posterior parietal cortex of rhesus monkeys planning arm movements. We employed maximum likelihood decoders to estimate movement direction and to drive finite state machines governing when to move. Performance exceeded 90% with as few as 40 neurons.
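To make the decoding step concrete, the sketch below implements a generic maximum likelihood decoder of planned reach direction, assuming Poisson spike counts and cosine tuning. The neuron count, tuning parameters and candidate directions are assumptions for illustration, not values from the study.

```python
# Minimal sketch (assumptions throughout, not the study's code): pick the
# planned direction that maximizes the Poisson likelihood of observed spike
# counts under cosine tuning curves.
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(3)
n_neurons = 40
directions = np.deg2rad(np.arange(0, 360, 45))        # 8 candidate reach goals
preferred = rng.uniform(0, 2 * np.pi, n_neurons)      # preferred directions
baseline, depth = 5.0, 4.0                            # tuning parameters, Hz (assumed)

def mean_counts(direction, window=1.0):
    """Expected spike counts for a planned direction under cosine tuning."""
    return (baseline + depth * np.cos(direction - preferred)) * window

def decode(counts):
    """Return the candidate direction with the highest Poisson log likelihood."""
    loglik = [poisson.logpmf(counts, mean_counts(d)).sum() for d in directions]
    return directions[int(np.argmax(loglik))]

true_dir = directions[2]
counts = rng.poisson(mean_counts(true_dir))           # simulated plan-period counts
print("decoded:", np.rad2deg(decode(counts)), "true:", np.rad2deg(true_dir))
```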


Trends in Cognitive Sciences | 2004

Cognitive Neural Prosthetics

Richard A. Andersen; Joel W. Burdick; Sam Musallam; Bijan Pesaran; Jorge G. Cham

Research on neural prosthetics has focused largely on using activity related to hand trajectories recorded from motor cortical areas. An interesting question revolves around what other signals might be read out from the brain and used for neural prosthetic applications. Recent studies indicate that goals and expected value are among the high-level cognitive signals that can be used and will potentially enhance the ability of paralyzed patients to communicate with the outside world. Other new findings show that local field potentials provide an excellent source of information about the cognitive state of the subject and are much easier to record and maintain than spike activity. Finally, new movable probe technologies will enable recording electrodes to seek out automatically the best signals for decoding cognitive variables.


Nature | 2014

Sensory-motor transformations for speech occur bilaterally

Gregory B. Cogan; Thomas Thesen; Chad Carlson; Werner K. Doyle; Orrin Devinsky; Bijan Pesaran

Historically, the study of speech processing has emphasized a strong link between auditory perceptual input and motor production output. A kind of ‘parity’ is essential, as both perception- and production-based representations must form a unified interface to facilitate access to higher-order language processes such as syntax and semantics, believed to be computed in the dominant, typically left hemisphere. Although various theories have been proposed to unite perception and production, the underlying neural mechanisms are unclear. Early models of speech and language processing proposed that perceptual processing occurred in the left posterior superior temporal gyrus (Wernicke’s area) and motor production processes occurred in the left inferior frontal gyrus (Broca’s area). Sensory activity was proposed to link to production activity through connecting fibre tracts, forming the left lateralized speech sensory–motor system. Although recent evidence indicates that speech perception occurs bilaterally, prevailing models maintain that the speech sensory–motor system is left lateralized and facilitates the transformation from sensory-based auditory representations to motor-based production representations. However, evidence for the lateralized computation of sensory–motor speech transformations is indirect and primarily comes from stroke patients that have speech repetition deficits (conduction aphasia) and studies using covert speech and haemodynamic functional imaging. Whether the speech sensory–motor system is lateralized, like higher-order language processes, or bilateral, like speech perception, is controversial. Here we use direct neural recordings in subjects performing sensory–motor tasks involving overt speech production to show that sensory–motor transformations occur bilaterally. We demonstrate that electrodes over bilateral inferior frontal, inferior parietal, superior temporal, premotor and somatosensory cortices exhibit robust sensory–motor neural responses during both perception and production in an overt word-repetition task. Using a non-word transformation task, we show that bilateral sensory–motor responses can perform transformations between speech-perception- and speech-production-based representations. These results establish a bilateral sublexical speech sensory–motor system.


The Journal of Neuroscience | 2009

Human reinforcement learning subdivides structured action spaces by learning effector-specific values.

Samuel J. Gershman; Bijan Pesaran; Nathaniel D. Daw

Humans and animals are endowed with a large number of effectors. Although this enables great behavioral flexibility, it presents an equally formidable reinforcement learning problem of discovering which actions are most valuable because of the high dimensionality of the action space. An unresolved question is how neural systems for reinforcement learning—such as prediction error signals for action valuation associated with dopamine and the striatum—can cope with this “curse of dimensionality.” We propose a reinforcement learning framework that allows for learned action valuations to be decomposed into effector-specific components when appropriate to a task, and test it by studying to what extent human behavior and blood oxygen level-dependent (BOLD) activity can exploit such a decomposition in a multieffector choice task. Subjects made simultaneous decisions with their left and right hands and received separate reward feedback for each hand movement. We found that choice behavior was better described by a learning model that decomposed the values of bimanual movements into separate values for each effector, rather than a traditional model that treated the bimanual actions as unitary with a single value. A decomposition of value into effector-specific components was also observed in value-related BOLD signaling, in the form of lateralized biases in striatal correlates of prediction error and anticipatory value correlates in the intraparietal sulcus. These results suggest that the human brain can use decomposed value representations to “divide and conquer” reinforcement learning over high-dimensional action spaces.
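A minimal sketch of the modeling contrast described above (assumed learning rate, reward probabilities and random exploration; not the paper's model code): a 'unitary' learner keeps one value per joint bimanual action, while a 'decomposed' learner updates separate left- and right-hand values from the separate reward feedback.

```python
# Illustrative sketch of unitary vs. effector-decomposed value learning.
import numpy as np

rng = np.random.default_rng(4)
alpha = 0.1                                  # learning rate (assumed)
n_left, n_right = 2, 2                       # two options per hand

q_joint = np.zeros((n_left, n_right))        # unitary: one value per (left, right) pair
q_left, q_right = np.zeros(n_left), np.zeros(n_right)  # decomposed: per-effector values

p_reward_left = np.array([0.8, 0.2])         # each hand rewarded independently (assumed)
p_reward_right = np.array([0.3, 0.7])

for _ in range(1000):
    l, r = rng.integers(n_left), rng.integers(n_right)   # random exploration
    r_l = float(rng.random() < p_reward_left[l])         # separate feedback per hand
    r_r = float(rng.random() < p_reward_right[r])

    # Unitary update: one prediction error for the summed outcome.
    q_joint[l, r] += alpha * ((r_l + r_r) - q_joint[l, r])
    # Decomposed update: effector-specific prediction errors.
    q_left[l] += alpha * (r_l - q_left[l])
    q_right[r] += alpha * (r_r - q_right[r])

print("decomposed values:", q_left, q_right)
print("unitary values:\n", q_joint)
```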

Collaboration


Dive into Bijan Pesaran's collaborations.

Top Co-Authors

Richard A. Andersen

California Institute of Technology

Partha P. Mitra

Cold Spring Harbor Laboratory

Daniella Meeker

University of Southern California

Joel W. Burdick

California Institute of Technology
