Mikhail A. Lebedev
Duke University
Publications
Featured research published by Mikhail A. Lebedev.
PLOS Biology | 2003
Jose M. Carmena; Mikhail A. Lebedev; Roy E. Crist; Joseph E. O'Doherty; David M. Santucci; Dragan F. Dimitrov; Parag G. Patil; Craig S. Henriquez; Miguel A. L. Nicolelis
Reaching and grasping in primates depend on the coordination of neural activity in large frontoparietal ensembles. Here we demonstrate that primates can learn to reach and grasp virtual objects by controlling a robot arm through a closed-loop brain–machine interface (BMIc) that uses multiple mathematical models to extract several motor parameters (i.e., hand position, velocity, gripping force, and the EMGs of multiple arm muscles) from the electrical activity of frontoparietal neuronal ensembles. As single neurons typically contribute to the encoding of several motor parameters, we observed that high BMIc accuracy required recording from large neuronal ensembles. Continuous BMIc operation by monkeys led to significant improvements in both model predictions and behavioral performance. Using visual feedback, monkeys succeeded in producing robot reach-and-grasp movements even when their arms did not move. Learning to operate the BMIc was paralleled by functional reorganization in multiple cortical areas, suggesting that the dynamic properties of the BMIc were incorporated into motor and sensory cortical representations.
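To make the decoding step concrete, the sketch below shows a lag-based linear model of the general kind the abstract describes: binned spike counts from an ensemble, over several preceding time bins, are regressed onto a motor parameter such as hand velocity. The bin width, number of lags, ensemble size, and synthetic data are illustrative assumptions, not the study's actual parameters or fitting pipeline.

```python
# Minimal sketch of a lag-based linear decoder: ensemble spike counts from the
# current and previous bins predict hand velocity. All shapes and values are
# illustrative placeholders, not taken from the paper.
import numpy as np

def build_lagged_design(spikes, n_lags):
    """Stack spike counts from the current and previous bins into one row per sample.

    spikes : (n_bins, n_neurons) array of binned spike counts
    returns: (n_bins - n_lags, n_neurons * (n_lags + 1)) design matrix
    """
    rows = []
    for t in range(n_lags, spikes.shape[0]):
        rows.append(spikes[t - n_lags:t + 1].ravel())
    return np.asarray(rows)

# Synthetic example: 120 neurons, 100 ms bins, 10 lags (1 s of spiking history).
rng = np.random.default_rng(0)
n_bins, n_neurons, n_lags = 2000, 120, 10
spikes = rng.poisson(2.0, size=(n_bins, n_neurons)).astype(float)
velocity = rng.standard_normal((n_bins, 2))          # x and y hand velocity

X = build_lagged_design(spikes, n_lags)
Y = velocity[n_lags:]

# Ordinary least squares with an intercept column (a Wiener-filter-style fit).
Xb = np.hstack([X, np.ones((X.shape[0], 1))])
W, *_ = np.linalg.lstsq(Xb, Y, rcond=None)

predicted = Xb @ W                                   # decoded velocity trace
```

In the closed-loop setting, a model of this form would be refit or updated as the animal learns, and separate models of the same structure can target other parameters (position, gripping force, EMGs).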
Trends in Neurosciences | 2006
Mikhail A. Lebedev; Miguel A. L. Nicolelis
Since the original demonstration that electrical activity generated by ensembles of cortical neurons can be employed directly to control a robotic manipulator, research on brain-machine interfaces (BMIs) has experienced an impressive growth. Today BMIs designed for both experimental and clinical studies can translate raw neuronal signals into motor commands that reproduce arm reaching and hand grasping movements in artificial actuators. Clearly, these developments hold promise for the restoration of limb mobility in paralyzed subjects. However, as we review here, before this goal can be reached several bottlenecks have to be passed. These include designing a fully implantable biocompatible recording device, further developing real-time computational algorithms, introducing a method for providing the brain with sensory feedback from the actuators, and designing and building artificial prostheses that can be controlled directly by brain-derived signals. By reaching these milestones, future BMIs will be able to drive and control revolutionary prostheses that feel and act like the human arm.
Nature | 2011
Joseph E. O’Doherty; Mikhail A. Lebedev; Peter J. Ifft; Katie Z. Zhuang; Solaiman Shokur; Hannes Bleuler; Miguel A. L. Nicolelis
Brain–machine interfaces use neuronal activity recorded from the brain to establish direct communication with external actuators, such as prosthetic arms. It is hoped that brain–machine interfaces can be used to restore the normal sensorimotor functions of the limbs, but so far they have lacked tactile sensation. Here we report the operation of a brain–machine–brain interface (BMBI) that both controls the exploratory reaching movements of an actuator and allows signalling of artificial tactile feedback through intracortical microstimulation (ICMS) of the primary somatosensory cortex. Monkeys performed an active exploration task in which an actuator (a computer cursor or a virtual-reality arm) was moved using a BMBI that derived motor commands from neuronal ensemble activity recorded in the primary motor cortex. ICMS feedback occurred whenever the actuator touched virtual objects. Temporal patterns of ICMS encoded the artificial tactile properties of each object. Neuronal recordings and ICMS epochs were temporally multiplexed to avoid interference. Two monkeys operated this BMBI to search for and distinguish one of three visually identical objects, using the virtual-reality arm to identify the unique artificial texture associated with each. These results suggest that clinical motor neuroprostheses might benefit from the addition of ICMS feedback to generate artificial somatic perceptions associated with mechanical, robotic or even virtual prostheses.
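The temporal multiplexing of recording and stimulation can be pictured as an alternating schedule of epochs, as in the sketch below. The 50 ms epoch length and the per-texture pulse rates are assumptions chosen for the example, not the values reported in the study.

```python
# Illustrative scheduler that interleaves recording and ICMS epochs so that
# stimulation artifacts do not contaminate the decoded neural signal.
from dataclasses import dataclass

RECORD_MS = 50          # decode motor commands from spikes in this window
STIM_MS = 50            # deliver (or withhold) ICMS in this window

TEXTURE_PULSE_HZ = {    # hypothetical temporal codes for the virtual textures
    "rewarded": 200,
    "distractor": 50,
    "plain": 0,         # visually identical object with no artificial texture
}

@dataclass
class Epoch:
    start_ms: int
    kind: str           # "record" or "stimulate"
    pulse_hz: int = 0

def schedule(trial_ms, touched_texture):
    """Alternate recording and stimulation epochs for one trial."""
    epochs, t = [], 0
    while t < trial_ms:
        epochs.append(Epoch(t, "record"))
        t += RECORD_MS
        rate = TEXTURE_PULSE_HZ.get(touched_texture, 0)
        epochs.append(Epoch(t, "stimulate", pulse_hz=rate))
        t += STIM_MS
    return epochs

for e in schedule(300, "rewarded")[:4]:
    print(e)
```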
Nature Reviews Neuroscience | 2009
Miguel A. L. Nicolelis; Mikhail A. Lebedev
Research on brain–machine interfaces has been ongoing for at least a decade. During this period, simultaneous recordings of the extracellular electrical activity of hundreds of individual neurons have been used for direct, real-time control of various artificial devices. Brain–machine interfaces have also added greatly to our knowledge of the fundamental physiological principles governing the operation of large neural ensembles. Further understanding of these principles is likely to have a key role in the future development of neuroprosthetics for restoring mobility in severely paralysed patients.
Proceedings of the National Academy of Sciences of the United States of America | 2009
Thomas Petermann; Tara C. Thiagarajan; Mikhail A. Lebedev; Miguel A. L. Nicolelis; Dante R. Chialvo; Dietmar Plenz
Spontaneous neuronal activity is an important property of the cerebral cortex but its spatiotemporal organization and dynamical framework remain poorly understood. Studies in reduced systems—tissue cultures, acute slices, and anesthetized rats—show that spontaneous activity forms characteristic clusters in space and time, called neuronal avalanches. Modeling studies suggest that networks with this property are poised at a critical state that optimizes input processing, information storage, and transfer, but the relevance of avalanches for fully functional cerebral systems has been controversial. Here we show that ongoing cortical synchronization in awake rhesus monkeys carries the signature of neuronal avalanches. Negative LFP deflections (nLFPs) correlate with neuronal spiking and increase in amplitude with increases in local population spike rate and synchrony. These nLFPs form neuronal avalanches that are scale-invariant in space and time and with respect to the threshold of nLFP detection. This dimension, threshold invariance, describes a fractal organization: smaller nLFPs are embedded in clusters of larger ones without destroying the spatial and temporal scale-invariance of the dynamics. These findings suggest an organization of ongoing cortical synchronization that is scale-invariant in its three fundamental dimensions—time, space, and local neuronal group size. Such scale-invariance has ontogenetic and phylogenetic implications because it allows large increases in network capacity without a fundamental reorganization of the system.
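A toy version of the avalanche analysis is sketched below: negative LFP deflections are thresholded on each electrode, events in contiguous time bins are grouped into avalanches, and the size distribution is tallied at more than one threshold to probe threshold invariance. The signals are synthetic and the power-law fitting and statistical tests used in the actual study are omitted.

```python
# Toy avalanche detection: threshold nLFPs per electrode, merge events in
# contiguous active bins into avalanches, and compare size statistics across
# detection thresholds. Synthetic data; no power-law fitting is performed.
import numpy as np

rng = np.random.default_rng(1)
lfp = rng.standard_normal((32, 20000))          # 32 electrodes x time bins

def avalanche_sizes(lfp, threshold_sd):
    thr = -threshold_sd * lfp.std(axis=1, keepdims=True)
    events = lfp < thr                          # nLFP events per electrode/bin
    active = events.any(axis=0)                 # bins with at least one event
    sizes, current = [], 0
    for t in range(lfp.shape[1]):
        if active[t]:
            current += events[:, t].sum()       # count events in this bin
        elif current:
            sizes.append(current)               # an empty bin ends the avalanche
            current = 0
    if current:
        sizes.append(current)
    return np.array(sizes)

# Threshold invariance check: the shape of the size distribution should not
# depend strongly on the detection threshold (here -2 SD vs. -3 SD).
for sd in (2.0, 3.0):
    sizes = avalanche_sizes(lfp, sd)
    print(f"-{sd} SD: {len(sizes)} avalanches, mean size {sizes.mean():.1f}")
```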
The Journal of Neuroscience | 2005
Mikhail A. Lebedev; Jose M. Carmena; Joseph E. O'Doherty; Miriam Zacksenhouse; Craig S. Henriquez; Jose C. Principe; Miguel A. L. Nicolelis
Monkeys can learn to directly control the movements of an artificial actuator by using a brain-machine interface (BMI) driven by the activity of a sample of cortical neurons. Eventually, they can do so without moving their limbs. Neuronal adaptations underlying the transition from control of the limb to control of the actuator are poorly understood. Here, we show that rapid modifications in neuronal representation of velocity of the hand and actuator occur in multiple cortical areas during the operation of a BMI. Initially, monkeys controlled the actuator by moving a hand-held pole. During this period, the BMI was trained to predict the actuator velocity. As the monkeys started using their cortical activity to control the actuator, the activity of individual neurons and neuronal populations became less representative of the animal's hand movements while representing the movements of the actuator. As a result of this adaptation, the animals could eventually stop moving their hands yet continue to control the actuator. These results show that, during BMI control, cortical ensembles represent behaviorally significant motor parameters, even if these are not associated with movements of the animal's own limb.
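The representational shift can be quantified, for example, by comparing how strongly each neuron's firing rate correlates with hand movement versus actuator movement in the pole-control and brain-control epochs. The sketch below computes such per-neuron correlations on synthetic placeholder data; the specific measure and the data are assumptions for illustration only.

```python
# Per-neuron correlation between firing rate and movement speed, computed
# separately for hand velocity and actuator (cursor) velocity. Synthetic data.
import numpy as np

def speed_correlation(rates, velocity):
    """Pearson correlation between each neuron's rate and movement speed."""
    speed = np.linalg.norm(velocity, axis=1)
    r = rates - rates.mean(axis=0)
    s = speed - speed.mean()
    return (r * s[:, None]).sum(axis=0) / (
        np.sqrt((r ** 2).sum(axis=0)) * np.sqrt((s ** 2).sum()) + 1e-12)

rng = np.random.default_rng(2)
rates = rng.poisson(5.0, size=(5000, 64)).astype(float)   # bins x neurons
hand_vel = rng.standard_normal((5000, 2))
cursor_vel = rng.standard_normal((5000, 2))

hand_r = speed_correlation(rates, hand_vel)
cursor_r = speed_correlation(rates, cursor_vel)
# In the study, hand correlations dropped during brain control while actuator
# correlations persisted; with random synthetic data both are near zero here.
print(hand_r[:5], cursor_r[:5])
```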
Frontiers in Integrative Neuroscience | 2009
Nathan A. Fitzsimmons; Mikhail A. Lebedev; Ian D. Peikon; Miguel A. L. Nicolelis
The ability to walk may be critically impacted as the result of neurological injury or disease. While recent advances in brain–machine interfaces (BMIs) have demonstrated the feasibility of upper-limb neuroprostheses, BMIs have not been evaluated as a means to restore walking. Here, we demonstrate that chronic recordings from ensembles of cortical neurons can be used to predict the kinematics of bipedal walking in rhesus macaques – both offline and in real time. Linear decoders extracted 3D coordinates of leg joints and leg muscle electromyograms from the activity of hundreds of cortical neurons. As more complex patterns of walking were produced by varying the gait speed and direction, larger neuronal populations were needed to accurately extract walking patterns. Extraction was further improved using a switching decoder which designated a submodel for each walking paradigm. We propose that BMIs may one day allow severely paralyzed patients to walk again.
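A switching decoder of the kind mentioned above can be sketched as one linear submodel per walking paradigm. In the sketch below the paradigm label is supplied externally (the step that designates the active submodel is omitted), and all dimensions and paradigm names are illustrative assumptions.

```python
# Rough sketch of a switching decoder: one linear submodel per walking
# paradigm, each mapping ensemble activity to leg-joint coordinates.
import numpy as np

class SwitchingDecoder:
    def __init__(self, paradigms):
        self.weights = {p: None for p in paradigms}

    def fit(self, paradigm, neural, kinematics):
        # Ordinary least squares with an intercept column for this paradigm.
        X = np.hstack([neural, np.ones((neural.shape[0], 1))])
        W, *_ = np.linalg.lstsq(X, kinematics, rcond=None)
        self.weights[paradigm] = W

    def predict(self, paradigm, neural):
        X = np.hstack([neural, np.ones((neural.shape[0], 1))])
        return X @ self.weights[paradigm]

rng = np.random.default_rng(3)
decoder = SwitchingDecoder(["forward_slow", "forward_fast", "backward"])
for p in decoder.weights:
    neural = rng.poisson(3.0, size=(1000, 200)).astype(float)
    joints = rng.standard_normal((1000, 6))        # 3D coords of two leg joints
    decoder.fit(p, neural, joints)

test = rng.poisson(3.0, size=(10, 200)).astype(float)
print(decoder.predict("forward_fast", test).shape)   # (10, 6)
```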
Nature Methods | 2014
David Schwarz; Mikhail A. Lebedev; Timothy L. Hanson; Dragan F. Dimitrov; Gary Lehew; Jim Meloy; Sankaranarayani Rajangam; Vivek Subramanian; Peter J. Ifft; Zheng Li; Arjun Ramakrishnan; Andrew Tate; Katie Z. Zhuang; Miguel A. L. Nicolelis
Advances in techniques for recording large-scale brain activity contribute to both the elucidation of neurophysiological principles and the development of brain-machine interfaces (BMIs). Here we describe a neurophysiological paradigm for performing tethered and wireless large-scale recordings based on movable volumetric three-dimensional (3D) multielectrode implants. This approach allowed us to isolate up to 1,800 neurons (units) per animal and simultaneously record the extracellular activity of close to 500 cortical neurons, distributed across multiple cortical areas, in freely behaving rhesus monkeys. The method is expandable, in principle, to thousands of simultaneously recorded channels. It also allows increased recording longevity (5 consecutive years) and recording of a broad range of behaviors, such as social interactions, and BMI paradigms in freely moving primates. We propose that wireless large-scale recordings could have a profound impact on basic primate neurophysiology research while providing a framework for the development and testing of clinically relevant neuroprostheses.
Frontiers in Integrative Neuroscience | 2009
Joseph E. O'Doherty; Mikhail A. Lebedev; Timothy L. Hanson; Nathan A. Fitzsimmons; Miguel A. L. Nicolelis
Brain–machine interfaces (BMIs) establish direct communication between the brain and artificial actuators. As such, they hold considerable promise for restoring mobility and communication in patients suffering from severe body paralysis. To achieve this end, future BMIs must also provide a means for delivering sensory signals from the actuators back to the brain. Prosthetic sensation is needed so that neuroprostheses can be better perceived and controlled. Here we show that a direct intracortical input can be added to a BMI to instruct rhesus monkeys in choosing the direction of reaching movements generated by the BMI. Somatosensory instructions were provided to two monkeys operating the BMI using either: (a) vibrotactile stimulation of the monkeys' hands or (b) multi-channel intracortical microstimulation (ICMS) delivered to the primary somatosensory cortex (S1) in one monkey and posterior parietal cortex (PP) in the other. Stimulus delivery was contingent on the position of the computer cursor: the monkey placed it in the center of the screen to receive the machine-to-brain input. After 2 weeks of training, the same level of proficiency in utilizing somatosensory information was achieved with ICMS of S1 as with the stimulus delivered to the hand skin. ICMS of PP was not effective. These results indicate that direct, bi-directional communication between the brain and neuroprosthetic devices can be achieved through the combination of chronic multi-electrode recording and microstimulation of S1. We propose that in the future, bidirectional BMIs incorporating ICMS may become an effective paradigm for sensorizing neuroprosthetic devices.
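The position-contingent delivery of the instruction can be pictured with the small sketch below: the cue is delivered only while the BMI-driven cursor sits inside a central window, through whichever channel (hand vibration or ICMS) is in use. The window size and channel names are hypothetical values chosen for the example.

```python
# Simplified sketch of position-contingent instruction delivery.
import numpy as np

CENTER_RADIUS = 0.1     # normalized screen units (illustrative)

def maybe_deliver_instruction(cursor_xy, direction, channel="icms_s1"):
    """Return the cue to deliver, or None if the cursor is outside the center."""
    if np.linalg.norm(cursor_xy) > CENTER_RADIUS:
        return None
    return {"channel": channel, "cue": direction}   # e.g. "left" vs "right"

print(maybe_deliver_instruction(np.array([0.03, -0.02]), "left"))
print(maybe_deliver_instruction(np.array([0.6, 0.1]), "right"))
```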
The Journal of Neuroscience | 2007
Nathan A. Fitzsimmons; W. Drake; Timothy L. Hanson; Mikhail A. Lebedev; Miguel A. L. Nicolelis
Both humans and animals can discriminate signals delivered to sensory areas of their brains using electrical microstimulation. This opens the possibility of creating an artificial sensory channel that could be implemented in neuroprosthetic devices. Although microstimulation delivered through multiple implanted electrodes could be beneficial for this purpose, appropriate microstimulation protocols have not been developed. Here, we report a series of experiments in which owl monkeys performed reaching movements guided by spatiotemporal patterns of cortical microstimulation delivered to primary somatosensory cortex through chronically implanted multielectrode arrays. The monkeys learned to discriminate microstimulation patterns, and their ability to learn new patterns and new behavioral rules improved during several months of testing. Significantly, information was conveyed to the brain through the interplay of microstimulation patterns delivered to multiple electrodes and the temporal order in which these electrodes were stimulated. This suggests multichannel microstimulation as a viable means of sensorizing neural prostheses.
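The idea that information is carried both by which electrodes are pulsed and by the order in which they are stimulated can be illustrated with a toy pattern generator, shown below. Electrode indices, pulse counts, and timing parameters are assumptions for the example, not the protocols used in the experiments.

```python
# Toy generator of spatiotemporal microstimulation patterns: each cue is a
# sequence of (time, electrode) pulse events, so the same electrode set in a
# different temporal order defines a different cue.
import numpy as np

def make_pattern(electrode_order, pulses_per_electrode=10,
                 pulse_interval_ms=3.0, electrode_gap_ms=50.0):
    """Return a list of (time_ms, electrode) stimulation events."""
    events, t = [], 0.0
    for electrode in electrode_order:
        for _ in range(pulses_per_electrode):
            events.append((t, electrode))
            t += pulse_interval_ms
        t += electrode_gap_ms
    return events

# Two cues using the same electrodes in opposite temporal order, mimicking the
# idea that temporal order itself conveys the instruction.
cue_a = make_pattern([3, 7, 12])
cue_b = make_pattern([12, 7, 3])
print(len(cue_a), cue_a[:3])
print(len(cue_b), cue_b[:3])
```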