Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Harshavardhan Agashe is active.

Publication


Featured research published by Harshavardhan Agashe.


Frontiers in Neuroscience | 2015

Global cortical activity predicts shape of hand during grasping

Harshavardhan Agashe; Andrew Y. Paek; Yuhang Zhang; Jose L. Contreras-Vidal

Recent studies show that the amplitude of cortical field potentials is modulated in the time domain by grasping kinematics. However, it is unknown if these low-frequency modulations persist and contain enough information to decode grasp kinematics in macro-scale activity measured at the scalp via electroencephalography (EEG). Further, it is unclear whether joint angle velocities or movement synergies are the optimal kinematic spaces to decode. In this offline decoding study, we infer hand joint angular velocities as well as synergistic trajectories from human EEG as subjects perform natural reach-to-grasp movements. Decoding accuracy, measured as the correlation coefficient (r) between the predicted and actual movement kinematics, was r = 0.49 ± 0.02 across 15 hand joints. Across the first three kinematic synergies, decoding accuracies were r = 0.59 ± 0.04, 0.47 ± 0.06, and 0.32 ± 0.05. The spatiotemporal pattern of EEG channel recruitment showed early involvement of contralateral frontal-central scalp areas followed by later activation of central electrodes over primary sensorimotor cortical areas. Information content in EEG about the grasp type peaked at 250 ms after movement onset. The high decoding accuracies in this study are significant not only as evidence for time-domain modulation in macro-scale brain activity, but for the field of brain-machine interfaces as well. Our decoding strategy, which harnesses the neural “symphony” as opposed to local members of the neural ensemble (as in intracranial approaches), may provide a means of extracting information about motor intent for grasping without the need for penetrating electrodes, and suggests that it may soon be possible to develop non-invasive neural interfaces for the control of prosthetic limbs.
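
The synergy analysis described above lends itself to a short illustration. Below is a minimal sketch, assuming joint angular velocities are available as a samples × joints array: kinematic synergies are taken as principal components of the joint velocities, and decoding accuracy as the Pearson correlation between predicted and actual synergy trajectories. All data, shapes, and the noisy "predicted" signal are synthetic stand-ins, not material from the paper.

```python
# Sketch of synergy extraction and the decoding-accuracy metric (r).
import numpy as np
from sklearn.decomposition import PCA
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
joint_velocities = rng.standard_normal((5000, 15))  # (time samples, 15 hand joints)

# Project the 15-joint velocity space onto the first three synergies.
pca = PCA(n_components=3)
synergy_trajectories = pca.fit_transform(joint_velocities)  # (time, 3)

# Stand-in for decoder output: the true trajectories plus noise.
predicted = synergy_trajectories + 0.5 * rng.standard_normal(synergy_trajectories.shape)

# Decoding accuracy: Pearson r between predicted and actual trajectories.
for k in range(3):
    r, _ = pearsonr(predicted[:, k], synergy_trajectories[:, k])
    print(f"synergy {k + 1}: r = {r:.2f}")
```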


International Conference of the IEEE Engineering in Medicine and Biology Society | 2011

Reconstructing hand kinematics during reach to grasp movements from electroencephalographic signals

Harshavardhan Agashe; Jose L. Contreras-Vidal

With continued research on brain machine interfaces (BMIs), it is now possible to control prosthetic arm position in space to a high degree of accuracy. However, a reliable decoder to infer the dexterous movements of fingers from brain activity during a natural grasping motion is still to be demonstrated. Here, we present a methodology to accurately predict and reconstruct natural hand kinematics from non-invasively recorded scalp electroencephalographic (EEG) signals during object grasping movements. The high performance of our decoder is attributed to a combination of the correct input space (time-domain amplitude modulation of delta-band smoothed EEG signals) and an optimal subset of EEG electrodes selected using a genetic algorithm. Trajectories of the joint angles were reconstructed for metacarpo-phalangeal (MCP) joints of the fingers as well as the carpo-metacarpal (CMC) and MCP joints of the thumb. High decoding accuracy (Pearson's correlation coefficient, r) between the predicted and observed trajectories (r = 0.76 ± 0.01; averaged across joints) indicates that this technique may be suitable for use with a closed-loop real-time BMI to control grasping motion in prosthetics with high degrees of freedom. This demonstrates the first successful decoding of hand pre-shaping kinematics from noninvasive neural signals.
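
The input space named in the abstract (time-domain amplitude of delta-band smoothed EEG, fed into a linear decoder with time lags) can be sketched compactly. The sampling rate, lag count, band edges, and synthetic data below are illustrative assumptions, not the paper's parameters.

```python
# Hedged sketch: delta-band smoothing of EEG followed by a lagged linear decoder.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.linear_model import LinearRegression

fs = 100  # Hz, assumed EEG sampling rate
rng = np.random.default_rng(1)
eeg = rng.standard_normal((6000, 32))    # (samples, channels), synthetic
joint_angle = rng.standard_normal(6000)  # one MCP joint trajectory, synthetic

# Delta-band (0.1-4 Hz) smoothing of the EEG amplitudes.
b, a = butter(4, [0.1, 4.0], btype="bandpass", fs=fs)
delta = filtfilt(b, a, eeg, axis=0)

# Linear decoder with memory: stack the current sample and past lags,
# then drop the first n_lags rows that would wrap around.
n_lags = 10
X = np.hstack([np.roll(delta, lag, axis=0) for lag in range(n_lags)])[n_lags:]
y = joint_angle[n_lags:]

model = LinearRegression().fit(X, y)
print("training R^2:", model.score(X, y))
```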


Frontiers in Neuroengineering | 2014

Decoding repetitive finger movements with brain activity acquired via non-invasive electroencephalography

Andrew Y. Paek; Harshavardhan Agashe; Jose L. Contreras-Vidal

We investigated how well repetitive finger tapping movements can be decoded from scalp electroencephalography (EEG) signals. A linear decoder with memory was used to infer continuous index finger angular velocities from the low-pass filtered fluctuations of the amplitude of a plurality of EEG signals distributed across the scalp. To evaluate the accuracy of the decoder, the Pearson's correlation coefficient (r) between the observed and predicted trajectories was calculated in a 10-fold cross-validation scheme. We also assessed attempts to decode finger kinematics from EEG data that was cleaned with independent component analysis (ICA), EEG data from peripheral sensors, and EEG data from rest periods. A genetic algorithm (GA) was used to select combinations of EEG channels that maximized decoding accuracies. Our results (lower quartile r = 0.18, median r = 0.36, upper quartile r = 0.50) show that delta-band EEG signals contain useful information that can be used to infer finger kinematics. Further, the highest decoding accuracies were characterized by highly correlated delta band EEG activity mostly localized to the contralateral central areas of the scalp. Spectral analysis of EEG also showed bilateral alpha band (8–13 Hz) event-related desynchronizations (ERDs) and contralateral beta band (20–30 Hz) event-related synchronizations (ERSs) localized over central scalp areas. Overall, this study demonstrates the feasibility of decoding finger kinematics from scalp EEG signals.
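
The GA-based channel selection mentioned above can be illustrated with a toy version: candidate solutions are binary channel masks, and fitness is the cross-validated decoding correlation. This simplified sketch uses selection and mutation only (no crossover); the population size, mutation rate, and data are assumptions for illustration.

```python
# Toy genetic algorithm over EEG channel masks, scored by 10-fold CV correlation.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_predict
from scipy.stats import pearsonr

rng = np.random.default_rng(2)
n_channels = 32
eeg = rng.standard_normal((2000, n_channels))
# Make a few channels genuinely informative about finger velocity.
finger_velocity = eeg[:, [3, 7, 12]].sum(axis=1) + rng.standard_normal(2000)

def fitness(mask):
    if not mask.any():
        return -1.0
    pred = cross_val_predict(LinearRegression(), eeg[:, mask], finger_velocity, cv=10)
    return pearsonr(pred, finger_velocity)[0]

population = rng.integers(0, 2, (20, n_channels)).astype(bool)
for generation in range(15):
    scores = np.array([fitness(m) for m in population])
    parents = population[np.argsort(scores)[-10:]]   # keep the fittest half
    children = parents[rng.integers(0, 10, 10)].copy()
    flip = rng.random(children.shape) < 0.05         # mutate 5% of the bits
    children ^= flip
    population = np.vstack([parents, children])

best = population[np.argmax([fitness(m) for m in population])]
print("selected channels:", np.flatnonzero(best))
```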


IEEE Pulse | 2012

Restoration of Whole Body Movement: Toward a Noninvasive Brain-Machine Interface System

Jose L. Contreras-Vidal; Alessandro Presacco; Harshavardhan Agashe; Andrew Y. Paek

This article highlights recent advances in the design of noninvasive neural interfaces based on the scalp electroencephalogram (EEG). The simplest of physical tasks, such as turning the page to read this article, requires an intense burst of brain activity. It happens in milliseconds and requires little conscious thought. But for amputees and stroke victims with diminished motor-sensory skills, this process can be difficult or impossible. Our team at the University of Maryland, in conjunction with the Johns Hopkins Applied Physics Laboratory (APL) and the University of Maryland School of Medicine, hopes to offer these people newfound mobility and dexterity. In separate research thrusts, we're using data gleaned from scalp EEG to develop reliable brain-machine interface (BMI) systems that could soon control modern devices such as prosthetic limbs or powered robotic exoskeletons.


International Conference of the IEEE Engineering in Medicine and Biology Society | 2010

Toward improved sensorimotor integration and learning using upper-limb prosthetic devices

R. Brent Gillespie; Jose L. Contreras-Vidal; Patricia A. Shewokis; Marcia K. O'Malley; Jeremy D. Brown; Harshavardhan Agashe; Rodolphe J. Gentili; Alicia J. Davis

To harness the increased dexterity and sensing capabilities in advanced prosthetic device designs, amputees will require interfaces supported by novel forms of sensory feedback and novel control paradigms. We are using a motorized elbow brace to feed back grasp forces to the user in the form of extension torques about the elbow. This force display complements myoelectric control of grip closure in which EMG signals are drawn from the biceps muscle. We expect that the action/reaction coupling experienced by the biceps muscle will produce an intuitive paradigm for object manipulation, and we hope to uncover neural correlates to support this hypothesis. In this paper we present results from an experiment in which 7 able-bodied persons attempted to distinguish three objects by stiffness while grasping them under myoelectric control and feeling reaction forces displayed to their elbow. In four conditions (with and without force display, and using biceps myoelectric signals ipsilateral and contralateral to the force display), the ability to correctly identify objects was significantly increased with sensory feedback.
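
The control/feedback loop described above pairs a myoelectric grip command with a torque display. A minimal sketch, assuming a rectified and low-pass-filtered biceps EMG envelope drives grip closure and measured grasp force is mapped to an elbow extension torque; all signals, gains, and the object model are hypothetical.

```python
# Sketch: EMG envelope -> normalized grip command -> grasp force -> elbow torque display.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 1000  # Hz, assumed EMG sampling rate
rng = np.random.default_rng(3)
emg = rng.standard_normal(5000) * (1 + np.sin(np.linspace(0, 3 * np.pi, 5000)))

# Envelope: full-wave rectification followed by a 5 Hz low-pass filter.
b, a = butter(2, 5.0, btype="low", fs=fs)
envelope = filtfilt(b, a, np.abs(emg))

grip_command = np.clip(envelope / envelope.max(), 0, 1)  # normalized closure
grasp_force = 20.0 * grip_command                        # N, toy object stiffness model
elbow_torque = 0.1 * grasp_force                         # Nm, assumed display gain

print(f"peak grip command: {grip_command.max():.2f}, "
      f"peak elbow torque: {elbow_torque.max():.2f} Nm")
```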


International Conference of the IEEE Engineering in Medicine and Biology Society | 2013

Decoding the evolving grasping gesture from electroencephalographic (EEG) activity

Harshavardhan Agashe; Jose L. Contreras-Vidal

Shared control is emerging as a likely strategy for controlling neuroprosthetic devices, in which users specify high level goals but the low-level implementation is carried out by the machine. In this context, predicting the discrete goal is necessary. Although grasping various objects is critical in determining independence in the daily life of amputees, decoding of different grasp types from noninvasively recorded brain activity has not been investigated. Here we show results suggesting electroencephalography (EEG) is a feasible modality to extract information on grasp types from the user's brain activity. We found that the information about the intended grasp increases over the grasping movement, and is significantly greater than chance up to 200 ms before movement onset.
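
The time-resolved nature of this result (grasp information growing over the movement) suggests a sliding-window classification analysis. A sketch under stated assumptions: grasp type is classified from window-averaged EEG amplitudes in successive windows and compared against chance; the trial counts, window size, and classifier choice are illustrative, not the paper's.

```python
# Sketch: time-resolved grasp-type classification from EEG windows.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
n_trials, n_channels, n_samples = 120, 32, 300   # trials x channels x time, synthetic
eeg = rng.standard_normal((n_trials, n_channels, n_samples))
grasp_type = rng.integers(0, 3, n_trials)        # e.g., 3 hypothetical grasp classes

window = 50  # samples per analysis window
for start in range(0, n_samples - window + 1, window):
    X = eeg[:, :, start:start + window].mean(axis=2)  # window-averaged amplitude
    acc = cross_val_score(LinearDiscriminantAnalysis(), X, grasp_type, cv=5).mean()
    print(f"window {start}-{start + window}: accuracy = {acc:.2f} (chance ~ 0.33)")
```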


International Conference of the IEEE Engineering in Medicine and Biology Society | 2010

Movement decoding from noninvasive neural signals

Jose L. Contreras-Vidal; Trent J. Bradberry; Harshavardhan Agashe

It is generally assumed that noninvasively acquired neural signals contain an insufficient level of information for decoding or reconstructing detailed kinematics of natural, multi-joint limb movements and hand gestures. Here, we review recent findings from our laboratory at the University of Maryland showing that noninvasive scalp electroencephalography (EEG) or magnetoencephalography (MEG) can be used to continuously decode the kinematics of 2D ‘center-out’ drawing, unconstrained 3D ‘center-out’ reaching and 3D finger gesturing. These findings suggest that these ‘far-field’, extra-cranial neural signals contain rich information about the neural representation of movement at the macroscale, and thus these neural representations provide alternative methods for developing noninvasive brain-machine interfaces with wide-ranging clinical relevance and for understanding functional and pathological brain states at various stages of development and aging.


Progress in Brain Research | 2016

Multisession, noninvasive closed-loop neuroprosthetic control of grasping by upper limb amputees

Harshavardhan Agashe; Andrew Y. Paek; Jose L. Contreras-Vidal

Upper limb amputation results in a severe reduction in the quality of life of affected individuals due to their inability to easily perform activities of daily living. Brain-machine interfaces (BMIs) that translate grasping intent from the brain's neural activity into prosthetic control may increase the level of natural control currently available in myoelectric prostheses. Current BMI techniques demonstrate accurate arm position and single degree-of-freedom grasp control but are invasive and require daily recalibration. In this study we tested if transradial amputees (A1 and A2) could control grasp preshaping in a prosthetic device using a noninvasive electroencephalography (EEG)-based closed-loop BMI system. Participants attempted to grasp presented objects by controlling two grasping synergies, in 12 sessions performed over 5 weeks. Prior to closed-loop control, the first six sessions included a decoder calibration phase using action observation by the participants; thereafter, the decoder was fixed to examine neuroprosthetic performance in the absence of decoder recalibration. The ability of participants to control the prosthetic was measured by the success rate of grasping, i.e., the percentage of trials within a session in which presented objects were successfully grasped. Participant A1 maintained a steady success rate (63±3%) across sessions (significantly above chance [41±5%] for 11 sessions). Participant A2, who was under the influence of pharmacological treatment for depression, hormone imbalance, pain management (for phantom pain as well as shoulder joint inflammation), and drug dependence, achieved a success rate of 32±2% across sessions (significantly above chance [27±5%] in only two sessions). EEG signal quality was stable across sessions, but the decoders created during the first six sessions showed variation, indicating that EEG features relevant to decoding at a smaller timescale (100 ms) may not be stable. Overall, our results show that (a) an EEG-based BMI for grasping is a feasible strategy for further investigation of prosthetic control by amputees, and (b) factors that may affect brain activity such as medication need further examination to improve accuracy and stability of BMI performance.
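
The session-level performance measure above (success rate tested against a chance level) can be sketched as follows. The trial count, the simulated outcomes, and the use of a binomial test are illustrative assumptions; the chance level is taken from the figure reported for participant A1.

```python
# Sketch: success rate of grasping in one session, tested against chance.
import numpy as np
from scipy.stats import binomtest

rng = np.random.default_rng(5)
chance = 0.41                     # chance success rate reported for A1
outcomes = rng.random(60) < 0.63  # simulated session: 60 trials at ~63% success

successes = int(outcomes.sum())
rate = successes / outcomes.size
test = binomtest(successes, outcomes.size, p=chance, alternative="greater")
print(f"success rate = {rate:.0%}, p = {test.pvalue:.3f} vs chance {chance:.0%}")
```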


International IEEE/EMBS Conference on Neural Engineering | 2013

Observation-based calibration of brain-machine interfaces for grasping

Harshavardhan Agashe; Jose L. Contreras-Vidal

Brain-machine interfaces (BMIs) are increasingly being used in rehabilitation research to improve the quality of life of clinical populations. Current BMI technology allows us to control, with a high level of accuracy, the positioning of robotic hands in space. We have shown previously that it is possible to decode the dexterous movements of fingers during grasping, from noninvasively recorded electroencephalographic (EEG) activity. Due to the absence of overt movement in clinical subjects with impaired hand function, however, it is not possible to construct decoder models directly by simultaneously recording brain activity and kinematics. The mirror neuron system is activated in a similar fashion during both overt movements and observing movements performed by other agents. Here, we investigate action-observation as a strategy to calibrate decoders for grasping in human subjects. Subjects observed while a robotic hand performed grasping movements, and decoder models were calibrated using the EEG activity of the subjects and the kinematics of the robotic hand. Decoding accuracy was tested on unseen data, in an 8-fold cross-validation scheme, as the correlation coefficient between the predicted and actual trajectories. High decoding accuracies were obtained (r = 0.70 ± 0.07), demonstrating the feasibility of using action-observation as a calibration technique for decoding grasping movements.
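
A hedged sketch of the observation-based calibration loop: a linear decoder is fit on the subject's EEG (recorded while watching the robotic hand) against the robot's kinematics, then scored by 8-fold cross-validated correlation. The ridge regularization, feature dimensions, and synthetic data are assumptions, not the paper's pipeline.

```python
# Sketch: calibrate a decoder from observation-period EEG vs. robot kinematics.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import KFold
from scipy.stats import pearsonr

rng = np.random.default_rng(6)
eeg = rng.standard_normal((4000, 32))  # observer's EEG features, synthetic
robot_kinematics = eeg[:, :5].sum(axis=1) + rng.standard_normal(4000)

rs = []
for train, test in KFold(n_splits=8).split(eeg):
    model = Ridge(alpha=1.0).fit(eeg[train], robot_kinematics[train])
    rs.append(pearsonr(model.predict(eeg[test]), robot_kinematics[test])[0])
print(f"decoding accuracy: r = {np.mean(rs):.2f} ± {np.std(rs):.2f}")
```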


IEEE Pulse | 2013

A clinical roadmap for brain-neural machine interfaces: trainees' perspectives on the 2013 international workshop

Harshavardhan Agashe; Nikunj A. Bhagat; Andrew Y. Paek; Thomas C. Bulea

Brain-neural machine interfaces (BNMIs) are systems that allow a user to control an artificial device, such as a computer cursor or a robotic limb, through imagined movements that are measured as neural activity. They provide the potential to restore mobility for those with motor deficiencies caused by stroke, spinal cord injury, or limb amputations. Such systems would have been considered a topic of science fiction a few decades ago but are now being increasingly developed in both research and industry. Workers in this area are charged with fabricating BNMIs that are safe, effective, easy to use, and affordable for clinical populations.

Collaboration


Dive into Harshavardhan Agashe's collaboration.
