
Publication


Featured research published by Matthew S. Fifer.


Journal of Neural Engineering | 2010

Electrocorticographic amplitude predicts finger positions during slow grasping motions of the hand

Soumyadipta Acharya; Matthew S. Fifer; Heather L. Benz; Nathan E. Crone; Nitish V. Thakor

Four human subjects undergoing subdural electrocorticography for epilepsy surgery engaged in a range of finger and hand movements. We observed that the amplitudes of the low-pass filtered electrocorticogram (ECoG), also known as the local motor potential (LMP), over specific peri-Rolandic electrodes were correlated (p < 0.001) with the position of individual fingers as the subjects engaged in slow and deliberate grasping motions. A generalized linear model (GLM) of the LMP amplitudes from those electrodes yielded predictions for positions of the fingers that had a strong congruence with the actual finger positions (correlation coefficient, r; median = 0.51, maximum = 0.91), during displacements of up to 10 cm at the fingertips. For all the subjects, decoding filters trained on data from any given session were remarkably robust in their prediction performance across multiple sessions and days, and were invariant with respect to changes in wrist angle, elbow flexion and hand placement across these sessions (median r = 0.52, maximum r = 0.86). Furthermore, a reasonable prediction accuracy for grasp aperture was achievable with as few as three electrodes in all subjects (median r = 0.49; maximum r = 0.90). These results provide further evidence for the feasibility of robust and practical ECoG-based control of finger movements in upper extremity prosthetics.
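
The decoding pipeline described above (low-pass LMP features feeding a linear Gaussian GLM) can be sketched as follows. This is a minimal illustration on synthetic data; the 2 Hz cutoff, array shapes, and use of plain linear regression are assumptions, not the paper's exact parameters.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt
from sklearn.linear_model import LinearRegression

fs = 1000  # Hz; hypothetical sampling rate
rng = np.random.default_rng(0)

# Placeholder data: 60 s of 16-channel ECoG and one finger-position trace.
# Random noise here; real recordings carry movement-related structure.
ecog = rng.standard_normal((60 * fs, 16))
finger_pos = np.cumsum(rng.standard_normal(60 * fs)) * 0.001

# Local motor potential (LMP): low-pass filtered ECoG amplitude.
sos = butter(2, 2, btype="low", fs=fs, output="sos")  # 2 Hz cutoff assumed
lmp = sosfiltfilt(sos, ecog, axis=0)

# Gaussian GLM (here, ordinary linear regression) from LMP to finger position.
model = LinearRegression().fit(lmp, finger_pos)
r = np.corrcoef(model.predict(lmp), finger_pos)[0, 1]
print(f"in-sample correlation r = {r:.2f}")  # near zero on random data
```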


IEEE Transactions on Neural Systems and Rehabilitation Engineering | 2014

Demonstration of a Semi-Autonomous Hybrid Brain–Machine Interface Using Human Intracranial EEG, Eye Tracking, and Computer Vision to Control a Robotic Upper Limb Prosthetic

David P. McMullen; Guy Hotson; Kapil D. Katyal; Brock A. Wester; Matthew S. Fifer; Timothy G. McGee; Andrew L. Harris; Matthew S. Johannes; R. Jacob Vogelstein; Alan Ravitz; William S. Anderson; Nitish V. Thakor; Nathan E. Crone

To increase the ability of brain-machine interfaces (BMIs) to control advanced prostheses such as the modular prosthetic limb (MPL), we are developing a novel system: the Hybrid Augmented Reality Multimodal Operation Neural Integration Environment (HARMONIE). This system utilizes hybrid input, supervisory control, and intelligent robotics to allow users to identify an object (via eye tracking and computer vision) and initiate (via brain-control) a semi-autonomous reach-grasp-and-drop of the object by the MPL. Sequential iterations of HARMONIE were tested in two pilot subjects implanted with electrocorticographic (ECoG) and depth electrodes within motor areas. The subjects performed the complex task in 71.4% (20/28) and 67.7% (21/31) of trials after minimal training. Balanced accuracy for detecting movements was 91.1% and 92.9%, significantly greater than chance accuracies (p < 0.05). After BMI-based initiation, the MPL completed the entire task 100% (one object) and 70% (three objects) of the time. The MPL took approximately 12.2 s for task completion after system improvements implemented for the second subject. Our hybrid-BMI design prevented all but one baseline false positive from initiating the system. The novel approach demonstrated in this proof-of-principle study, using hybrid input, supervisory control, and intelligent robotics, addresses limitations of current BMIs.


Journal of Neural Engineering | 2016

Individual finger control of a modular prosthetic limb using high-density electrocorticography in a human subject

Guy Hotson; David P. McMullen; Matthew S. Fifer; Matthew S. Johannes; Kapil D. Katyal; Matthew P. Para; Robert S. Armiger; William S. Anderson; Nitish V. Thakor; Brock A. Wester; Nathan E. Crone

OBJECTIVE We used native sensorimotor representations of fingers in a brain-machine interface (BMI) to achieve immediate online control of individual prosthetic fingers. APPROACH Using high gamma responses recorded with a high-density electrocorticography (ECoG) array, we rapidly mapped the functional anatomy of cued finger movements. We used these cortical maps to select ECoG electrodes for a hierarchical linear discriminant analysis classification scheme to predict: (1) if any finger was moving, and, if so, (2) which digit was moving. To account for sensory feedback, we also mapped the spatiotemporal activation elicited by vibrotactile stimulation. Finally, we used this prediction framework to provide immediate online control over individual fingers of the Johns Hopkins University Applied Physics Laboratory modular prosthetic limb. MAIN RESULTS The balanced classification accuracy for detection of movements during the online control session was 92% (chance: 50%). At the onset of movement, finger classification was 76% (chance: 20%), and 88% (chance: 25%) if the pinky and ring finger movements were coupled. Balanced accuracy of fully flexing the cued finger was 64%, and 77% when pinky and ring commands were combined. Offline decoding yielded a peak finger decoding accuracy of 96.5% (chance: 20%) when using an optimized selection of electrodes. Offline analysis demonstrated significant finger-specific activations throughout sensorimotor cortex. Activations either prior to movement onset or during sensory feedback led to discriminable finger control. SIGNIFICANCE Our results demonstrate the ability of ECoG-based BMIs to leverage the native functional anatomy of sensorimotor cortical populations to immediately control individual finger movements in real time.
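
A minimal sketch of the hierarchical classification scheme described above: one LDA stage detects whether any finger is moving, and a second stage, trained only on movement trials, identifies the digit. The feature dimensions and injected class structure below are synthetic placeholders, not the study's data.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)

# Placeholder high gamma features (n_trials x n_electrodes) with labels
# 0 = rest and 1-5 = thumb through pinky; structure injected for the demo.
X = rng.standard_normal((300, 20))
y = rng.integers(0, 6, 300)
X[y > 0] += 0.5 * y[y > 0, None]

# Stage 1: detect whether any finger is moving (movement vs rest).
detector = LinearDiscriminantAnalysis().fit(X, (y > 0).astype(int))

# Stage 2: classify which digit is moving, trained on movement trials only.
move = y > 0
classifier = LinearDiscriminantAnalysis().fit(X[move], y[move])

def predict(x):
    """Hierarchical prediction: 0 (no movement) or a finger index 1-5."""
    if detector.predict(x[None])[0] == 0:
        return 0
    return int(classifier.predict(x[None])[0])

print(predict(X[0]))
```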


IEEE Pulse | 2012

Toward Electrocorticographic Control of a Dexterous Upper Limb Prosthesis: Building Brain-Machine Interfaces

Matthew S. Fifer; Soumyadipta Acharya; Heather L. Benz; Mohsen Mollazadeh; Nathan E. Crone; Nitish V. Thakor

This paper presents an ECoG-based system for controlling the MPL, in which patients implanted with ECoG electrode grids for clinical seizure mapping were asked to perform various recorded finger or grasp movements.


IEEE Transactions on Neural Systems and Rehabilitation Engineering | 2014

Simultaneous Neural Control of Simple Reaching and Grasping With the Modular Prosthetic Limb Using Intracranial EEG

Matthew S. Fifer; Guy Hotson; Brock A. Wester; David P. McMullen; Yujing Wang; Matthew S. Johannes; Kapil D. Katyal; John B. Helder; Matthew P. Para; R. Jacob Vogelstein; William S. Anderson; Nitish V. Thakor; Nathan E. Crone

Intracranial electroencephalographic (iEEG) signals from two human subjects were used to achieve simultaneous neural control of reaching and grasping movements with the Johns Hopkins University Applied Physics Lab (JHU/APL) Modular Prosthetic Limb (MPL), a dexterous robotic prosthetic arm. We performed functional mapping of high gamma activity while the subject made reaching and grasping movements to identify task-selective electrodes. Independent, online control of reaching and grasping was then achieved using high gamma activity from a small subset of electrodes with a model trained on short blocks of reaching and grasping with no further adaptation. Classification accuracy did not decline (p < 0.05, one-way ANOVA) over three blocks of testing in either subject. Mean classification accuracy during independently executed overt reach and grasp movements for (Subject 1, Subject 2) were (0.85, 0.81) and (0.80, 0.96), respectively, and during simultaneous execution they were (0.83, 0.88) and (0.58, 0.88), respectively. Our models leveraged knowledge of the subjects' individual functional neuroanatomy for reaching and grasping movements, allowing rapid acquisition of control in a time-sensitive clinical setting. We demonstrate the potential feasibility of verifying functionally meaningful iEEG-based control of the MPL prior to chronic implantation, during which additional capabilities of the MPL might be exploited with further training.


Neurology | 2016

Spatial-temporal functional mapping of language at the bedside with electrocorticography

Yujing Wang; Matthew S. Fifer; Adeen Flinker; Anna Korzeniewska; Mackenzie C. Cervenka; William S. Anderson; Dana Boatman-Reich; Nathan E. Crone

Objective: To investigate the feasibility and clinical utility of using passive electrocorticography (ECoG) for online spatial-temporal functional mapping (STFM) of language cortex in patients being monitored for epilepsy surgery. Methods: We developed and tested an online system that exploits ECoG's temporal resolution to display the evolution of statistically significant high gamma (70–110 Hz) responses across all recording sites activated by a discrete cognitive task. We illustrate how this spatial-temporal evolution can be used to study the function of individual recording sites engaged during different language tasks, and how this approach can be particularly useful for mapping eloquent cortex. Results: Using electrocortical stimulation mapping (ESM) as the clinical gold standard for localizing language cortex, the average sensitivity and specificity of online STFM across 7 patients were 69.9% and 83.5%, respectively. Moreover, relative to regions of interest where discrete cortical lesions have most reliably caused language impairments in the literature, the sensitivity of STFM was significantly greater than that of ESM, while its specificity was also greater than that of ESM, though not significantly so. Conclusions: This study supports the feasibility and clinical utility of online STFM for mapping human language function, particularly under clinical circumstances in which time is limited and comprehensive ESM is impractical.
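
The core of the online STFM display (statistically significant task-related high gamma responses evolving over time) can be approximated as below. The band edges match the abstract; the 100 ms stepping, baseline definition, and uncorrected t-test are simplifying assumptions.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert
from scipy.stats import ttest_ind

fs = 1000  # Hz
rng = np.random.default_rng(2)

# Placeholder trials for one recording site: 1 s baseline + 1 s post-cue.
trials = rng.standard_normal((40, 2 * fs))

# High gamma (70-110 Hz) analytic amplitude via band-pass + Hilbert.
sos = butter(4, [70, 110], btype="band", fs=fs, output="sos")
hg = np.abs(hilbert(sosfiltfilt(sos, trials, axis=1), axis=1))

# Flag post-cue windows whose mean amplitude differs from baseline.
baseline = hg[:, :fs].mean(axis=1)
for t0 in range(fs, 2 * fs, 100):  # 100 ms steps (an assumption)
    window = hg[:, t0:t0 + 100].mean(axis=1)
    _, p = ttest_ind(window, baseline)  # uncorrected; real use needs correction
    print(f"{t0 - fs:4d} ms: p = {p:.3f}{' *' if p < 0.05 else ''}")
```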


International Conference of the IEEE Engineering in Medicine and Biology Society | 2011

Asynchronous decoding of grasp aperture from human ECoG during a reach-to-grasp task

Matthew S. Fifer; Mohsen Mollazadeh; Soumyadipta Acharya; Nitish V. Thakor; Nathan E. Crone

Recent studies in primate neurophysiology have focused on decoding multi-joint kinematics from single unit and local field potential recordings. However, the extent to which these results can be generalized to human subjects is not known. We have recorded simultaneous electrocorticographic (ECoG) signals and hand kinematics in a human subject performing reach-grasp-hold of objects varying in shape and size. Spectral features in several gamma bands (30–50 Hz, 70–100 Hz, and 100–150 Hz) were able to predict the time course of grasp aperture with high correlation (max r = 0.80) using as few as one ECoG feature from a single electrode (max r for a single feature = 0.75) in single trials without prior knowledge of task timing. These results suggest that the population activity captured with ECoG contains information about coordinated finger movements that potentially can be exploited to control advanced upper limb neuroprosthetics.
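
A sketch of the band-power feature extraction implied above, assuming synthetic single-channel data and a 250 ms smoothing window (the actual windowing in the paper may differ). The resulting features would feed a linear decoder as in the other studies listed here.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

fs = 1000  # Hz
rng = np.random.default_rng(3)
ecog = rng.standard_normal(30 * fs)  # one placeholder ECoG channel, 30 s

# Band-limited power in the three gamma bands named in the abstract.
bands = [(30, 50), (70, 100), (100, 150)]
features = []
for lo, hi in bands:
    sos = butter(4, [lo, hi], btype="band", fs=fs, output="sos")
    power = sosfiltfilt(sos, ecog) ** 2
    # Smooth with a 250 ms moving average (window length is an assumption).
    win = fs // 4
    features.append(np.convolve(power, np.ones(win) / win, mode="same"))

features = np.stack(features, axis=1)  # (n_samples, n_bands), decoder input
print(features.shape)
```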


PLOS ONE | 2014

Coarse electrocorticographic decoding of ipsilateral reach in patients with brain lesions

Guy Hotson; Matthew S. Fifer; Soumyadipta Acharya; Heather L. Benz; William S. Anderson; Nitish V. Thakor; Nathan E. Crone

In patients with unilateral upper limb paralysis from strokes and other brain lesions, strategies for functional recovery may eventually include brain-machine interfaces (BMIs) using control signals from residual sensorimotor systems in the damaged hemisphere. When voluntary movements of the contralateral limb are not possible due to brain pathology, initial training of such a BMI may require use of the unaffected ipsilateral limb. We conducted an offline investigation of the feasibility of decoding ipsilateral upper limb movements from electrocorticographic (ECoG) recordings in three patients with different lesions of sensorimotor systems associated with upper limb control. We found that the first principal component (PC) of unconstrained, naturalistic reaching movements of the upper limb could be decoded from ipsilateral ECoG using a linear model. ECoG signal features yielding the best decoding accuracy were different across subjects. Performance saturated with very few input features. Decoding performances of 0.77, 0.73, and 0.66 (median Pearson's r between the predicted and actual first PC of movement using nine signal features) were achieved in the three subjects. The performance achieved here with small numbers of electrodes and computationally simple decoding algorithms suggests that it may be possible to control a BMI using ECoG recorded from damaged sensorimotor brain systems.
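
The decoding pipeline described above, reducing reach kinematics to their first principal component and regressing it on a handful of ECoG signal features, might look like this. On the random placeholder data below, r is near zero; real features carry the structure reported above.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(4)

# Placeholder 3-D hand trajectory and nine ECoG signal features, resampled
# to a common rate. Random here, so the decoder will find little structure.
kinematics = rng.standard_normal((5000, 3)).cumsum(axis=0)
ecog_feats = rng.standard_normal((5000, 9))

# Reduce the reach trajectory to its first principal component ...
pc1 = PCA(n_components=1).fit_transform(kinematics).ravel()

# ... and decode it from the ECoG features with a linear model.
model = LinearRegression().fit(ecog_feats, pc1)
r = np.corrcoef(model.predict(ecog_feats), pc1)[0, 1]
print(f"in-sample Pearson's r = {r:.2f}")
```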


International IEEE/EMBS Conference on Neural Engineering | 2013

HARMONIE: A multimodal control framework for human assistive robotics

Kapil D. Katyal; Matthew S. Johannes; Timothy G. McGee; Andrew J. Harris; Robert S. Armiger; Alex H. Firpi; David P. McMullen; Guy Hotson; Matthew S. Fifer; Nathan E. Crone; R. Jacob Vogelstein; Brock A. Wester

Effective user control of highly dexterous robotic assistive devices requires intuitive and natural modalities. Although surgically implanted brain-computer interfaces (BCIs) strive to achieve this, a number of non-invasive engineering solutions may provide a quicker path to patient use by eliminating surgical implantation. We present the development of a semi-autonomous control system that utilizes computer vision, prosthesis feedback, effector-centric device control, smooth movement trajectories, and appropriate hand conformations to interact with objects of interest. Users can direct a prosthetic limb through an intuitive graphical user interface to complete multi-stage tasks using patient-appropriate combinations of control inputs such as eye tracking, conventional prosthetic controls/joysticks, surface electromyography (sEMG) signals, and neural interfaces (ECoG, EEG). Aligned with activities of daily living (ADL), these tasks include directing the prosthetic to specific locations or objects, grasping of objects by modulating hand conformation, and action upon grasped objects such as self-feeding. This Hybrid Augmented Reality Multimodal Operation Neural Integration Environment (HARMONIE) semi-autonomous control system lowers the user's cognitive load, leaving the bulk of command and control of the device to the computer. This flexible and intuitive control system could serve patient populations ranging from wheelchair-bound quadriplegics to upper-limb amputees.
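
One way to picture the supervisory control idea is as a small state machine: the user supplies only high-level events while the system sequences the reach-grasp-drop autonomously. The states and event names below are hypothetical illustrations, not HARMONIE's actual interface.

```python
from enum import Enum, auto

class State(Enum):
    IDLE = auto()
    TARGET_SELECTED = auto()
    REACHING = auto()
    GRASPING = auto()
    DROPPING = auto()

# Hypothetical supervisory transitions: the user contributes only the gaze
# selection and the neural "go"; the controller sequences everything else.
TRANSITIONS = {
    (State.IDLE, "gaze_select"): State.TARGET_SELECTED,
    (State.TARGET_SELECTED, "neural_go"): State.REACHING,
    (State.REACHING, "reach_done"): State.GRASPING,
    (State.GRASPING, "grasp_done"): State.DROPPING,
    (State.DROPPING, "drop_done"): State.IDLE,
}

def step(state, event):
    """Advance the state machine; unknown events leave the state unchanged."""
    return TRANSITIONS.get((state, event), state)

state = State.IDLE
for event in ["gaze_select", "neural_go", "reach_done", "grasp_done", "drop_done"]:
    state = step(state, event)
    print(f"{event} -> {state.name}")
```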


NeuroImage | 2016

Cortical subnetwork dynamics during human language tasks

Maxwell J. Collard; Matthew S. Fifer; Heather L. Benz; David P. McMullen; Yujing Wang; Griffin Milsap; Anna Korzeniewska; Nathan E. Crone

Language tasks require the coordinated activation of multiple subnetworks: groups of related cortical interactions involved in specific components of task processing. Although electrocorticography (ECoG) has sufficient temporal and spatial resolution to capture the dynamics of event-related interactions between cortical sites, it is difficult to decompose these complex spatiotemporal patterns into functionally discrete subnetworks without explicit knowledge of each subnetwork's timing. We hypothesized that subnetworks corresponding to distinct components of task-related processing could be identified as groups of interactions with co-varying strengths. In this study, five subjects implanted with ECoG grids over language areas performed word repetition and picture naming. We estimated the interaction strength between each pair of electrodes during each task using a time-varying dynamic Bayesian network (tvDBN) model constructed from the power of high gamma (70–110 Hz) activity, a surrogate for population firing rates. We then reduced the dimensionality of this model using principal component analysis (PCA) to identify groups of interactions with co-varying strengths, which we term functional network components (FNCs). This data-driven technique estimates both the weight of each interaction's contribution to a particular subnetwork and the temporal profile of each subnetwork's activation during the task. We found FNCs with temporal and anatomical features consistent with articulatory preparation in both tasks, and with auditory and visual processing in the word repetition and picture naming tasks, respectively. These FNCs were highly consistent between subjects with similar electrode placement, and were robust enough to be characterized in single trials. Furthermore, the interaction patterns uncovered by FNC analysis correlated well with recent literature suggesting important functional-anatomical distinctions between processing external and self-produced speech. Our results demonstrate that subnetwork decomposition of event-related cortical interactions is a powerful paradigm for interpreting the rich dynamics of large-scale, distributed cortical networks during human cognitive tasks.
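
The FNC extraction step described above, PCA applied to time-varying interaction strengths so that loadings give interaction weights and scores give each subnetwork's activation time course, can be sketched as follows on synthetic data with one injected co-varying group.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(5)

# Placeholder time-varying interaction strengths: n_time samples for each
# electrode pair, flattened to n_pairs columns.
n_time, n_pairs = 400, 56
interactions = rng.standard_normal((n_time, n_pairs))

# Inject one group of co-varying interactions to mimic a task subnetwork.
profile = np.sin(np.linspace(0, np.pi, n_time))
interactions[:, :8] += 3 * profile[:, None]

# PCA over interaction strengths: each component is a candidate FNC.
pca = PCA(n_components=3).fit(interactions)
weights = pca.components_[0]                     # interaction weights
activation = pca.transform(interactions)[:, 0]   # temporal activation profile
print(weights.shape, activation.shape)
```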

Collaboration


Dive into Matthew S. Fifer's collaborations.

Top Co-Authors

Nitish V. Thakor

National University of Singapore

Guy Hotson

Johns Hopkins University
