Network


Latest external collaboration at the country level. Dive into details by clicking on the dots.

Hotspot


Dive into the research topics where Guy Hotson is active.

Publication


Featured research published by Guy Hotson.


IEEE Transactions on Neural Systems and Rehabilitation Engineering | 2014

Demonstration of a Semi-Autonomous Hybrid Brain–Machine Interface Using Human Intracranial EEG, Eye Tracking, and Computer Vision to Control a Robotic Upper Limb Prosthetic

David P. McMullen; Guy Hotson; Kapil D. Katyal; Brock A. Wester; Matthew S. Fifer; Timothy G. McGee; Andrew L. Harris; Matthew S. Johannes; R. Jacob Vogelstein; Alan Ravitz; William S. Anderson; Nitish V. Thakor; Nathan E. Crone

To increase the ability of brain-machine interfaces (BMIs) to control advanced prostheses such as the modular prosthetic limb (MPL), we are developing a novel system: the Hybrid Augmented Reality Multimodal Operation Neural Integration Environment (HARMONIE). This system utilizes hybrid input, supervisory control, and intelligent robotics to allow users to identify an object (via eye tracking and computer vision) and initiate (via brain-control) a semi-autonomous reach-grasp-and-drop of the object by the MPL. Sequential iterations of HARMONIE were tested in two pilot subjects implanted with electrocorticographic (ECoG) and depth electrodes within motor areas. The subjects performed the complex task in 71.4% (20/28) and 67.7% (21/31) of trials after minimal training. Balanced accuracy for detecting movements was 91.1% and 92.9%, significantly greater than chance accuracies (p < 0.05). After BMI-based initiation, the MPL completed the entire task 100% (one object) and 70% (three objects) of the time. The MPL took approximately 12.2 s for task completion after system improvements implemented for the second subject. Our hybrid-BMI design prevented all but one baseline false positive from initiating the system. The novel approach demonstrated in this proof-of-principle study, using hybrid input, supervisory control, and intelligent robotics, addresses limitations of current BMIs.
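The supervisory-control flow described in this abstract — the user selects an object via gaze, a single brain-derived trigger initiates the action, and the robot executes the rest autonomously — can be sketched as a simple state machine. The states, inputs, and transitions below are illustrative assumptions, not the HARMONIE implementation.

```python
# Illustrative sketch of the supervisory-control flow: eye tracking plus
# computer vision select the object, a binary brain-derived signal starts the
# action, and the limb autonomously completes the reach-grasp-and-drop.
from enum import Enum, auto

class State(Enum):
    IDLE = auto()
    OBJECT_SELECTED = auto()
    EXECUTING = auto()
    DONE = auto()

def step(state, gaze_on_object, movement_detected):
    """One supervisory-control update. The BMI supplies only the binary
    'movement_detected' trigger; low-level motion is left to the robot."""
    if state is State.IDLE and gaze_on_object:
        return State.OBJECT_SELECTED   # eye tracking + vision pick the target
    if state is State.OBJECT_SELECTED and movement_detected:
        return State.EXECUTING         # brain-control initiates the action
    if state is State.EXECUTING:
        return State.DONE              # semi-autonomous reach-grasp-and-drop
    return state

state = State.IDLE
for gaze, move in [(True, False), (False, True), (False, False)]:
    state = step(state, gaze, move)
print(state)  # State.DONE
```

The design point this illustrates is that the neural decoder only needs to make one reliable binary decision per task, which is why the paper's movement-detection accuracy (not continuous trajectory decoding) is the key performance number.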


Journal of Neural Engineering | 2016

Individual finger control of a modular prosthetic limb using high-density electrocorticography in a human subject

Guy Hotson; David P. McMullen; Matthew S. Fifer; Matthew S. Johannes; Kapil D. Katyal; Matthew P. Para; Robert S. Armiger; William S. Anderson; Nitish V. Thakor; Brock A. Wester; Nathan E. Crone

OBJECTIVE We used native sensorimotor representations of fingers in a brain-machine interface (BMI) to achieve immediate online control of individual prosthetic fingers. APPROACH Using high gamma responses recorded with a high-density electrocorticography (ECoG) array, we rapidly mapped the functional anatomy of cued finger movements. We used these cortical maps to select ECoG electrodes for a hierarchical linear discriminant analysis classification scheme to predict: (1) if any finger was moving, and, if so, (2) which digit was moving. To account for sensory feedback, we also mapped the spatiotemporal activation elicited by vibrotactile stimulation. Finally, we used this prediction framework to provide immediate online control over individual fingers of the Johns Hopkins University Applied Physics Laboratory modular prosthetic limb. MAIN RESULTS The balanced classification accuracy for detection of movements during the online control session was 92% (chance: 50%). At the onset of movement, finger classification was 76% (chance: 20%), and 88% (chance: 25%) if the pinky and ring finger movements were coupled. Balanced accuracy of fully flexing the cued finger was 64%, and 77% when pinky and ring commands were combined. Offline decoding yielded a peak finger decoding accuracy of 96.5% (chance: 20%) when using an optimized selection of electrodes. Offline analysis demonstrated significant finger-specific activations throughout sensorimotor cortex. Activations either prior to movement onset or during sensory feedback led to discriminable finger control. SIGNIFICANCE Our results demonstrate the ability of ECoG-based BMIs to leverage the native functional anatomy of sensorimotor cortical populations to immediately control individual finger movements in real time.
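The hierarchical classification scheme described above — first detect whether any finger is moving, then classify which one — can be sketched as two stacked LDA classifiers. The feature values below are synthetic stand-ins for per-electrode high-gamma power, and all dimensions, labels, and helper names are illustrative assumptions rather than the authors' published pipeline.

```python
# Sketch of a hierarchical LDA scheme: stage 1 detects movement vs. rest,
# stage 2 (run only when stage 1 fires) identifies which finger moved.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
n_electrodes = 16
fingers = ["thumb", "index", "middle", "ring", "pinky"]

# Synthetic training data: rest trials plus trials for each finger, where
# each finger preferentially drives a different "electrode".
X, move_labels, finger_labels = [], [], []
for trial in range(200):
    finger = trial % (len(fingers) + 1)  # 0 = rest, 1..5 = fingers
    x = rng.normal(0.0, 1.0, n_electrodes)
    if finger > 0:
        x[finger - 1] += 3.0  # finger-specific high-gamma increase
        x += 1.0              # broad movement-related increase
    X.append(x)
    move_labels.append(int(finger > 0))
    finger_labels.append(finger)
X = np.array(X)
move_labels = np.array(move_labels)
finger_labels = np.array(finger_labels)

# Stage 1: movement detector (move vs. rest).
detector = LinearDiscriminantAnalysis().fit(X, move_labels)
# Stage 2: finger classifier, trained only on movement trials.
moving = move_labels == 1
classifier = LinearDiscriminantAnalysis().fit(X[moving], finger_labels[moving])

def predict(x):
    """Hierarchical prediction: 'rest' or the name of the moving finger."""
    if detector.predict(x[None, :])[0] == 0:
        return "rest"
    return fingers[classifier.predict(x[None, :])[0] - 1]
```

Splitting detection from identification matches the two chance levels quoted in the abstract: 50% for the binary move/rest decision and 20% for the five-way finger decision.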


IEEE Transactions on Neural Systems and Rehabilitation Engineering | 2014

Simultaneous Neural Control of Simple Reaching and Grasping With the Modular Prosthetic Limb Using Intracranial EEG

Matthew S. Fifer; Guy Hotson; Brock A. Wester; David P. McMullen; Yujing Wang; Matthew S. Johannes; Kapil D. Katyal; John B. Helder; Matthew P. Para; R. Jacob Vogelstein; William S. Anderson; Nitish V. Thakor; Nathan E. Crone

Intracranial electroencephalographic (iEEG) signals from two human subjects were used to achieve simultaneous neural control of reaching and grasping movements with the Johns Hopkins University Applied Physics Lab (JHU/APL) Modular Prosthetic Limb (MPL), a dexterous robotic prosthetic arm. We performed functional mapping of high gamma activity while the subject made reaching and grasping movements to identify task-selective electrodes. Independent, online control of reaching and grasping was then achieved using high gamma activity from a small subset of electrodes with a model trained on short blocks of reaching and grasping with no further adaptation. Classification accuracy did not decline (p < 0.05, one-way ANOVA) over three blocks of testing in either subject. Mean classification accuracy during independently executed overt reach and grasp movements for (Subject 1, Subject 2) were (0.85, 0.81) and (0.80, 0.96), respectively, and during simultaneous execution they were (0.83, 0.88) and (0.58, 0.88), respectively. Our models leveraged knowledge of the subjects' individual functional neuroanatomy for reaching and grasping movements, allowing rapid acquisition of control in a time-sensitive clinical setting. We demonstrate the potential feasibility of verifying functionally meaningful iEEG-based control of the MPL prior to chronic implantation, during which additional capabilities of the MPL might be exploited with further training.


PLOS ONE | 2014

Coarse electrocorticographic decoding of ipsilateral reach in patients with brain lesions

Guy Hotson; Matthew S. Fifer; Soumyadipta Acharya; Heather L. Benz; William S. Anderson; Nitish V. Thakor; Nathan E. Crone

In patients with unilateral upper limb paralysis from strokes and other brain lesions, strategies for functional recovery may eventually include brain-machine interfaces (BMIs) using control signals from residual sensorimotor systems in the damaged hemisphere. When voluntary movements of the contralateral limb are not possible due to brain pathology, initial training of such a BMI may require use of the unaffected ipsilateral limb. We conducted an offline investigation of the feasibility of decoding ipsilateral upper limb movements from electrocorticographic (ECoG) recordings in three patients with different lesions of sensorimotor systems associated with upper limb control. We found that the first principal component (PC) of unconstrained, naturalistic reaching movements of the upper limb could be decoded from ipsilateral ECoG using a linear model. ECoG signal features yielding the best decoding accuracy were different across subjects. Performance saturated with very few input features. Decoding performances of 0.77, 0.73, and 0.66 (median Pearson's r between the predicted and actual first PC of movement using nine signal features) were achieved in the three subjects. The performance achieved here with small numbers of electrodes and computationally simple decoding algorithms suggests that it may be possible to control a BMI using ECoG recorded from damaged sensorimotor brain systems.
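The "computationally simple" linear decoding approach above — predict the first kinematic PC from a handful of ECoG signal features and score it with Pearson's r — can be sketched in a few lines. The data here are synthetic; in the study the inputs were spectral/temporal ECoG features and the target was the first PC of recorded reach trajectories. Only the feature count (nine) is taken from the abstract.

```python
# Minimal sketch of linear decoding of a movement PC from a small set of
# signal features, fit by ordinary least squares and scored with Pearson's r.
import numpy as np

rng = np.random.default_rng(1)
n_samples, n_features = 500, 9  # nine features, as in the study

# Synthetic "ECoG features" and a latent movement PC they partially encode.
true_weights = rng.normal(0.0, 1.0, n_features)
features = rng.normal(0.0, 1.0, (n_samples, n_features))
movement_pc = features @ true_weights + rng.normal(0.0, 0.5, n_samples)

# Split, fit a linear model by least squares, evaluate on held-out samples.
train, test = slice(0, 400), slice(400, None)
X = np.column_stack([features, np.ones(n_samples)])  # add intercept column
w, *_ = np.linalg.lstsq(X[train], movement_pc[train], rcond=None)
predicted = X[test] @ w
r = np.corrcoef(predicted, movement_pc[test])[0, 1]
print(f"Pearson's r on held-out data: {r:.2f}")
```

With so few parameters, the fit is stable even on short recordings, which is consistent with the paper's observation that performance saturated with very few input features.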


International IEEE/EMBS Conference on Neural Engineering | 2013

HARMONIE: A multimodal control framework for human assistive robotics

Kapil D. Katyal; Matthew S. Johannes; Timothy G. McGee; Andrew J. Harris; Robert S. Armiger; Alex H. Firpi; David P. McMullen; Guy Hotson; Matthew S. Fifer; Nathan E. Crone; R. Jacob Vogelstein; Brock A. Wester

Effective user control of highly dexterous and robotic assistive devices requires intuitive and natural modalities. Although surgically implanted brain-computer interfaces (BCIs) strive to achieve this, a number of non-invasive engineering solutions may provide a quicker path to patient use by eliminating surgical implantation. We present the development of a semi-autonomous control system that utilizes computer vision, prosthesis feedback, effector centric device control, smooth movement trajectories, and appropriate hand conformations to interact with objects of interest. Users can direct a prosthetic limb through an intuitive graphical user interface to complete multi-stage tasks using patient appropriate combinations of control inputs such as eye tracking, conventional prosthetic controls/joysticks, surface electromyography (sEMG) signals, and neural interfaces (ECoG, EEG). Aligned with activities of daily living (ADL), these tasks include directing the prosthetic to specific locations or objects, grasping of objects by modulating hand conformation, and action upon grasped objects such as self-feeding. This Hybrid Augmented Reality Multimodal Operation Neural Integration Environment (HARMONIE) semi-autonomous control system lowers the user's cognitive load, leaving the bulk of command and control of the device to the computer. This flexible and intuitive control system could serve patient populations ranging from wheelchair-bound quadriplegics to upper-limb amputees.


International Conference on Robotics and Automation | 2016

High Precision Neural Decoding of Complex Movement Trajectories Using Recursive Bayesian Estimation With Dynamic Movement Primitives

Guy Hotson; Ryan J. Smith; Adam G. Rouse; Marc H. Schieber; Nitish V. Thakor; Brock A. Wester

Brain-machine interfaces (BMIs) are a rapidly progressing technology with the potential to restore function to victims of severe paralysis via neural control of robotic systems. Great strides have been made in directly mapping a user's cortical activity to control of the individual degrees of freedom of robotic end-effectors. While BMIs have yet to achieve the level of reliability desired for widespread clinical use, environmental sensors (e.g., RGB-D cameras for object detection) and prior knowledge of common movement trajectories hold great potential for improving system performance. Here, we present a novel sensor fusion paradigm for BMIs that capitalizes on information able to be extracted from the environment to greatly improve the performance of control. This was accomplished by using dynamic movement primitives to model the 3-D endpoint trajectories of manipulating various objects. We then used a switching unscented Kalman filter to continuously arbitrate between the 3-D endpoint kinematics predicted by the dynamic movement primitives and control derived from neural signals. We experimentally validated our system by decoding 3-D endpoint trajectories executed by a nonhuman primate manipulating four different objects at various locations. Performance using our system showed a dramatic improvement over using neural signals alone, with median distance between actual and decoded trajectories decreasing from 31.1 to 9.9 cm, and mean correlation increasing from 0.80 to 0.98. Our results indicate that our sensor fusion framework can dramatically increase the fidelity of neural prosthetic trajectory decoding.
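The arbitration idea above can be illustrated with a heavily simplified 1-D sketch: a small bank of candidate trajectory models (straight-line reaches to known object locations stand in for dynamic movement primitives), a recursive Bayesian update of each candidate's probability given noisy neurally-decoded positions, and a fused estimate blending the primitives' prediction with the raw decode. This is not the paper's switching unscented Kalman filter; every quantity below is a synthetic assumption chosen to show the mechanism.

```python
# 1-D sketch of recursive Bayesian arbitration between trajectory models
# ("primitives") and a noisy neural decode.
import numpy as np

rng = np.random.default_rng(2)
targets = np.array([10.0, 20.0, 30.0])  # candidate object locations
true_target = 20.0
n_steps = 50
noise_sd = 4.0                          # neural decode noise

log_post = np.log(np.full(len(targets), 1.0 / len(targets)))
fused = []
for t in range(1, n_steps + 1):
    frac = t / n_steps
    true_pos = frac * true_target                  # true straight-line reach
    neural = true_pos + rng.normal(0.0, noise_sd)  # noisy neural decode

    # Each primitive predicts where the hand should be at this time step.
    predicted = frac * targets
    # Bayesian update: Gaussian likelihood of the decode under each primitive.
    log_post += -0.5 * ((neural - predicted) / noise_sd) ** 2
    log_post -= log_post.max()                     # keep exp() stable
    post = np.exp(log_post)
    post /= post.sum()

    # Fuse: posterior-weighted primitive prediction blended with the decode.
    primitive_estimate = post @ predicted
    fused.append(0.8 * primitive_estimate + 0.2 * neural)

best = targets[np.argmax(post)]
print(f"most likely target: {best}, final fused position: {fused[-1]:.1f}")
```

The fused trajectory is far smoother and more accurate than the raw decode because the primitive bank injects prior knowledge of where reaches tend to go, which is the intuition behind the paper's reported drop in trajectory error.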


Archive | 2017

Brain-Machine Interface Development for Finger Movement Control

Tessy M. Lal; Guy Hotson; Matthew S. Fifer; David P. McMullen; Matthew S. Johannes; Kapil D. Katyal; Matthew P. Para; Robert S. Armiger; William S. Anderson; Nitish V. Thakor; Brock A. Wester; Nathan E. Crone

There have been many developments in brain-machine interfaces (BMI) for controlling upper limb movements such as reaching and grasping. One way to expand the usefulness of BMIs in replacing motor functions for patients with spinal cord injuries and neuromuscular disorders would be to improve the dexterity of upper limb movements performed by including more control of individual finger movements. Many studies have been focusing on understanding the organization of movement control in the sensorimotor cortex of the human brain. Finding the specific mechanisms for neural control of different movements will help focus signal acquisition and processing so as to improve BMI control of complex actions. In a recently published study, we demonstrated, for the first time, online BMI control of individual finger movements using electrocorticography recordings from the hand area of sensorimotor cortex. This study expands the possibilities for combined control of arm movements and more dexterous hand and finger movements.


Archive | 2015

Semi-autonomous Hybrid Brain-Machine Interface

David P. McMullen; Matthew S. Fifer; Brock A. Wester; Guy Hotson; Kapil D. Katyal; Matthew S. Johannes; Timothy G. McGee; Andrew L. Harris; Alan Ravitz; Michael P. McLoughlin; William S. Anderson; Nitish V. Thakor; Nathan E. Crone

Although advanced prosthetic limbs, such as the modular prosthetic limb (MPL), are now capable of mimicking the dexterity of human limbs, brain-machine interfaces (BMIs) are not yet able to take full advantage of their capabilities. To improve BMI control of the MPL, we are developing a semi-autonomous system, the Hybrid Augmented Reality Multimodal Operation Neural Integration Environment (HARMONIE). This system is designed to utilize novel control strategies including hybrid input (adding eye tracking to neural control), supervisory control (decoding high-level patient goals), and intelligent robotics (incorporating computer vision and route planning algorithms). Patients use eye gaze to indicate a desired object that has been recognized by computer vision. They then perform a desired action, such as reaching and grasping, which is decoded and carried out by the MPL via route planning algorithms. Here we present two patients, implanted with electrocorticography (ECoG) and depth electrodes, who controlled the HARMONIE system to perform reach and grasping tasks; in addition, one patient also used the HARMONIE system to simulate self-feeding. This work builds upon prior research to demonstrate the feasibility of using novel control strategies to enable patients to perform a wider variety of activities of daily living (ADLs).


International Conference of the IEEE Engineering in Medicine and Biology Society | 2014

Neuroprosthetic limb control with electrocorticography: approaches and challenges.

Nitish V. Thakor; Matthew S. Fifer; Guy Hotson; Heather L. Benz; Geoffrey I. Newman; Griffin Milsap; Nathan E. Crone

Advanced upper limb prosthetics, such as the Johns Hopkins Applied Physics Lab Modular Prosthetic Limb (MPL), are now available for research and preliminary clinical applications. Research attention has shifted to developing means of controlling these prostheses. Penetrating microelectrode arrays are often used in animal and human models to decode action potentials for cortical control. These arrays may suffer signal loss over the long-term and therefore should not be the only implant type investigated for chronic BMI use. Electrocorticographic (ECoG) signals from electrodes on the cortical surface may provide more stable long-term recordings. Several studies have demonstrated ECoG's potential for decoding cortical activity. As a result, clinical studies are investigating ECoG encoding of limb movement, as well as its use for interfacing with and controlling advanced prosthetic arms. This overview presents the technical state of the art in the use of ECoG in controlling prostheses. Technical limitations of the current approach and future directions are also presented.


International Conference of the IEEE Engineering in Medicine and Biology Society | 2012

Electrocorticographic decoding of ipsilateral reach in the setting of contralateral arm weakness from a cortical lesion

Guy Hotson; Matthew S. Fifer; Soumyadipta Acharya; William S. Anderson; Nitish V. Thakor; Nathan E. Crone

Brain machine interfaces have the potential for restoring motor function not only in patients with amputations or lesions of efferent pathways in the spinal cord and peripheral nerves, but also patients with acquired brain lesions such as strokes and tumors. In these patients the most efficient components of cortical motor systems are not available for BMI control. Here we had the opportunity to investigate the possibility of utilizing subdural electrocorticographic (ECoG) signals to control natural reaching movements under these circumstances. In a subject with a left arm monoparesis following resection of a recurrent glioma, we found that ECoG signals recorded in remaining cortex were sufficient for decoding kinematics of natural reach movements of the nonparetic arm, ipsilateral to the ECoG recordings. The relationship between the subject's ECoG signals and reach trajectory in three dimensions, two of which were highly correlated, was captured with a computationally simple linear model (mean Pearson's r in depth dimension = 0.68, in height = 0.73, in lateral = 0.24). These results were attained with only a small subset of 7 temporal/spectral neural signal features. The small subset of neural features necessary to attain high decoding results shows promise for a restorative BMI controlled solely by ipsilateral ECoG signals.

Collaboration


Dive into Guy Hotson's collaboration.

Top Co-Authors

Nitish V. Thakor

National University of Singapore