Network


Latest external collaborations at the country level. Dive into details by clicking on the dots.

Hotspot


Dive into the research topics where Matthew S. Johannes is active.

Publication


Featured research published by Matthew S. Johannes.


IEEE Transactions on Neural Systems and Rehabilitation Engineering | 2013

Behavioral Demonstration of a Somatosensory Neuroprosthesis

J. Berg; John F. Dammann; Francesco Tenore; Gregg A. Tabot; Jessica L Boback; Louise R. Manfredi; M. L. Peterson; Kapil D. Katyal; Matthew S. Johannes; A. Makhlin; R. Wilcox; R. K. Franklin; R.J. Vogelstein; Nicholas G. Hatsopoulos; Sliman J. Bensmaia

Tactile sensation is critical for effective object manipulation, but current prosthetic upper limbs make no provision for delivering somesthetic feedback to the user. For individuals who require use of prosthetic limbs, this lack of feedback transforms a mundane task into one that requires extreme concentration and effort. Although vibrotactile motors and sensory substitution devices can be used to convey gross sensations, a direct neural interface is required to provide detailed and intuitive sensory feedback. In light of this, we describe the implementation of a somatosensory prosthesis with which we elicit, through intracortical microstimulation (ICMS), percepts whose magnitude is graded according to the force exerted on the prosthetic finger. Specifically, the prosthesis consists of a sensorized finger, the force output of which is converted into a regime of ICMS delivered to primary somatosensory cortex through chronically implanted multi-electrode arrays. We show that the performance of animals (Rhesus macaques) on a tactile task is equivalent whether stimuli are delivered to the native finger or to the prosthetic finger.
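
The force-to-stimulation mapping described above can be pictured as a simple transfer function from fingertip force to stimulation amplitude. The following is a minimal sketch of that idea, not the authors' implementation; the force range, amplitude bounds, and function name are hypothetical.

```python
def force_to_icms_amplitude(force_n,
                            force_max_n=2.0,    # hypothetical sensor range (N)
                            amp_min_ua=20.0,    # hypothetical perceptual threshold (uA)
                            amp_max_ua=100.0):  # hypothetical safety ceiling (uA)
    """Grade ICMS current amplitude with the force on the prosthetic finger.

    Zero force produces no stimulation; increasing force scales the amplitude
    linearly between a threshold and a ceiling, so percept magnitude tracks
    contact force.
    """
    if force_n <= 0.0:
        return 0.0
    fraction = min(force_n / force_max_n, 1.0)
    return amp_min_ua + fraction * (amp_max_ua - amp_min_ua)
```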


IEEE Transactions on Neural Systems and Rehabilitation Engineering | 2014

Demonstration of a Semi-Autonomous Hybrid Brain–Machine Interface Using Human Intracranial EEG, Eye Tracking, and Computer Vision to Control a Robotic Upper Limb Prosthetic

David P. McMullen; Guy Hotson; Kapil D. Katyal; Brock A. Wester; Matthew S. Fifer; Timothy G. McGee; Andrew L. Harris; Matthew S. Johannes; R. Jacob Vogelstein; Alan Ravitz; William S. Anderson; Nitish V. Thakor; Nathan E. Crone

To increase the ability of brain-machine interfaces (BMIs) to control advanced prostheses such as the modular prosthetic limb (MPL), we are developing a novel system: the Hybrid Augmented Reality Multimodal Operation Neural Integration Environment (HARMONIE). This system utilizes hybrid input, supervisory control, and intelligent robotics to allow users to identify an object (via eye tracking and computer vision) and initiate (via brain-control) a semi-autonomous reach-grasp-and-drop of the object by the MPL. Sequential iterations of HARMONIE were tested in two pilot subjects implanted with electrocorticographic (ECoG) and depth electrodes within motor areas. The subjects performed the complex task in 71.4% (20/28) and 67.7% (21/31) of trials after minimal training. Balanced accuracy for detecting movements was 91.1% and 92.9%, significantly greater than chance accuracies (p < 0.05). After BMI-based initiation, the MPL completed the entire task 100% (one object) and 70% (three objects) of the time. The MPL took approximately 12.2 s for task completion after system improvements implemented for the second subject. Our hybrid-BMI design prevented all but one baseline false positive from initiating the system. The novel approach demonstrated in this proof-of-principle study, using hybrid input, supervisory control, and intelligent robotics, addresses limitations of current BMIs.
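
The supervisory control flow sketched in the abstract (gaze plus computer vision select the target, a decoded movement intent triggers a semi-autonomous reach-grasp-and-drop) can be summarized as a small state machine. This is an illustrative sketch under assumed interfaces; the state names, the `arm` object, and the intent threshold are not from the HARMONIE implementation.

```python
from enum import Enum, auto

class State(Enum):
    IDLE = auto()             # waiting for the user to fixate an object
    TARGET_SELECTED = auto()  # eye tracking + vision have identified a target
    EXECUTING = auto()        # the limb is running its reach-grasp-and-drop routine

def supervisory_step(state, gaze_target, intent_probability, arm,
                     intent_threshold=0.9):  # hypothetical decoder threshold
    """Advance one tick of a HARMONIE-style supervisory loop (sketch)."""
    if state is State.IDLE and gaze_target is not None:
        return State.TARGET_SELECTED
    if state is State.TARGET_SELECTED and intent_probability >= intent_threshold:
        arm.reach_grasp_drop(gaze_target)    # semi-autonomous routine takes over
        return State.EXECUTING
    if state is State.EXECUTING and not arm.is_busy():
        return State.IDLE
    return state
```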


Journal of Neural Engineering | 2016

Individual finger control of a modular prosthetic limb using high-density electrocorticography in a human subject

Guy Hotson; David P. McMullen; Matthew S. Fifer; Matthew S. Johannes; Kapil D. Katyal; Matthew P. Para; Robert S. Armiger; William S. Anderson; Nitish V. Thakor; Brock A. Wester; Nathan E. Crone

OBJECTIVE We used native sensorimotor representations of fingers in a brain-machine interface (BMI) to achieve immediate online control of individual prosthetic fingers. APPROACH Using high gamma responses recorded with a high-density electrocorticography (ECoG) array, we rapidly mapped the functional anatomy of cued finger movements. We used these cortical maps to select ECoG electrodes for a hierarchical linear discriminant analysis classification scheme to predict: (1) if any finger was moving, and, if so, (2) which digit was moving. To account for sensory feedback, we also mapped the spatiotemporal activation elicited by vibrotactile stimulation. Finally, we used this prediction framework to provide immediate online control over individual fingers of the Johns Hopkins University Applied Physics Laboratory modular prosthetic limb. MAIN RESULTS The balanced classification accuracy for detection of movements during the online control session was 92% (chance: 50%). At the onset of movement, finger classification was 76% (chance: 20%), and 88% (chance: 25%) if the pinky and ring finger movements were coupled. Balanced accuracy of fully flexing the cued finger was 64%, and 77% had we combined pinky and ring commands. Offline decoding yielded a peak finger decoding accuracy of 96.5% (chance: 20%) when using an optimized selection of electrodes. Offline analysis demonstrated significant finger-specific activations throughout sensorimotor cortex. Activations either prior to movement onset or during sensory feedback led to discriminable finger control. SIGNIFICANCE Our results demonstrate the ability of ECoG-based BMIs to leverage the native functional anatomy of sensorimotor cortical populations to immediately control individual finger movements in real time.
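
The two-stage classification scheme (detect whether any finger is moving, then decide which digit) can be sketched with a pair of linear discriminant classifiers. This is a minimal reconstruction using scikit-learn under assumed inputs; electrode selection and high gamma feature extraction are omitted, and finger labels are assumed to be integer digit codes.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

class HierarchicalFingerDecoder:
    """Stage 1: movement vs. rest. Stage 2: which finger moved (sketch)."""

    def __init__(self):
        self.detector = LinearDiscriminantAnalysis()    # any-finger-moving detector
        self.classifier = LinearDiscriminantAnalysis()  # finger identity classifier

    def fit(self, features, is_moving, finger_labels):
        # features: (n_samples, n_features) high gamma power from selected electrodes
        features = np.asarray(features)
        finger_labels = np.asarray(finger_labels)
        moving = np.asarray(is_moving, dtype=bool)
        self.detector.fit(features, moving)
        self.classifier.fit(features[moving], finger_labels[moving])

    def predict(self, features):
        features = np.asarray(features)
        moving = self.detector.predict(features).astype(bool)
        fingers = np.full(len(features), -1)            # -1 means "no movement"
        if moving.any():
            fingers[moving] = self.classifier.predict(features[moving])
        return fingers
```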


IEEE Transactions on Neural Systems and Rehabilitation Engineering | 2014

Simultaneous Neural Control of Simple Reaching and Grasping With the Modular Prosthetic Limb Using Intracranial EEG

Matthew S. Fifer; Guy Hotson; Brock A. Wester; David P. McMullen; Yujing Wang; Matthew S. Johannes; Kapil D. Katyal; John B. Helder; Matthew P. Para; R. Jacob Vogelstein; William S. Anderson; Nitish V. Thakor; Nathan E. Crone

Intracranial electroencephalographic (iEEG) signals from two human subjects were used to achieve simultaneous neural control of reaching and grasping movements with the Johns Hopkins University Applied Physics Lab (JHU/APL) Modular Prosthetic Limb (MPL), a dexterous robotic prosthetic arm. We performed functional mapping of high gamma activity while the subject made reaching and grasping movements to identify task-selective electrodes. Independent, online control of reaching and grasping was then achieved using high gamma activity from a small subset of electrodes with a model trained on short blocks of reaching and grasping with no further adaptation. Classification accuracy did not decline (p < 0.05, one-way ANOVA) over three blocks of testing in either subject. Mean classification accuracies during independently executed overt reach and grasp movements for (Subject 1, Subject 2) were (0.85, 0.81) and (0.80, 0.96), respectively, and during simultaneous execution they were (0.83, 0.88) and (0.58, 0.88), respectively. Our models leveraged knowledge of the subjects' individual functional neuroanatomy for reaching and grasping movements, allowing rapid acquisition of control in a time-sensitive clinical setting. We demonstrate the potential feasibility of verifying functionally meaningful iEEG-based control of the MPL prior to chronic implantation, during which additional capabilities of the MPL might be exploited with further training.
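
Independent control of reaching and grasping amounts to running two binary decoders in parallel, each on its own task-selective electrodes. The snippet below is an illustrative sketch, not the model used in the study; the electrode indices and the choice of logistic regression are assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

REACH_CHANNELS = [3, 7, 12]    # hypothetical reach-selective electrodes
GRASP_CHANNELS = [21, 22, 30]  # hypothetical grasp-selective electrodes

reach_decoder = LogisticRegression()
grasp_decoder = LogisticRegression()

def train(high_gamma, reach_labels, grasp_labels):
    """high_gamma: (n_samples, n_channels) high gamma power matrix."""
    high_gamma = np.asarray(high_gamma)
    reach_decoder.fit(high_gamma[:, REACH_CHANNELS], reach_labels)
    grasp_decoder.fit(high_gamma[:, GRASP_CHANNELS], grasp_labels)

def decode(sample):
    """Return (reach_on, grasp_on) for one (n_channels,) sample."""
    sample = np.asarray(sample)
    reach_on = reach_decoder.predict(sample[REACH_CHANNELS].reshape(1, -1))[0]
    grasp_on = grasp_decoder.predict(sample[GRASP_CHANNELS].reshape(1, -1))[0]
    return bool(reach_on), bool(grasp_on)
```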


Clinical and Translational Science | 2014

Collaborative approach in the development of high-performance brain-computer interfaces for a neuroprosthetic arm: Translation from animal models to human control

Jennifer L. Collinger; Michael Kryger; Richard Barbara; Timothy Betler; Kristen Bowsher; Elke H.P. Brown; Samuel T. Clanton; Alan D. Degenhart; Stephen T. Foldes; Robert A. Gaunt; Ferenc Gyulai; Elizabeth A. Harchick; Deborah L. Harrington; John B. Helder; Timothy Hemmes; Matthew S. Johannes; Kapil D. Katyal; Geoffrey S. F. Ling; Angus J. C. McMorland; Karina Palko; Matthew P. Para; Janet Scheuermann; Andrew B. Schwartz; Elizabeth R. Skidmore; Florian Solzbacher; Anita V. Srikameswaran; Dennis P. Swanson; Scott Swetz; Elizabeth C. Tyler-Kabara; Meel Velliste

Our research group recently demonstrated that a person with tetraplegia could use a brain–computer interface (BCI) to control a sophisticated anthropomorphic robotic arm with skill and speed approaching that of an able‐bodied person. This multiyear study exemplifies important principles in translating research from foundational theory and animal experiments into a clinical study. We present a roadmap that may serve as an example for other areas of clinical device research as well as an update on study results. Prior to conducting a multiyear clinical trial, years of animal research preceded BCI testing in an epilepsy monitoring unit, and then in a short‐term (28 days) clinical investigation. Scientists and engineers developed the necessary robotic and surgical hardware, software environment, data analysis techniques, and training paradigms. Coordination among researchers, funding institutes, and regulatory bodies ensured that the study would provide valuable scientific information in a safe environment for the study participant. Finally, clinicians from neurosurgery, anesthesiology, physiatry, psychology, and occupational therapy all worked in a multidisciplinary team along with the other researchers to conduct a multiyear BCI clinical study. This teamwork and coordination can be used as a model for others attempting to translate basic science into real‐world clinical situations.


IEEE International Conference on Systems, Man, and Cybernetics | 2014

A collaborative BCI approach to autonomous control of a prosthetic limb system

Kapil D. Katyal; Matthew S. Johannes; Spencer Kellis; Tyson Aflalo; Christian Klaes; Timothy G. McGee; Matthew P. Para; Ying Shi; Brian Lee; Kelsie Pejsa; Charles Y. Liu; Brock A. Wester; Francesco Tenore; James D. Beaty; Alan D. Ravitz; Richard A. Andersen; Michael P. McLoughlin

Existing brain-computer interface (BCI) control of highly dexterous robotic manipulators and prosthetic devices typically relies solely on neural decode algorithms to determine the user's intended motion. Although these approaches have made significant progress in the ability to control high degree of freedom (DOF) manipulators, the ability to perform activities of daily living (ADL) is still an ongoing research endeavor. In this paper, we describe a hybrid system that combines elements of autonomous robotic manipulation with neural decode algorithms to maneuver a highly dexterous robotic manipulator for a reach and grasp task. This system was demonstrated using a human patient with cortical micro-electrode arrays, allowing the user to manipulate an object on a table and place it at a desired location. The preliminary results for this system are promising in that it demonstrates the potential to blend robotic control to perform lower level manipulation tasks with neural control that allows the user to focus on higher level tasks, thereby reducing the cognitive load and increasing the success rate of performing ADL type activities.
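
One common way to blend autonomous manipulation with neural input is to mix an autonomous planner's endpoint command with the decoded user command, weighting toward autonomy for low-level motion. The sketch below is a generic shared-control illustration under assumed interfaces, not the system described in the paper.

```python
def blended_velocity(decoded_velocity, planner_velocity, autonomy_weight=0.7):
    """Blend decoded user intent with an autonomous planner's command.

    autonomy_weight is a hypothetical tuning knob: 1.0 is fully autonomous,
    0.0 is fully neural-driven. Both inputs are (vx, vy, vz) endpoint velocities.
    """
    return tuple(autonomy_weight * p + (1.0 - autonomy_weight) * d
                 for p, d in zip(planner_velocity, decoded_velocity))
```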


IEEE/RSJ International Conference on Intelligent Robots and Systems | 2014

Approaches to robotic teleoperation in a disaster scenario: From supervised autonomy to direct control

Kapil D. Katyal; Christopher Y. Brown; Steven A. Hechtman; Matthew P. Para; Timothy G. McGee; Kevin C. Wolfe; Ryan J. Murphy; Michael D. M. Kutzer; Edward Tunstel; Michael P. McLoughlin; Matthew S. Johannes

The ability of robotic systems to effectively address disaster scenarios that are potentially dangerous for human operators is continuing to grow as a research and development field. This leverages research from areas such as bimanual manipulation, dexterous grasping, bipedal locomotion, computer vision, sensing, object segmentation, varying degrees of autonomy, and operator control/feedback. This paper describes the development of a semi-autonomous bimanual dexterous robotic system that comes to the aid of a mannequin simulating an injured victim by operating a fire extinguisher, affixing a cervical collar, cooperatively placing the victim on a spineboard with another bimanual robot, and relocating the victim. This system accomplishes these tasks through a series of control modalities that range from supervised autonomy to full teleoperation and allows the control model to be chosen and optimized for a specific subtask. We present a description of the hardware platform, the software control architecture, a human-in-the-loop computer vision algorithm, and an infrastructure to use a variety of user input devices in combination with autonomous control to complete several dexterous tasks. The effectiveness of the system was demonstrated in both laboratory and live outdoor demonstrations.
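
The idea of choosing a control modality per subtask, from supervised autonomy down to direct teleoperation, can be sketched as a simple lookup. The mode assignments below are illustrative assumptions, not the configuration reported in the paper.

```python
from enum import Enum

class ControlMode(Enum):
    SUPERVISED_AUTONOMY = "supervised autonomy"    # operator approves autonomous subtasks
    SHARED_CONTROL = "shared control"              # operator input blended with autonomy
    DIRECT_TELEOPERATION = "direct teleoperation"  # operator drives the manipulator

# Hypothetical subtask-to-mode table for the disaster-response scenario.
SUBTASK_MODES = {
    "operate_fire_extinguisher": ControlMode.SUPERVISED_AUTONOMY,
    "affix_cervical_collar": ControlMode.DIRECT_TELEOPERATION,
    "place_victim_on_spineboard": ControlMode.SHARED_CONTROL,
}

def mode_for(subtask):
    """Pick the control modality for a subtask, defaulting to direct control."""
    return SUBTASK_MODES.get(subtask, ControlMode.DIRECT_TELEOPERATION)
```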


International IEEE/EMBS Conference on Neural Engineering | 2013

HARMONIE: A multimodal control framework for human assistive robotics

Kapil D. Katyal; Matthew S. Johannes; Timothy G. McGee; Andrew J. Harris; Robert S. Armiger; Alex H. Firpi; David P. McMullen; Guy Hotson; Matthew S. Fifer; Nathan E. Crone; R. Jacob Vogelstein; Brock A. Wester

Effective user control of highly dexterous and robotic assistive devices requires intuitive and natural modalities. Although surgically implanted brain-computer interfaces (BCIs) strive to achieve this, a number of non-invasive engineering solutions may provide a quicker path to patient use by eliminating surgical implantation. We present the development of a semi-autonomous control system that utilizes computer vision, prosthesis feedback, effector-centric device control, smooth movement trajectories, and appropriate hand conformations to interact with objects of interest. Users can direct a prosthetic limb through an intuitive graphical user interface to complete multi-stage tasks using patient-appropriate combinations of control inputs such as eye tracking, conventional prosthetic controls/joysticks, surface electromyography (sEMG) signals, and neural interfaces (ECoG, EEG). Aligned with activities of daily living (ADL), these tasks include directing the prosthetic to specific locations or objects, grasping of objects by modulating hand conformation, and action upon grasped objects such as self-feeding. This Hybrid Augmented Reality Multimodal Operation Neural Integration Environment (HARMONIE) semi-autonomous control system lowers the user's cognitive load, leaving the bulk of command and control of the device to the computer. This flexible and intuitive control system could serve patient populations ranging from wheelchair-bound quadriplegics to upper-limb amputees.
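
Selecting a patient-appropriate input modality can be pictured as a priority-ordered fallback over whichever control sources are currently available. The modality names and priority order below are hypothetical, not HARMONIE's configuration.

```python
def select_command(available_inputs,
                   priority=("ecog", "semg", "eye_tracking", "joystick")):
    """available_inputs maps modality name -> decoded command, or None if absent.

    Returns the first usable (modality, command) pair in priority order.
    """
    for modality in priority:
        command = available_inputs.get(modality)
        if command is not None:
            return modality, command
    return None, None
```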


systems, man and cybernetics | 2011

Development of a multi-modal haptic feedback system for dexterous robotic telemanipulation

Jenna L. Graham; Steven G. Manuel; Matthew S. Johannes; Robert S. Armiger

Robotic telemanipulation can be used to project human capabilities and accomplish tasks that may not otherwise be practical or safe for direct human involvement. Examples include work in inhospitable environments, the completion of dangerous tasks where the risk to human life is to be avoided such as explosive disablement, remote task completion when personnel transportation is not practical or desired, and completion of tasks that may be accomplished more successfully by human-operated robots than by humans alone. These tasks would benefit from highly dexterous robotic telemanipulation capabilities with intuitive control and haptic feedback for the operator. A novel haptic feedback system that incorporates kinesthetic, vibratory, and tactile feedback has been proposed and is in development for use with a highly dexterous robotic platform to achieve human capabilities projection. This paper presents a novel wearable fingertip tactile feedback device that integrates with a commercially available haptic feedback system to provide multi-modal feedback for teleoperation of a highly dexterous anthropomorphic robotic manipulator.
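
The multi-modal mapping from remote fingertip sensing to operator feedback can be sketched as two normalized drive levels, one kinesthetic and one vibrotactile. The sensor ranges and the decision to drive vibration from slip speed are assumptions for illustration, not the proposed system's design.

```python
def feedback_levels(contact_force_n, slip_speed_mm_s,
                    force_max_n=10.0, slip_max_mm_s=50.0):  # hypothetical ranges
    """Return drive levels in [0, 1] for the kinesthetic and vibrotactile channels."""
    def clamp(x):
        return min(max(x, 0.0), 1.0)
    return {
        "kinesthetic": clamp(contact_force_n / force_max_n),     # grip-force feedback
        "vibrotactile": clamp(slip_speed_mm_s / slip_max_mm_s),  # fingertip vibration
    }
```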


Frontiers in Bioengineering and Biotechnology | 2013

Development of a Human Cranial Bone Surrogate for Impact Studies

Jack C. Roberts; Andrew C. Merkle; Catherine M. Carneal; Liming M. Voo; Matthew S. Johannes; Jeff M. Paulson; Sara Tankard; O. Manny Uy

In order to replicate the fracture behavior of the intact human skull under impact, it becomes necessary to develop a material having the mechanical properties of cranial bone. The most important properties to replicate in a surrogate human skull were found to be the fracture toughness and tensile strength of the cranial tables as well as the bending strength of the three-layer (inner table-diploë-outer table) architecture of the human skull. The materials selected to represent the surrogate cranial tables consisted of two different epoxy resin systems with random milled glass fiber to enhance the strength and stiffness, and the materials to represent the surrogate diploë consisted of three low-density foams. Forty-one three-point bending fracture toughness tests were performed on nine material combinations. The materials that best represented the fracture toughness of cranial tables were then selected and formed into tensile samples and tested. These materials were then used with the two surrogate diploë foam materials to create the three-layer surrogate cranial bone samples for three-point bending tests. Drop tower tests were performed on flat samples created from these materials and the fracture patterns were very similar to the linear fractures in pendulum impacts of intact human skulls previously reported in the literature. The surrogate cranial tables had a quasi-static fracture toughness and tensile strength of 2.5 MPa√m and 53 ± 4.9 MPa, respectively, while the same properties of human compact bone were 3.1 ± 1.8 MPa√m and 68 ± 18 MPa, respectively. The cranial surrogate had a quasi-static bending strength of 68 ± 5.7 MPa, while that of cranial bone was 82 ± 26 MPa. This material/design is currently being used to construct spherical shell samples for drop tower and ballistic tests.
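
The bending strengths quoted above come from three-point bend tests; for a rectangular beam the flexural strength follows the standard formula σ = 3FL / (2bd²). The specimen dimensions and load in the example below are made up for illustration, not measurements from the study.

```python
def flexural_strength_mpa(peak_load_n, span_mm, width_mm, thickness_mm):
    """Three-point bending strength of a rectangular beam: sigma = 3FL / (2bd^2)."""
    return 3.0 * peak_load_n * span_mm / (2.0 * width_mm * thickness_mm ** 2)

# Hypothetical specimen: 60 mm span, 10 mm x 4 mm cross-section, failing at 290 N.
print(flexural_strength_mpa(290.0, 60.0, 10.0, 4.0))  # ~163 MPa
```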

Collaboration


Dive into Matthew S. Johannes's collaborations.

Top Co-Authors

Guy Hotson

Johns Hopkins University
