
Publication


Featured research published by Matthew P. Para.


Journal of Neural Engineering | 2016

Individual finger control of a modular prosthetic limb using high-density electrocorticography in a human subject

Guy Hotson; David P. McMullen; Matthew S. Fifer; Matthew S. Johannes; Kapil D. Katyal; Matthew P. Para; Robert S. Armiger; William S. Anderson; Nitish V. Thakor; Brock A. Wester; Nathan E. Crone

OBJECTIVE: We used native sensorimotor representations of fingers in a brain-machine interface (BMI) to achieve immediate online control of individual prosthetic fingers.

APPROACH: Using high gamma responses recorded with a high-density electrocorticography (ECoG) array, we rapidly mapped the functional anatomy of cued finger movements. We used these cortical maps to select ECoG electrodes for a hierarchical linear discriminant analysis classification scheme to predict: (1) whether any finger was moving, and, if so, (2) which digit was moving. To account for sensory feedback, we also mapped the spatiotemporal activation elicited by vibrotactile stimulation. Finally, we used this prediction framework to provide immediate online control over individual fingers of the Johns Hopkins University Applied Physics Laboratory Modular Prosthetic Limb.

MAIN RESULTS: Balanced classification accuracy for detection of movements during the online control session was 92% (chance: 50%). At the onset of movement, finger classification accuracy was 76% (chance: 20%), rising to 88% (chance: 25%) when pinky and ring finger movements were coupled. Balanced accuracy for fully flexing the cued finger was 64%, or 77% with pinky and ring commands combined. Offline decoding yielded a peak finger decoding accuracy of 96.5% (chance: 20%) using an optimized selection of electrodes. Offline analysis demonstrated significant finger-specific activations throughout sensorimotor cortex. Activations either prior to movement onset or during sensory feedback led to discriminable finger control.

SIGNIFICANCE: Our results demonstrate the ability of ECoG-based BMIs to leverage the native functional anatomy of sensorimotor cortical populations to immediately control individual finger movements in real time.
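
The hierarchical classification scheme described above can be sketched as a two-stage decoder: stage one detects whether any finger is moving, and stage two classifies which digit. The nearest-centroid classifier, thresholds, and feature values below are illustrative stand-ins for the paper's LDA models and high-gamma features, not the actual implementation.

```python
# Two-stage hierarchical decoder sketch: stage 1 detects movement vs. rest,
# stage 2 classifies which digit is moving. Nearest-centroid stands in for
# the LDA classifiers used in the paper; all numbers are illustrative.

def detect_movement(high_gamma_power, threshold=0.5):
    """Stage 1: movement vs. rest from mean high-gamma power."""
    return sum(high_gamma_power) / len(high_gamma_power) > threshold

def classify_digit(features, centroids):
    """Stage 2: assign the feature vector to the nearest digit centroid."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda digit: dist(features, centroids[digit]))

def decode(features, centroids, threshold=0.5):
    if not detect_movement(features, threshold):
        return "rest"
    return classify_digit(features, centroids)

# Hypothetical per-digit centroids in a 3-electrode feature space.
centroids = {
    "thumb": (2.0, 0.2, 0.1),
    "index": (0.2, 2.0, 0.1),
    "pinky": (0.1, 0.2, 2.0),
}
print(decode((0.1, 0.1, 0.1), centroids))   # low power -> "rest"
print(decode((1.9, 0.3, 0.2), centroids))   # -> "thumb"
```

The two-stage split mirrors the abstract's design: the rest/move gate keeps spurious digit predictions from driving the prosthesis when the hand is idle.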


IEEE Transactions on Neural Systems and Rehabilitation Engineering | 2014

Simultaneous Neural Control of Simple Reaching and Grasping With the Modular Prosthetic Limb Using Intracranial EEG

Matthew S. Fifer; Guy Hotson; Brock A. Wester; David P. McMullen; Yujing Wang; Matthew S. Johannes; Kapil D. Katyal; John B. Helder; Matthew P. Para; R. Jacob Vogelstein; William S. Anderson; Nitish V. Thakor; Nathan E. Crone

Intracranial electroencephalographic (iEEG) signals from two human subjects were used to achieve simultaneous neural control of reaching and grasping movements with the Johns Hopkins University Applied Physics Lab (JHU/APL) Modular Prosthetic Limb (MPL), a dexterous robotic prosthetic arm. We performed functional mapping of high gamma activity while the subject made reaching and grasping movements to identify task-selective electrodes. Independent, online control of reaching and grasping was then achieved using high gamma activity from a small subset of electrodes, with a model trained on short blocks of reaching and grasping and no further adaptation. Classification accuracy did not decline (p < 0.05, one-way ANOVA) over three blocks of testing in either subject. Mean classification accuracies during independently executed overt reach and grasp movements for (Subject 1, Subject 2) were (0.85, 0.81) and (0.80, 0.96), respectively, and during simultaneous execution they were (0.83, 0.88) and (0.58, 0.88), respectively. Our models leveraged knowledge of the subjects' individual functional neuroanatomy for reaching and grasping movements, allowing rapid acquisition of control in a time-sensitive clinical setting. We demonstrate the potential feasibility of verifying functionally meaningful iEEG-based control of the MPL prior to chronic implantation, during which additional capabilities of the MPL might be exploited with further training.
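
The balanced accuracies reported in these abstracts average recall across classes, which is why the chance level is 1/n for n classes regardless of how imbalanced the trial counts are. A minimal sketch (the labels below are made up for illustration):

```python
# Balanced accuracy: the mean of per-class recalls, so chance level is
# 1/n_classes regardless of class imbalance in the trial data.

def balanced_accuracy(y_true, y_pred):
    classes = sorted(set(y_true))
    recalls = []
    for c in classes:
        idx = [i for i, y in enumerate(y_true) if y == c]
        correct = sum(1 for i in idx if y_pred[i] == c)
        recalls.append(correct / len(idx))
    return sum(recalls) / len(recalls)

# Imbalanced rest/move data: always predicting "rest" scores 90% raw
# accuracy but only 50% balanced accuracy (chance for two classes).
y_true = ["rest"] * 9 + ["move"]
y_pred = ["rest"] * 10
print(balanced_accuracy(y_true, y_pred))  # 0.5
```

This is why a movement detector that is right 92% of the time against a 50% chance level is a meaningful result even if rest trials dominate the session.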


Clinical and Translational Science | 2014

Collaborative approach in the development of high-performance brain-computer interfaces for a neuroprosthetic arm: Translation from animal models to human control

Jennifer L. Collinger; Michael Kryger; Richard Barbara; Timothy Betler; Kristen Bowsher; Elke H.P. Brown; Samuel T. Clanton; Alan D. Degenhart; Stephen T. Foldes; Robert A. Gaunt; Ferenc Gyulai; Elizabeth A. Harchick; Deborah L. Harrington; John B. Helder; Timothy Hemmes; Matthew S. Johannes; Kapil D. Katyal; Geoffrey S. F. Ling; Angus J. C. McMorland; Karina Palko; Matthew P. Para; Janet Scheuermann; Andrew B. Schwartz; Elizabeth R. Skidmore; Florian Solzbacher; Anita V. Srikameswaran; Dennis P. Swanson; Scott Swetz; Elizabeth C. Tyler-Kabara; Meel Velliste

Our research group recently demonstrated that a person with tetraplegia could use a brain–computer interface (BCI) to control a sophisticated anthropomorphic robotic arm with skill and speed approaching that of an able‐bodied person. This multiyear study exemplifies important principles in translating research from foundational theory and animal experiments into a clinical study. We present a roadmap that may serve as an example for other areas of clinical device research as well as an update on study results. Prior to conducting a multiyear clinical trial, years of animal research preceded BCI testing in an epilepsy monitoring unit, and then in a short‐term (28 days) clinical investigation. Scientists and engineers developed the necessary robotic and surgical hardware, software environment, data analysis techniques, and training paradigms. Coordination among researchers, funding institutes, and regulatory bodies ensured that the study would provide valuable scientific information in a safe environment for the study participant. Finally, clinicians from neurosurgery, anesthesiology, physiatry, psychology, and occupational therapy all worked in a multidisciplinary team along with the other researchers to conduct a multiyear BCI clinical study. This teamwork and coordination can be used as a model for others attempting to translate basic science into real‐world clinical situations.


Systems, Man and Cybernetics | 2011

Revolutionizing Prosthetics software technology

Andrew Harris; Kapil D. Katyal; Matthew P. Para; Justin Thomas

The Defense Advanced Research Projects Agency Revolutionizing Prosthetics 2009 program has the goal of developing a neurally-integrated upper limb prosthesis to provide amputee soldiers with pre-injury levels of function. A team of over 30 collaborating institutions from industry and academia set out to design and construct a prosthetic limb that achieves natural human limb performance by combining the state of the art in neuroscience, robotics, sensors, power systems, actuation, and complex embedded software. The Modular Prosthetic Limb (MPL) is the result of Phase II of the program which successfully demonstrated robotic limb operation in December 2009. MPL design and development addressed several areas of technical complexity such as space, power, and mass constraints, safety-critical operation, and modularity for patient customizations based on the location of disarticulation. This paper provides an overview of the MPL system architecture, the robotic arm control strategy, and detailed information on the embedded system software that integrates the MPL components to enable patient-based control.


Systems, Man and Cybernetics | 2014

A collaborative BCI approach to autonomous control of a prosthetic limb system

Kapil D. Katyal; Matthew S. Johannes; Spencer Kellis; Tyson Aflalo; Christian Klaes; Timothy G. McGee; Matthew P. Para; Ying Shi; Brian Lee; Kelsie Pejsa; Charles Y. Liu; Brock A. Wester; Francesco Tenore; James D. Beaty; Alan D. Ravitz; Richard A. Andersen; Michael P. McLoughlin

Existing brain-computer interface (BCI) control of highly dexterous robotic manipulators and prosthetic devices typically relies solely on neural decode algorithms to determine the user's intended motion. Although these approaches have made significant progress toward controlling high degree-of-freedom (DOF) manipulators, performing activities of daily living (ADL) remains an ongoing research endeavor. In this paper, we describe a hybrid system that combines elements of autonomous robotic manipulation with neural decode algorithms to maneuver a highly dexterous robotic manipulator for a reach-and-grasp task. This system was demonstrated with a human patient implanted with cortical micro-electrode arrays, allowing the user to manipulate an object on a table and place it at a desired location. The preliminary results for this system are promising: they demonstrate the potential to blend robotic control of lower-level manipulation tasks with neural control that lets the user focus on higher-level tasks, thereby reducing cognitive load and increasing the success rate of performing ADL-type activities.
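
The hybrid approach described here can be sketched as a weighted blend of two velocity commands, one decoded from neural activity and one from an autonomous controller driving toward a detected target. The proportional controller and the blending weight `alpha` are hypothetical simplifications, not the paper's actual arbitration scheme.

```python
# Shared-control sketch: blend a decoded neural velocity command with an
# autonomous controller's command toward a grasp target. All gains,
# positions, and the weight alpha are illustrative.

def autonomous_command(position, target, gain=1.0):
    """Simple proportional controller driving the hand toward the target."""
    return tuple(gain * (t - p) for p, t in zip(position, target))

def blend(neural_cmd, auto_cmd, alpha):
    """alpha=1: pure neural control; alpha=0: fully autonomous."""
    return tuple(alpha * n + (1 - alpha) * a
                 for n, a in zip(neural_cmd, auto_cmd))

position = (0.0, 0.0, 0.0)
target = (1.0, 0.0, 0.5)
neural = (0.6, 0.4, 0.0)   # decoded user intent (hypothetical)
cmd = blend(neural, autonomous_command(position, target), alpha=0.3)
print(cmd)
```

Shifting `alpha` toward zero is one way to express the abstract's idea of letting autonomy handle low-level manipulation while the user supplies high-level intent.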


Intelligent Robots and Systems | 2014

Approaches to robotic teleoperation in a disaster scenario: From supervised autonomy to direct control

Kapil D. Katyal; Christopher Y. Brown; Steven A. Hechtman; Matthew P. Para; Timothy G. McGee; Kevin C. Wolfe; Ryan J. Murphy; Michael D. M. Kutzer; Edward Tunstel; Michael P. McLoughlin; Matthew S. Johannes

The ability of robotic systems to effectively address disaster scenarios that are potentially dangerous for human operators is continuing to grow as a research and development field. This leverages research from areas such as bimanual manipulation, dexterous grasping, bipedal locomotion, computer vision, sensing, object segmentation, varying degrees of autonomy, and operator control/feedback. This paper describes the development of a semi-autonomous bimanual dexterous robotic system that comes to the aid of a mannequin simulating an injured victim by operating a fire extinguisher, affixing a cervical collar, cooperatively placing the victim on a spineboard with another bimanual robot, and relocating the victim. This system accomplishes these tasks through a series of control modalities that range from supervised autonomy to full teleoperation and allows the control model to be chosen and optimized for a specific subtask. We present a description of the hardware platform, the software control architecture, a human-in-the-loop computer vision algorithm, and an infrastructure to use a variety of user input devices in combination with autonomous control to complete several dexterous tasks. The effectiveness of the system was demonstrated in both laboratory and live outdoor demonstrations.


Archive | 2017

Brain-Machine Interface Development for Finger Movement Control

Tessy M. Lal; Guy Hotson; Matthew S. Fifer; David P. McMullen; Matthew S. Johannes; Kapil D. Katyal; Matthew P. Para; Robert S. Armiger; William S. Anderson; Nitish V. Thakor; Brock A. Wester; Nathan E. Crone

There have been many developments in brain-machine interfaces (BMI) for controlling upper limb movements such as reaching and grasping. One way to expand the usefulness of BMIs in replacing motor functions for patients with spinal cord injuries and neuromuscular disorders would be to improve the dexterity of upper limb movements by adding control of individual finger movements. Many studies have focused on understanding the organization of movement control in the sensorimotor cortex of the human brain. Finding the specific mechanisms of neural control for different movements will help focus signal acquisition and processing so as to improve BMI control of complex actions. In a recently published study, we demonstrated, for the first time, online BMI control of individual finger movements using electrocorticography recordings from the hand area of sensorimotor cortex. This study expands the possibilities for combined control of arm movements and more dexterous hand and finger movements.


Systems, Man and Cybernetics | 2016

Nested marsupial robotic system for search and sampling in increasingly constrained environments

Joseph O. Moore; Kevin C. Wolfe; Matthew S. Johannes; Kapil D. Katyal; Matthew P. Para; Ryan J. Murphy; Jessica Hatch; Colin J. Taylor; Robert J. Bamberger; Edward Tunstel

This paper presents a nested marsupial robotic system and its execution of a notional disaster response task. Human supervised autonomy is facilitated by tightly-coupled, high-level user feedback enabling command and control of a bimanual mobile manipulator carrying a quadrotor unmanned aerial vehicle that carries a miniature ground robot. Each robot performs a portion of a mock hazardous chemical spill investigation and sampling task within a shipping container. This work offers an example application for a heterogeneous team of robots that could directly support first responder activities using complementary capabilities of autonomous dexterous manipulation and mobility, autonomous planning and control, and teleoperation. The task was successfully executed during multiple live trials at the DARPA Robotics Challenge Technology Expo in June 2015. A key contribution of the work is the application of a unified algorithmic approach to autonomous planning, control, and estimation supporting vision-based manipulation and non-GPS-based ground and aerial mobility, thus reducing algorithmic complexity across this capability set. The unified algorithmic approach is described along with the robot capabilities, hardware implementations, and human interface, followed by discussion of live demonstration execution and results.


Systems, Man and Cybernetics | 2014

Demonstration of force feedback control on the Modular Prosthetic Limb

Timothy G. McGee; Matthew P. Para; Kapil D. Katyal; Matthew S. Johannes

The Johns Hopkins University Applied Physics Laboratory (APL) led the development of the Modular Prosthetic Limb (MPL), an advanced upper-extremity prosthetic limb, under the DARPA Revolutionizing Prosthetics program. In addition to the MPL's use as an advanced prosthetic, APL is also exploring more advanced autonomy, including closed-loop force feedback control with the MPL. This short paper provides an overview of the MPL system and summarizes two demonstrations performed at APL using fingertip force sensors to perform autonomous writing and fingertip grasping.
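
Closed-loop force feedback of the kind summarized above can be sketched as a proportional controller on the force error. The gain, the first-order contact model, and the target force are illustrative assumptions, not the MPL's actual control law.

```python
# Closed-loop fingertip force control sketch: a proportional controller
# adjusts finger flexion to hold a target contact force. The contact
# dynamics (force grows with commanded flexion) are a toy model.

def force_control_step(measured_force, target_force, kp=0.5):
    """Return a flexion command proportional to the force error."""
    return kp * (target_force - measured_force)

# Simulate settling onto a 2.0 N target grip force.
force, target = 0.0, 2.0
for _ in range(50):
    force += force_control_step(force, target, kp=0.5)
print(round(force, 3))  # converges to the 2.0 N target
```

The same error-driven loop is what allows tasks like writing, where the fingertip must hold a steady contact force while the arm moves.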


Intelligent Robots and Systems | 2013

Experimental validation of imposed safety regions for neural controlled human patient self-feeding using the modular prosthetic limb

Brock A. Wester; Matthew P. Para; Ashok Sivakumar; Michael D. M. Kutzer; Kapil D. Katyal; Alan Ravitz; James D. Beaty; Michael P. McLoughlin; Matthew S. Johannes

This paper presents the experimental validation of software-based safety features implemented during control of a prosthetic limb in self-feeding tasks with a human patient. To ensure safe operation during patient-controlled movements of the limb, velocity-based virtual fixtures are constructed with respect to the patient's location and orientation relative to the limb. These imposed virtual fixtures, or safety zones, modulate the allowable movement direction and speed of the limb to ensure patient safety during commanded limb trajectories directed toward the patient's body or environmental obstacles. In this implementation, the Modular Prosthetic Limb (MPL) is controlled by a quadriplegic patient using implanted intracortical electrodes. These virtual fixtures leverage existing sensors internal to the MPL and operate in conjunction with the existing limb control. Validation of the virtual fixtures was conducted by executing a recorded set of limb control inputs while collecting both direct feedback from the limb sensors and ground-truth measurements of the limb configuration using a Vicon tracking system. Analysis of the collected data indicates that the system performed within the limitations prescribed by the imposed virtual fixtures. This successful implementation and validation enabled the approved clinical use of the MPL system for a neurally controlled self-feeding task.
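
A velocity-based virtual fixture of this kind can be sketched as attenuating the component of a commanded velocity that points into a safety zone while passing motion away from it unchanged. The planar zone, normal vector, and scaling below are illustrative assumptions, not the validated MPL implementation.

```python
# Velocity-based virtual fixture sketch: scale down (or fully block) the
# component of a commanded end-effector velocity that points into a safety
# zone, modeled here as a plane with outward unit normal `normal`.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def apply_fixture(velocity, normal, scale=0.0):
    """Attenuate motion into the zone; motion away passes unchanged."""
    into = dot(velocity, normal)
    if into >= 0:          # moving away from (or along) the boundary
        return velocity
    # remove the inward component, keeping only `scale` fraction of it
    return tuple(v - (1 - scale) * into * n
                 for v, n in zip(velocity, normal))

# Hypothetical plane in front of the patient with outward normal +x.
n = (1.0, 0.0, 0.0)
print(apply_fixture((-0.4, 0.2, 0.0), n))   # inward x blocked
print(apply_fixture((0.3, 0.2, 0.0), n))    # outward motion unchanged
```

With `scale=0` the fixture is a hard stop at the boundary; intermediate values slow the limb near the patient instead of halting it, matching the abstract's description of modulating both direction and speed.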

Collaboration


Dive into Matthew P. Para's collaborations.

Top Co-Authors

Nitish V. Thakor
National University of Singapore

Guy Hotson
Johns Hopkins University

Kevin C. Wolfe
Johns Hopkins University