Michael P. McLoughlin
Johns Hopkins University
Publications
Featured research published by Michael P. McLoughlin.
IEEE Transactions on Acoustics, Speech, and Signal Processing | 1987
Gonzalo R. Arce; Michael P. McLoughlin
Median filtering has been used successfully for extracting features from noisy one-dimensional signals; however, extending the one-dimensional case to higher dimensions has not always yielded satisfactory results. Although noise suppression is obtained, too much signal distortion is introduced and many features of interest are lost. In this paper, we introduce a multidimensional filter based on a combination of one-dimensional median estimates. It is shown that threshold decomposition holds for this class of filters, making the deterministic analysis simpler. Signals invariant to the filter, called root signals, consist of very-low-resolution features, making this filter much more attractive than conventional median filters.
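The abstract describes building a multidimensional filter from one-dimensional median estimates. A minimal illustrative sketch of that idea (not the paper's exact definition): take 1-D medians along the row, column, and both diagonals through each pixel and merge them with a combining rule such as the maximum. The window half-width `k` and the choice of `max` as the combiner are assumptions for illustration.

```python
import numpy as np

def directional_medians(x, i, j, k):
    """Return the four 1-D medians (row, column, two diagonals)
    of the window of half-width k centred at pixel (i, j)."""
    offs = np.arange(-k, k + 1)
    row = np.median(x[i, j + offs])
    col = np.median(x[i + offs, j])
    d1 = np.median(x[i + offs, j + offs])   # main diagonal
    d2 = np.median(x[i + offs, j - offs])   # anti-diagonal
    return row, col, d1, d2

def combined_median_filter(x, k=1, combine=max):
    """Hypothetical multidimensional filter built from 1-D median
    estimates; `combine` (here max) merges the directional medians.
    Border pixels are copied through unchanged."""
    x = np.asarray(x, dtype=float)
    y = x.copy()
    n, m = x.shape
    for i in range(k, n - k):
        for j in range(k, m - k):
            y[i, j] = combine(directional_medians(x, i, j, k))
    return y
```

An isolated impulse is removed (every 1-D window through it has a majority of clean samples), while constant regions pass through untouched, which is the noise-suppression behaviour the abstract attributes to this filter class.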
IEEE Transactions on Acoustics, Speech, and Signal Processing | 1987
Michael P. McLoughlin; Gonzalo R. Arce
The recursive separable median filter has been successfully used to extract features from noisy two-dimensional signals. In many applications, it gives better noise suppression and edge preservation than the standard separable median filter. In this paper we use a new approach for studying the deterministic properties of separable median filters. In particular, using threshold decomposition, we derive the root structure of the recursive separable median filter, where a root is a signal invariant to further filtering. It is shown that these root structures differ from those of their nonrecursive counterparts. We also show that any two-dimensional signal will converge to a root after repeated passes of the recursive separable median filter.
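The recursive separable median filter and its root-convergence property can be sketched in a few lines. In this illustrative version (window half-width `k` and the row-then-column ordering are assumptions), each 1-D output immediately replaces its input sample, so earlier outputs feed later windows; repeated passes are applied until the signal no longer changes, i.e. it has reached a root.

```python
import numpy as np

def recursive_median_1d(x, k=1):
    """1-D recursive median: each output immediately overwrites the
    input sample, so earlier outputs enter later windows."""
    y = np.array(x, dtype=float)
    n = len(y)
    for i in range(k, n - k):
        y[i] = np.median(y[i - k:i + k + 1])
    return y

def recursive_separable_median(img, k=1):
    """Separable form: filter every row, then every column of the result."""
    img = np.asarray(img, dtype=float)
    out = np.apply_along_axis(recursive_median_1d, 1, img, k)
    out = np.apply_along_axis(recursive_median_1d, 0, out, k)
    return out

def filter_to_root(img, k=1, max_passes=50):
    """Repeatedly filter until the signal is a root (invariant to
    further filtering), the convergence behaviour shown in the paper."""
    cur = np.asarray(img, dtype=float)
    for _ in range(max_passes):
        nxt = recursive_separable_median(cur, k)
        if np.array_equal(nxt, cur):
            return nxt
        cur = nxt
    return cur
```

A solid 2x2 block on a flat background is already a root (each row and column is unchanged by the recursive median), while an isolated impulse is erased on the first pass.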
IEEE International Conference on Systems, Man, and Cybernetics | 2014
Kapil D. Katyal; Matthew S. Johannes; Spencer Kellis; Tyson Aflalo; Christian Klaes; Timothy G. McGee; Matthew P. Para; Ying Shi; Brian Lee; Kelsie Pejsa; Charles Y. Liu; Brock A. Wester; Francesco Tenore; James D. Beaty; Alan D. Ravitz; Richard A. Andersen; Michael P. McLoughlin
Existing brain-computer interface (BCI) control of highly dexterous robotic manipulators and prosthetic devices typically relies solely on neural decode algorithms to determine the user's intended motion. Although these approaches have made significant progress toward controlling high-degree-of-freedom (DOF) manipulators, performing activities of daily living (ADL) remains an ongoing research endeavor. In this paper, we describe a hybrid system that combines elements of autonomous robotic manipulation with neural decode algorithms to maneuver a highly dexterous robotic manipulator for a reach-and-grasp task. This system was demonstrated with a human patient implanted with cortical microelectrode arrays, allowing the user to manipulate an object on a table and place it at a desired location. The preliminary results are promising: the system demonstrates the potential to blend robotic control of lower-level manipulation tasks with neural control that lets the user focus on higher-level tasks, thereby reducing cognitive load and increasing the success rate of performing ADL-type activities.
IEEE/RSJ International Conference on Intelligent Robots and Systems | 2014
Kapil D. Katyal; Christopher Y. Brown; Steven A. Hechtman; Matthew P. Para; Timothy G. McGee; Kevin C. Wolfe; Ryan J. Murphy; Michael D. M. Kutzer; Edward Tunstel; Michael P. McLoughlin; Matthew S. Johannes
The ability of robotic systems to effectively address disaster scenarios that are potentially dangerous for human operators continues to grow as a research and development field, leveraging work in areas such as bimanual manipulation, dexterous grasping, bipedal locomotion, computer vision, sensing, object segmentation, varying degrees of autonomy, and operator control/feedback. This paper describes the development of a semi-autonomous bimanual dexterous robotic system that comes to the aid of a mannequin simulating an injured victim by operating a fire extinguisher, affixing a cervical collar, cooperatively placing the victim on a spineboard with another bimanual robot, and relocating the victim. The system accomplishes these tasks through a series of control modalities ranging from supervised autonomy to full teleoperation, allowing the control mode to be chosen and optimized for each subtask. We present a description of the hardware platform, the software control architecture, a human-in-the-loop computer vision algorithm, and an infrastructure for using a variety of user input devices in combination with autonomous control to complete several dexterous tasks. The effectiveness of the system was demonstrated in both laboratory and live outdoor demonstrations.
Experimental Neurology | 2017
Michael Kryger; Brock A. Wester; Eric A. Pohlmeyer; Matthew Rich; Brendan John; James D. Beaty; Michael P. McLoughlin; Michael L. Boninger; Elizabeth C. Tyler-Kabara
As Brain-Computer Interface (BCI) systems advance for uses such as robotic arm control, it is postulated that the control paradigms could apply to other scenarios, such as control of video games, wheelchair movement, or even flight. The purpose of this pilot study was to determine whether our BCI system, which involves decoding the signals of two 96-microelectrode arrays implanted into the motor cortex of a subject, could also be used to control an aircraft in a flight simulator environment. The study involved six sessions in which various parameters were modified in order to achieve the best flight control, including plane type, view, control paradigm, gains, and limits. Successful flight was determined qualitatively by evaluating the subject's ability to perform requested maneuvers, maintain flight paths, and avoid control losses such as dives, spins, and crashes. By the end of the study, it was found that the subject could successfully control an aircraft. The subject could use both the jet and propeller plane with different views, adopting an intuitive control paradigm. From the subject's perspective, this was one of the most exciting and entertaining experiments she had performed in two years of research. In conclusion, this study provides a proof-of-concept that traditional motor cortex signals combined with a decoding paradigm can be used to control systems besides a robotic arm for which the decoder was developed. Aside from possible functional benefits, it also shows the potential for a new recreational activity for individuals with disabilities who are able to master BCI control.
Highlights:
- A Brain-Computer Interface controlled flight simulator is tested in a pilot study.
- The subject successfully controlled aircraft using signals from her motor cortex.
- She learned to control the system with different planes, views, and locations.
- This system could potentially be used for transport control or recreation.
World Neurosurgery | 2014
Brian Lee; Frank J. Attenello; Charles Y. Liu; Michael P. McLoughlin; Michael L.J. Apuzzo
With the loss of function of an upper extremity because of stroke or spinal cord injury, or a physical loss from amputation, an individual's life is forever changed, and activities that were once routine become an order of magnitude more difficult. Much research and effort have been put into developing advanced robotic prostheses to restore upper extremity function. For patients with upper extremity amputations, previously crude prostheses have evolved to become exceptionally functional. Because the upper extremities can perform a wide variety of activities, several types of upper extremity prostheses are available, ranging from passive cosmetic limbs to externally powered robotic limbs. In addition, new developments in brain-machine interface are poised to revolutionize how patients can control these advanced prostheses using their thoughts alone. For patients with spinal cord injury or stroke, functional electrical stimulation promises to provide the most sophisticated prosthetic limbs possible by reanimating the paralyzed arms of these patients. Advances in technology and robotics continue to help patients recover vital function. This article examines the latest neurorestorative technologies for patients who have either undergone amputation or lost the use of their upper extremities secondary to stroke or spinal cord injury.
Proceedings of SPIE | 2017
Eric A. Pohlmeyer; Matthew S. Fifer; Matthew Rich; Johnathan Pino; Brock A. Wester; Matthew S. Johannes; Chris Dohopolski; John B. Helder; Denise D'Angelo; James D. Beaty; Sliman J. Bensmaia; Michael P. McLoughlin; Francesco Tenore
Brain-computer interface (BCI) research has progressed rapidly, with BCIs shifting from animal tests to human demonstrations of controlling computer cursors and even advanced prosthetic limbs, the latter having been the goal of the Revolutionizing Prosthetics (RP) program. These achievements now include direct electrical intracortical microstimulation (ICMS) of the brain to provide human BCI users feedback information from the sensors of prosthetic limbs. These successes raise the question of how well people would be able to use BCIs to interact with systems that are not based directly on the body (e.g., prosthetic arms), and how well BCI users could interpret ICMS information from such devices. If paralyzed individuals could use BCIs to effectively interact with such non-anthropomorphic systems, it would offer them numerous new opportunities to control novel assistive devices. Here we explore how well a participant with tetraplegia can detect infrared (IR) sources in the environment using a prosthetic-arm-mounted camera that encodes IR information via ICMS. We also investigate how well a BCI user could transition from controlling a BCI based on prosthetic arm movements to controlling a flight simulator, a system with different physical dynamics than the arm. In that test, the BCI participant used environmental information encoded via ICMS to identify which of several upcoming flight routes was the best option. For both tasks, the BCI user was able to quickly learn how to interpret the ICMS-provided information to achieve the task goals.
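Encoding a sensor reading as ICMS, as described above, amounts to mapping the reading onto a stimulation parameter. A toy sketch of one such mapping, with all ranges and the linear encoding chosen purely for illustration (the paper's actual encoding scheme and parameter values are not given here):

```python
def ir_to_icms_amplitude(ir_intensity, a_min=10.0, a_max=80.0):
    """Map a normalized IR camera reading in [0, 1] to a stimulation
    amplitude in microamps. Linear encoding; a_min/a_max are
    hypothetical values, and readings are clamped to the valid range."""
    ir = min(max(float(ir_intensity), 0.0), 1.0)
    return a_min + ir * (a_max - a_min)
```

With a monotonic mapping like this, a stronger IR source in the camera's view produces a stronger percept, which is what lets the user rank candidate flight routes by the ICMS feedback alone.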
Archive | 2015
David P. McMullen; Matthew S. Fifer; Brock A. Wester; Guy Hotson; Kapil D. Katyal; Matthew S. Johannes; Timothy G. McGee; Andrew L. Harris; Alan Ravitz; Michael P. McLoughlin; William S. Anderson; Nitish V. Thakor; Nathan E. Crone
Although advanced prosthetic limbs, such as the modular prosthetic limb (MPL), are now capable of mimicking the dexterity of human limbs, brain-machine interfaces (BMIs) are not yet able to take full advantage of their capabilities. To improve BMI control of the MPL, we are developing a semi-autonomous system, the Hybrid Augmented Reality Multimodal Operation Neural Integration Environment (HARMONIE). This system is designed to utilize novel control strategies including hybrid input (adding eye tracking to neural control), supervisory control (decoding high-level patient goals), and intelligent robotics (incorporating computer vision and route planning algorithms). Patients use eye gaze to indicate a desired object that has been recognized by computer vision. They then perform a desired action, such as reaching and grasping, which is decoded and carried out by the MPL via route planning algorithms. Here we present two patients, implanted with electrocorticography (ECoG) and depth electrodes, who controlled the HARMONIE system to perform reach and grasping tasks; in addition, one patient also used the HARMONIE system to simulate self-feeding. This work builds upon prior research to demonstrate the feasibility of using novel control strategies to enable patients to perform a wider variety of activities of daily living (ADLs).
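The HARMONIE division of labor described above (gaze resolves *which* object, the neural decode resolves *what to do*, a planner fills in *how*) can be sketched as a simple pipeline. Everything here is a toy stand-in: the object representation, the gaze-selection rule, and the waypoint "planner" are assumptions for illustration, not the system's actual algorithms.

```python
import math

def nearest_to_gaze(objects, gaze_xy):
    """Select the vision-detected object closest to the user's gaze point."""
    return min(objects,
               key=lambda o: math.hypot(o["x"] - gaze_xy[0], o["y"] - gaze_xy[1]))

def plan_action(target, action):
    """Toy route planner: return a waypoint list for the chosen action."""
    home = (0.0, 0.0, 0.2)  # hypothetical limb rest pose
    if action == "reach_and_grasp":
        # approach above the object, then descend onto it
        return [home, (target["x"], target["y"], 0.1), (target["x"], target["y"], 0.0)]
    raise ValueError(f"unsupported action: {action}")

def harmonie_step(objects, gaze_xy, decoded_action):
    """One cycle of the hybrid pipeline: gaze picks the target object,
    the decoded high-level goal selects the action, planning does the rest."""
    target = nearest_to_gaze(objects, gaze_xy)
    return target, plan_action(target, decoded_action)
```

The point of the supervisory-control structure is visible even in this sketch: the patient supplies only two low-bandwidth signals (a gaze point and a decoded goal), and the autonomy expands them into a full trajectory.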
IEEE/RSJ International Conference on Intelligent Robots and Systems | 2013
Brock A. Wester; Matthew P. Para; Ashok Sivakumar; Michael D. M. Kutzer; Kapil D. Katyal; Alan Ravitz; James D. Beaty; Michael P. McLoughlin; Matthew S. Johannes
This paper presents the experimental validation of software-based safety features implemented during the control of a prosthetic limb in self-feeding tasks with a human patient. To ensure safe operation during patient-controlled movements of the limb, velocity-based virtual fixtures are constructed with respect to the patient's location and orientation relative to the limb. These imposed virtual fixtures, or safety zones, modulate the allowable movement direction and speed of the limb to ensure patient safety during commanded limb trajectories directed toward the patient's body or environmental obstacles. In this implementation, the Modular Prosthetic Limb (MPL) is controlled by a quadriplegic patient using implanted intracortical electrodes. These virtual fixtures leverage existing sensors internal to the MPL and operate in conjunction with the existing limb control. Validation of the virtual fixtures was conducted by executing a recorded set of limb control inputs while collecting both direct feedback from the limb sensors and ground-truth measurements of the limb configuration using a Vicon tracking system. Analysis of the collected data indicates that the system performed within the limitations prescribed by the imposed virtual fixtures. This successful implementation and validation enabled the approved clinical use of the MPL system for a neurally controlled self-feeding task.
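A velocity-based virtual fixture of the kind described above can be illustrated with a small sketch: the commanded end-effector velocity passes through unchanged far from the safety zone, its inward component is attenuated inside a slow-down band, and inward motion is blocked entirely inside the stop radius. The spherical zone geometry and the `r_stop`/`r_slow` parameters are assumptions for illustration, not the MPL system's actual fixture definition.

```python
import numpy as np

def apply_virtual_fixture(cmd_vel, pos, zone_center, r_stop, r_slow):
    """Modulate a commanded velocity near a spherical safety zone:
    full speed outside r_slow, linearly reduced inward speed between
    r_slow and r_stop, and inward motion blocked inside r_stop.
    Motion directed away from the zone is never restricted."""
    cmd_vel = np.asarray(cmd_vel, dtype=float)
    to_center = np.asarray(zone_center, dtype=float) - np.asarray(pos, dtype=float)
    d = np.linalg.norm(to_center)
    if d >= r_slow:
        return cmd_vel                       # far from the zone: unrestricted
    inward = to_center / d if d > 0 else np.zeros(3)
    toward = float(np.dot(cmd_vel, inward))  # speed component toward the zone
    if toward <= 0:
        return cmd_vel                       # moving away: unrestricted
    if d <= r_stop:
        scale = 0.0                          # inside stop zone: block inward motion
    else:
        scale = (d - r_stop) / (r_slow - r_stop)
    return cmd_vel - (1.0 - scale) * toward * inward
```

Because only the inward velocity component is attenuated, the patient retains full control of motion tangent to, or away from, the protected region, which matches the abstract's description of modulating allowable movement direction and speed rather than freezing the limb.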
Proceedings of SPIE | 2017
Clara A. Scholl; Scott M. Hendrickson; Bruce Swett; Michael J. Fitch; Erich C. Walter; Michael P. McLoughlin; Mark A. Chevillet; David W. Blodgett; Grace M. Hwang
The development of portable, non-invasive brain-computer interface technologies with higher spatio-temporal resolution has been motivated by the tremendous success seen with implanted devices. This talk will discuss efforts to overcome several major obstacles to viability, including approaches that promise to improve spatial and temporal resolution. Optical approaches in particular will be highlighted, and the potential benefits of both the Blood-Oxygen-Level-Dependent (BOLD) signal and the Fast Optical Signal (FOS) will be discussed. Early-stage research into the correlations between neural activity and FOS will be explored.