
Publication


Featured research published by Patrick S. Jensen.


International Conference on Robotics and Automation | 2003

A miniature microsurgical instrument tip force sensor for enhanced force feedback during robot-assisted manipulation

Peter J. Berkelman; Louis L. Whitcomb; Russell H. Taylor; Patrick S. Jensen

This paper reports the development of a new miniature force sensor designed to measure contact forces at the tip of a microsurgical instrument in three dimensions, and its application to scaled force feedback using a cooperatively manipulated microsurgical assistant robot. The principal features of the sensor are its small size of 12.5 mm in diameter and 15 mm in height, a novel configuration of flexure beams and strain gauges in order to measure forces isotropically at the instrument tip 40 mm from the sensor body, and sub-mN three-axis force-sensing resolution.
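The flexure-and-strain-gauge design described above reduces, in software, to a linear map from gauge readings to a three-axis tip force. The sketch below illustrates that mapping; the calibration matrix and gauge count are placeholders, not the sensor's actual calibration, which would be obtained by loading the tip with known forces.

```python
# Hypothetical 3x4 calibration matrix: 4 gauge bridges -> (Fx, Fy, Fz) in mN.
# The values below are illustrative placeholders only.
C = [
    [1.2, -1.2, 0.0,  0.0],
    [0.0,  0.0, 1.2, -1.2],
    [0.5,  0.5, 0.5,  0.5],
]

def gauges_to_force(strains):
    """Linear sensor model: F = C @ strains."""
    return [sum(c * s for c, s in zip(row, strains)) for row in C]

fx, fy, fz = gauges_to_force([0.10, -0.10, 0.02, 0.02])
```

In practice the matrix is fitted by least squares from calibration loads, and the differential bridge pairs cancel temperature drift.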


International Conference on Robotics and Automation | 2000

Preliminary experiments in cooperative human/robot force control for robot assisted microsurgical manipulation

Rajesh Kumar; Peter J. Berkelman; Puneet K. Gupta; Aaron Barnes; Patrick S. Jensen; Louis L. Whitcomb; Russell H. Taylor

Reports preliminary experiments with a robot system designed to cooperatively extend a human's ability to perform fine manipulation tasks requiring human judgement, sensory integration, and hand-eye coordination. A completed steady-hand robot is reported. A stable force control law is reviewed. Preliminary experiments validate theoretical predictions of stable one-dimensional control of tool-tip forces in contact with both linearly and nonlinearly compliant objects. Preliminary feasibility experiments demonstrate stable one-dimensional robotic augmentation and force scaling of a human operator's tactile input.
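The one-dimensional force-scaling idea can be sketched as a simple admittance loop: the robot's commanded velocity is proportional to the difference between the handle force and a scaled-up copy of the tip force, so at equilibrium the tip force is the handle force divided by the scale factor. The gains, scale factor, and environment stiffness below are assumptions for illustration, not the paper's actual control law.

```python
# 1-D steady-hand force-scaling loop in contact with a linearly
# compliant object (all numeric values are illustrative assumptions).

def force_scaling_step(x, f_handle, k_env, gain=0.005, scale=25.0, dt=0.001):
    """One control step: drive tip force toward f_handle / scale."""
    f_tip = k_env * max(x, 0.0)            # compliant contact: f = k * penetration
    v = gain * (f_handle - scale * f_tip)  # velocity command from scaled force error
    return x + v * dt, f_tip

x = 0.0
for _ in range(20000):                     # simulate 20 s of contact at 1 kHz
    x, f_tip = force_scaling_step(x, f_handle=1.0, k_env=200.0)

# steady state: tip force approaches f_handle / scale = 0.04 N
```

The same structure gives "scaling" in both directions: a 1 N push at the handle produces 40 mN at the tissue, while 40 mN at the tissue is felt as 1 N of resistance by the operator.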


Medical Image Computing and Computer Assisted Intervention | 1999

Surgical Forces and Tactile Perception During Retinal Microsurgery

Puneet K. Gupta; Patrick S. Jensen; Eugene de Juan

Purpose: Vitreoretinal surgery involves the manipulation of delicate retinal membranes with a required surgical accuracy often on the order of tens of microns, a scale at or near the limit of human positional ability. In addition, forces imposed by the tissue on the surgical tool are exceedingly small. Here we investigate the magnitude of forces generated during retinal surgery in cadaveric porcine eyes and compare the results with the magnitude of forces discernible by retinal surgeons. These data will be used as a design guideline for robotic surgical augmentation systems currently under development.

Methods: The study was performed in two phases. First, retinal surgeons manipulated the retina of porcine cadaver eyes with a calibrated 1-axis force-sensing retinal pick while data were simultaneously recorded. In the second phase, blindfolded subjects held the pick and were instructed to press a button whenever an “event” was felt. Events were generated by slowly tapping the end of the pick with varying force while both the magnitudes of the applied forces and the responses of the subjects were recorded. The magnitudes of forces generated during retinal surgery were then compared with those that could be discerned by the subjects.

Results: Roughly 75% of all forces measured during retinal microsurgery were found to be less than 7.5 mN in magnitude; however, only 19.3 ± 8.1% (N=492) of events generated at this level could be felt by the subjects.

Conclusions: The results of this study indicate that the majority of retinal surgery is probably performed without the surgeon being able to “feel” interactions between retinal tissue and the surgical tool. Prior studies have indicated that relying on visual feedback alone increases the duration of manual manipulation tasks and reduces task accuracy. The lack of tactile sensation during retinal surgery could similarly affect surgical outcome adversely.
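The study's two-phase comparison boils down to two simple statistics: the fraction of surgical forces below a threshold, and the detection rate for taps at that threshold. The toy recreation below uses synthetic data (the exponential force distribution and the 20% detection probability are made-up assumptions chosen to echo the reported numbers; only the analysis steps mirror the study).

```python
# Toy recreation of the two comparisons with synthetic data.
import random

random.seed(0)

# Phase 1 (simulated): tool-tissue force magnitudes in mN.
forces = [random.expovariate(1 / 5.0) for _ in range(1000)]
frac_small = sum(f < 7.5 for f in forces) / len(forces)

# Phase 2 (simulated): detection trials at the 7.5 mN level; True = felt.
responses = [random.random() < 0.2 for _ in range(492)]
detection_rate = sum(responses) / len(responses)

# Most simulated forces fall below what the simulated subjects can feel.
```

With these assumptions, `frac_small` lands near the paper's ~75% and `detection_rate` near its ~19%, reproducing the gap that motivates force-feedback augmentation.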


Medical Image Computing and Computer Assisted Intervention | 2000

A Miniature Instrument Tip Force Sensor for Robot/Human Cooperative Microsurgical Manipulation with Enhanced Force Feedback

Peter J. Berkelman; Louis L. Whitcomb; Russell H. Taylor; Patrick S. Jensen

This paper reports the development of a new miniature force sensor to measure forces at the tip of a microsurgical instrument in three dimensions with sub-millinewton resolution. This sensor will enable enhanced force feedback during surgical intervention in which a user directly manipulates surgical instruments cooperatively with a force-reflecting robot arm. This “steady-hand” scaled force interaction enables a surgeon to sense millinewton forces between the instrument and delicate body tissues during microsurgery which would otherwise be far below the threshold of human tactile sensing. The magnified force feedback can increase the dexterity of the surgeon and improve safety by preventing large damaging forces from being exerted by the instrument. The design and analysis of the new force sensor are presented together with preliminary testing and force-scaling control results.


Medical Image Computing and Computer Assisted Intervention | 1999

A Steady-Hand Robotic System for Microsurgical Augmentation

Russell H. Taylor; Patrick S. Jensen; Louis L. Whitcomb; Aaron Barnes; Rajesh Kumar; Dan Stoianovici; Puneet K. Gupta; Zhengxian Wang; Eugene de Juan; Louis R. Kavoussi

This paper reports the development of a robotic system designed to extend a human’s ability to perform small-scale (sub-millimeter) manipulation tasks requiring human judgement, sensory integration and hand-eye coordination. Our novel approach, which we call “steady hand” micromanipulation, is for tools to be held simultaneously both by the operator’s hand and a specially designed actively controlled robot arm. The robot’s controller senses forces exerted by the operator on the tool and by the tool on the environment, and uses this information in various control modes to provide smooth, tremor-free precise positional control and force scaling. Our goal is to develop a manipulation system with the precision and sensitivity of a machine, but with the manipulative transparency and immediacy of handheld tools for tasks characterized by compliant or semi-rigid contacts with the environment.


Robot and Human Interactive Communication | 1999

Experiments with a steady hand robot in constrained compliant motion and path following

Rajesh Kumar; Patrick S. Jensen; Russell H. Taylor

We consider the problem of cooperative manipulation to improve the positioning and path-following abilities of humans. Using a specially designed actuated manipulator and steady-hand manipulation, we report on compliant path-following strategies and their experimental evaluation. Detecting lines and simple curves by processing images from an endoscope mounted on the robot, we traverse these curves autonomously, under direct user control, and in an augmented mode of user control. Anisotropic gains based on gradient information from the images reduce errors in path traversal.
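Anisotropic gain shaping of this kind can be sketched as decomposing the user's commanded velocity into components along and across the detected curve (the image gradient points across the curve), then applying full gain along it and reduced gain across it. The gain values below are illustrative assumptions; the paper's actual gains are not given here.

```python
# Anisotropic gain shaping for steady-hand path following (sketch).
import math

def shape_velocity(v_user, grad, k_along=1.0, k_across=0.2):
    """Scale the user's commanded velocity: full gain along the detected
    curve (perpendicular to the image gradient), reduced gain across it."""
    gx, gy = grad
    n = math.hypot(gx, gy)
    nx, ny = gx / n, gy / n                  # unit normal to the curve
    tx, ty = -ny, nx                         # unit tangent along the curve
    v_t = v_user[0] * tx + v_user[1] * ty    # tangential component
    v_n = v_user[0] * nx + v_user[1] * ny    # normal (error) component
    return (k_along * v_t * tx + k_across * v_n * nx,
            k_along * v_t * ty + k_across * v_n * ny)

# User pushes diagonally; the curve runs along x (gradient points along y),
# so motion across the curve is attenuated while motion along it passes.
vx, vy = shape_velocity((1.0, 1.0), grad=(0.0, 1.0))
```

Attenuating only the cross-path component lets the operator move freely along the curve while the robot resists deviations from it.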


Medical Image Computing and Computer Assisted Intervention | 2000

An Augmentation System for Fine Manipulation

Rajesh Kumar; Gregory D. Hager; Aaron Barnes; Patrick S. Jensen; Russell H. Taylor

Augmented surgical manipulation tasks can be viewed as a sequence of smaller, simpler steps driven primarily by the surgeon’s input. These steps can be abstracted as controlled interaction of the tool/end-effector with the environment. The basic research problem here is performing a sequence of control primitives. In computing terms, each of the primitives is a predefined computational routine (e.g. compliant motion or some other “macro”) with initiation and termination predicates. The sequencing of these primitives depends upon user control and the effects of the environmental interaction. We explore a sensor-driven system to perform simple manipulation tasks. The system is composed of a core set of “safe” system states plus task-specific states and transitions. Using the “steady hand” robot as the experimental platform, we investigate such a system.
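The primitive-sequencing idea above is essentially a state machine: each state runs a routine until its termination predicate fires, which names the next state. The sketch below shows that structure with invented state names and predicates; it is an illustration of the abstraction, not the paper's actual task specification.

```python
# Minimal sketch of sequencing control primitives as a state machine.

def make_machine():
    # Each state maps to (routine, termination predicate -> next state).
    def approach(ctx): ctx["z"] -= 1          # move toward the tissue
    def contact(ctx):  ctx["f"] += 1          # compliant interaction step
    def retract(ctx):  ctx["z"] += 1          # safe withdrawal

    return {
        "approach": (approach, lambda c: "contact" if c["z"] <= 0 else "approach"),
        "contact":  (contact,  lambda c: "retract" if c["f"] >= 3 else "contact"),
        "retract":  (retract,  lambda c: "done" if c["z"] >= 5 else "retract"),
    }

ctx = {"z": 3, "f": 0}                        # hypothetical height / force counters
state, states = "approach", make_machine()
trace = []
while state != "done":
    routine, predicate = states[state]
    routine(ctx)                              # run the active primitive once
    trace.append(state)
    state = predicate(ctx)                    # termination predicate picks next state
```

A "safe" core set of states would guarantee that any unexpected sensor reading transitions the system into a retract-like state rather than an undefined one.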


Medical Image Computing and Computer Assisted Intervention | 2001

Performance Evaluation of a Cooperative Manipulation Microsurgical Assistant Robot Applied to Stapedotomy

Peter J. Berkelman; Daniel L. Rothbaum; Jaydeep Roy; Samuel Lang; Louis L. Whitcomb; Gregory D. Hager; Patrick S. Jensen; Eugene de Juan; Russell H. Taylor; John K. Niparko

This paper reports the development of a full-scale instrumented model of the human ear that permits quantitative evaluation of the utility of a microsurgical assistant robot in the surgical procedure of stapedotomy.


Medical Image Computing and Computer Assisted Intervention | 1999

Performance of Robotic Augmentation in Microsurgery-Scale Motions

Rajesh Kumar; Tushar M. Goradia; Aaron Barnes; Patrick S. Jensen; Louis L. Whitcomb; Dan Stoianovici; Ludwig M. Auer; Russell H. Taylor

This paper is part of the development process of a microsurgical “cooperating” assistant. To evaluate its applicability to augmenting fine surgical motions, we test precision and operator perception in simple microsurgical-scale pick-and-place motions. Such motions are common in microsurgical procedures (e.g. micro-vascular anastomosis). The experiments test the users’ ability to position a common surgical tool to 250, 200, and 150 micrometer accuracy. These experiments were performed using two test platforms: the new “steady hand” robot designed for microsurgery and the LARS robot (a laparoscopic camera-holding robot) adapted for this purpose. Comparative results for several parameters, including time, success rate, error rate, and number of attempts, are included, along with a comparison of the performance of the two robots on these tasks. The results support our claim that the new “steady hand” robot augments human performance for microsurgery-scale motion.


International Conference of the IEEE Engineering in Medicine and Biology Society | 2000

A study of instrument motion in retinal microsurgery

Cameron N. Riviere; Patrick S. Jensen

Reports on high-precision recordings of hand-held instrument motion during actual vitreoretinal microsurgery. The movement of a hand-held instrument during vitreoretinal microsurgery was recorded in six degrees of freedom. Data were acquired for 5 min using an inertial sensing module that has been developed for use with a commercially available microsurgical instrument. Maximum velocity used by the surgeon was estimated at 0.70 m/s, and maximum acceleration at 30.1 m/s². The rms amplitude of tremor in the instrument tip motion was estimated to be 0.182 mm.
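Estimating a tremor rms like the 0.182 mm figure above amounts to separating the slow voluntary motion from the fast tremor band and taking the rms of the residual. The sketch below does this on a synthetic signal; the sample rate, the 10 Hz tremor component, and the moving-average high-pass are all illustrative assumptions standing in for the study's actual sensing and filtering.

```python
# Tremor rms estimation from a sampled tip trajectory (sketch).
import math

fs = 250                                   # Hz, assumed sample rate
t = [i / fs for i in range(fs * 5)]        # 5 s recording
# Synthetic motion: slow voluntary drift plus a 10 Hz "tremor"
# whose rms amplitude is set to 0.182 mm.
signal = [2.0 * math.sin(2 * math.pi * 0.5 * ti)
          + 0.182 * math.sqrt(2) * math.sin(2 * math.pi * 10 * ti)
          for ti in t]

def moving_avg(x, w):
    """Centered moving average with shrinking windows at the edges."""
    half = w // 2
    out = []
    for i in range(len(x)):
        win = x[max(0, i - half):i + half + 1]
        out.append(sum(win) / len(win))
    return out

drift = moving_avg(signal, w=25)           # ~0.1 s window smooths out the tremor
tremor = [s - d for s, d in zip(signal, drift)]
rms = math.sqrt(sum(v * v for v in tremor) / len(tremor))
```

A 25-sample window at 250 Hz spans exactly one 10 Hz period, so the drift estimate averages the tremor to nearly zero and the residual recovers the injected 0.182 mm rms; a real analysis would use a proper band-pass filter around the physiological tremor band.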

Collaboration

Dive into Patrick S. Jensen's collaboration.

Top co-authors:

Eugene de Juan (University of Southern California)
Aaron Barnes (University of Southern California)
Rajesh Kumar (Johns Hopkins University)
Eugene de Juan (Johns Hopkins University)