
Publication


Featured research published by Jeffrey A. Stoll.


Medical Image Analysis | 2007

GPU Based Real-time Instrument Tracking with Three Dimensional Ultrasound

Paul M. Novotny; Jeffrey A. Stoll; Nikolay V. Vasilyev; Pedro J. del Nido; Pierre E. Dupont; Todd E. Zickler; Robert D. Howe

Real-time 3D ultrasound can enable new image-guided surgical procedures, but high data rates prohibit the use of traditional tracking techniques. We present a new method based on the modified Radon transform that identifies the axis of instrument shafts as bright patterns in planar projections. Instrument rotation and tip location are then determined using fiducial markers. These techniques are amenable to rapid execution on the current generation of personal computer graphics processor units (GPU). Our GPU implementation detected a surgical instrument in 31 ms, sufficient for real-time tracking at the 26 volumes per second rate of the ultrasound machine. A water tank experiment found instrument tip position errors of less than 0.2 mm, and an in vivo study tracked an instrument inside a beating porcine heart. The tracking results showed good correspondence to the actual movements of the instrument.
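
The core idea, shaft-as-bright-line detection via planar projection, can be sketched in a few lines. The following is an illustrative NumPy toy (names like `detect_shaft_axis` and the synthetic volume are invented here), not the paper's GPU implementation or its modified Radon transform:

```python
import numpy as np

def detect_shaft_axis(volume, n_angles=180):
    """Toy sketch: collapse a 3D volume into a planar projection, then
    sweep candidate directions and pick the one along which the bright
    pixels line up. Illustration only, not the authors' algorithm."""
    # Planar projection: sum intensity along the depth axis.
    proj = volume.sum(axis=0)
    h, w = proj.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    ys, xs = np.nonzero(proj > 0.5 * proj.max())   # bright pixels
    weights = proj[ys, xs]
    best_angle, best_score = 0.0, -np.inf
    for theta in np.linspace(0.0, np.pi, n_angles, endpoint=False):
        # Signed distance of each bright pixel from a line through the
        # center with direction theta; small spread means the pixels
        # lie along that direction.
        d = (xs - cx) * np.sin(theta) - (ys - cy) * np.cos(theta)
        score = -np.average(d**2, weights=weights)
        if score > best_score:
            best_score, best_angle = score, theta
    return best_angle

# Synthetic volume containing a bright diagonal "shaft".
vol = np.zeros((8, 64, 64))
for i in range(64):
    vol[:, i, i] = 1.0        # 45-degree line in the projection
angle = detect_shaft_axis(vol)   # ~ pi/4
```

The real system batches such projections and scoring on the GPU, which is what brings detection down to the reported 31 ms.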


Computer Aided Surgery | 2003

Real-Time Three-Dimensional Ultrasound for Guiding Surgical Tasks

Jeremy W. Cannon; Jeffrey A. Stoll; Ivan S. Salgo; Heather Knowles; Robert D. Howe; Pierre E. Dupont; Gerald R. Marx; Pedro J. del Nido

Objective: As a stand-alone imaging modality, two-dimensional (2D) ultrasound (US) can only guide basic interventional tasks due to the limited spatial orientation information contained in these images. High-resolution real-time three-dimensional (3D) US can potentially overcome this limitation, thereby expanding the applications for US-guided procedures to include intracardiac surgery and fetal surgery, while potentially improving results of solid organ interventions such as image-guided breast, liver or prostate procedures. The following study examines the benefits of real-time 3D US for performing both basic and complex image-guided surgical tasks. Materials and Methods: Seven surgical trainees performed three tasks in an acoustic testing tank simulating an image-guided surgical environment using 2D US, biplanar 2D US, and 3D US for guidance. Surgeon-controlled US imaging was also tested. The evaluation tasks were (1) bead-in-hole navigation; (2) bead-to-bead navigation; and (3) clip fixation. Performance measures included completion time, tool tip trajectory, and error rates, with endoscope-guided performance serving as a gold-standard reference measure for each subject. Results: Compared to 2D US guidance, completion times decreased significantly with 3D US for both bead-in-hole navigation (50%, p = 0.046) and bead-to-bead navigation (77%, p = 0.009). Furthermore, tool-tip tracking for bead-to-bead navigation demonstrated improved navigational accuracy using 3D US versus 2D US (46%, p = 0.040). Biplanar 2D imaging and surgeon-controlled 2D US did not significantly improve performance as compared to conventional 2D US. In real-time 3D mode, surgeon-controlled imaging and changes in 3D image presentation made by adjusting the perspective of the 3D image did not diminish performance. For clip fixation, completion times proved excessive with 2D US guidance (> 240 s). However, with real-time 3D US imaging, completion times and error rates were comparable to endoscope-guided performance. Conclusions: Real-time 3D US can guide basic surgical tasks more efficiently and accurately than 2D US imaging. Real-time 3D US can also guide more complex surgical tasks which may prove useful for procedures where optical imaging is suboptimal, as in fetal surgery or intracardiac interventions.


International Conference on Robotics and Automation | 2003

Port placement planning in robot-assisted coronary artery bypass

Jeremy W. Cannon; Jeffrey A. Stoll; Shaun Selha; Pierre E. Dupont; Robert D. Howe; David F. Torchiana

Properly selected port sites for robot-assisted coronary artery bypass graft (CABG) improve the efficiency and quality of these procedures. In clinical practice, surgeons select port locations using external anatomic landmarks to estimate a patient's internal anatomy. This paper proposes an automated approach to port selection based on a preoperative image of the patient, thus avoiding the need to estimate internal anatomy. Using this image as input, port sites are chosen from a grid of surgeon-approved options by defining a performance measure for each possible port triad. This measure seeks to minimize the weighted squared deviation of the instrument and endoscope angles from their optimal orientations at each internal surgical site. This performance measure proves insensitive to perturbations in both its weighting factors and moderate intraoperative displacements of the patient's internal anatomy. A validation study of this cardiac port site selection algorithm was also performed. Six surgeons dissected model vessels using the port triad selected by this algorithm, with performance compared to dissection using a surgeon-selected port triad and a port triad template described by Tabaie et al., 1999. With the algorithm-selected ports, dissection speed increased by up to 43% (p = 0.046) with less overall vessel trauma. Thus, this algorithmic approach to port site selection has important clinical implications for robot-assisted CABG which warrant further investigation.
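
The triad-scoring scheme, score every candidate triad by a weighted squared angular deviation and keep the minimum, can be sketched as follows. All geometry, the 90-degree "optimal" approach angle, the weights, and the grid coordinates are illustrative assumptions, not the paper's calibrated values:

```python
import numpy as np
from itertools import combinations

def triad_cost(ports, targets, optimal_angle=np.pi / 2, weights=(1.0, 1.0, 1.0)):
    """Toy cost for one candidate port triad: sum over internal target
    sites of the weighted squared deviation of each port's approach
    angle from a preferred angle. Stand-in for the paper's measure."""
    cost = 0.0
    for target in targets:
        for w, port in zip(weights, ports):
            v = target - port
            # Elevation of the approach direction relative to the chest
            # wall plane (taken as z = 0 here, purely for illustration).
            angle = np.arcsin(abs(v[2]) / np.linalg.norm(v))
            cost += w * (angle - optimal_angle) ** 2
    return cost

def select_ports(candidate_sites, targets, k=3):
    """Exhaustively score all k-port combinations from a
    surgeon-approved grid and return the lowest-cost triad."""
    return min(combinations(candidate_sites, k),
               key=lambda triad: triad_cost(np.array(triad), targets))

# Hypothetical grid of approved port sites and two internal target sites.
grid = [np.array([x, y, 0.0]) for x in (0.0, 5.0, 10.0) for y in (0.0, 5.0)]
targets = [np.array([5.0, 2.0, -8.0]), np.array([6.0, 3.0, -9.0])]
best = select_ports(grid, targets)
```

Because the grid is small and surgeon-approved, brute-force enumeration of triads is tractable, which matches the paper's choice of scoring every possible triad rather than running a continuous optimization.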


Medical Image Computing and Computer-Assisted Intervention | 2003

A Navigation System for Augmenting Laparoscopic Ultrasound

James Ellsmere; Jeffrey A. Stoll; David W. Rattner; David M. Brooks; Robert A. Kane; William M. Wells; Ron Kikinis; Kirby G. Vosburgh

Establishing image context is the major difficulty of performing laparoscopic ultrasound. The standard techniques used by transabdominal ultrasonographers to understand image orientation are difficult to apply with laparoscopic instruments. In this paper, we describe a navigation system that displays the position and orientation of laparoscopic ultrasound images to the operating surgeon in real time. The display technique we developed for showing the orientation information uses a 3D model of the aorta as the main visual reference. This technique is helpful because it provides surgeons with important spatial cues, which we show improves their ability to interpret the laparoscopic ultrasound.


International Conference on Robotics and Automation | 2007

Real-Time Visual Servoing of a Robot Using Three-Dimensional Ultrasound

Paul M. Novotny; Jeffrey A. Stoll; Pierre E. Dupont; Robert D. Howe

This paper presents a robotic system capable of using three-dimensional ultrasound to guide a surgical instrument to a tracked target location. Tracking of both the surgical instrument and target was done using image based algorithms on the real-time 3D ultrasound data. The tracking techniques are shown to be especially amenable for execution on powerful graphics processor units. By harnessing a graphics card, it was possible to detect both a surgical instrument and a surgical target at a rate of 25 Hz. The high update rate permits the use of tracked instrument and target locations for controlling a robot. Validation of the system was done in a water tank, where the robot moved the instrument to the target site with a mean error of 1.2 mm.
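
The closed loop described above, tracker supplies fresh instrument and target positions each volume, controller commands the robot toward the target, can be sketched with a minimal proportional step. The gain and positions are arbitrary illustrations, not the authors' controller:

```python
import numpy as np

def servo_step(instrument_pos, target_pos, gain=0.5):
    """One image-based servo update: command the instrument a fraction
    of the way toward the tracked target. A minimal proportional
    sketch of such a loop, not the paper's control law."""
    return instrument_pos + gain * (target_pos - instrument_pos)

# Simulate one second of the 25 Hz loop: each tick, tracking supplies
# positions and the robot steps toward the target.
pos = np.array([0.0, 0.0, 0.0])
target = np.array([10.0, -4.0, 6.0])
for _ in range(25):
    pos = servo_step(pos, target)
err = np.linalg.norm(pos - target)   # converges well below 1 mm
```

The point of the 25 Hz update rate is exactly this: with tracking as fast as the imaging, the error signal is fresh enough for stable closed-loop positioning.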


IEEE Transactions on Medical Imaging | 2012

Passive Markers for Tracking Surgical Instruments in Real-Time 3-D Ultrasound Imaging

Jeffrey A. Stoll; Hongliang Ren; Pierre E. Dupont

A family of passive echogenic markers is presented by which the position and orientation of a surgical instrument can be determined in a 3-D ultrasound volume, using simple image processing. Markers are attached near the distal end of the instrument so that they appear in the ultrasound volume along with the instrument tip. They are detected and measured within the ultrasound image, thus requiring no external tracking device. This approach facilitates imaging instruments and tissue simultaneously in ultrasound-guided interventions. Marker-based estimates of instrument pose can be used in augmented reality displays or for image-based servoing. Design principles for marker shapes are presented that ensure imaging system and measurement uniqueness constraints are met. An error analysis is included that can be used to guide marker design and which also establishes a lower bound on measurement uncertainty. Finally, examples of marker measurement and tracking algorithms are presented along with experimental validation of the concepts.
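
A generic sketch of one piece of this pipeline, recovering the shaft axis from segmented marker centroids by a least-squares line fit, is shown below. The marker geometry, decoding of rotation, and error model from the paper are not reproduced; the point set and noise are invented for illustration:

```python
import numpy as np

def estimate_shaft_pose(marker_points):
    """Fit the instrument shaft axis to detected marker centroids via
    the principal component of the point cloud. A generic sketch of
    axis recovery, not the paper's marker-decoding algorithm."""
    pts = np.asarray(marker_points, dtype=float)
    centroid = pts.mean(axis=0)
    # Dominant direction of the centered points is the shaft axis.
    _, _, vt = np.linalg.svd(pts - centroid)
    axis = vt[0]
    if axis[2] < 0:              # fix a consistent sign convention
        axis = -axis
    return centroid, axis

# Markers spaced along a known line, with mild "segmentation noise".
rng = np.random.default_rng(0)
true_axis = np.array([0.0, 0.6, 0.8])
pts = [i * 5.0 * true_axis + rng.normal(0, 0.05, 3) for i in range(5)]
centroid, axis = estimate_shaft_pose(pts)
```

In the actual marker family, asymmetric features along the shaft additionally encode roll and tip offset, which a line fit alone cannot recover; that is what the design principles and uniqueness constraints in the paper address.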


Medical Image Computing and Computer-Assisted Intervention | 2005

Passive markers for ultrasound tracking of surgical instruments

Jeffrey A. Stoll; Pierre E. Dupont

A family of passive markers is presented by which the position and orientation of a surgical instrument can be computed from its ultrasound image using simple image processing. These markers address the problem of imaging instruments and tissue simultaneously in ultrasound-guided interventions. Marker-based estimates of instrument location can be used in augmented reality displays or for image-based servoing. Marker design, measurement techniques and error analysis are presented. Experimentally determined in-vitro measurement errors of 0.22 mm in position and 0.089 rad in orientation were obtained using a standard ultrasound imaging system.


Current Problems in Surgery | 2009

Image Guided Surgical Interventions

Douglas P. Perrin; Nikolay V. Vasilyev; Paul M. Novotny; Jeffrey A. Stoll; Robert D. Howe; Pierre E. Dupont; Ivan S. Salgo; Pedro J. del Nido

Surgeons have traditionally performed procedures to treat diseases by gaining direct access to the internal structures involved, and using direct visual inspection to diagnose and treat the defects. Much effort has gone into identifying the most appropriate incisions and approaches to enable full access inside body cavities, specific organs, or musculoskeletal structures. Imaging has traditionally been used primarily for preoperative diagnosis and at times for surgical planning. Intraoperative imaging, when used, was meant to provide further diagnostic information or to assess adequacy of repair. In most cases radiograph static images or fluoroscopy have been used in the operating room. As the application of less invasive procedures has progressed, other imaging technology has been applied in an effort to address the limitations of simple radiographs or fluoroscopy. Computed tomography (CT), magnetic resonance imaging (MRI), ultrasound, nuclear radiographic imaging, and modified optical imaging have been introduced to provide the information required to plan and perform complex interventions inside the body without the need for direct open visual inspection.

In parallel with the developments in imaging modalities, endoscopic surgery has advanced with the introduction of rigid and flexible scopes equipped with video cameras to magnify and display the image obtained. Advances in optics and digital electronics have combined to provide unparalleled image quality even with small diameter scopes, resulting in an explosion of endoscopic procedures involving virtually every structure in the body. The only real limitation to imaging has been the inability to see or “image” through opaque structures, since the irradiating or “illuminating” energy provided through the scope has been almost exclusively visible light. This limitation has confined endoscopic surgery to areas where a natural body cavity or physical space can be accessed with a scope and instruments, and filled with nonopaque medium such as gas or clear fluid. Despite these limitations, optical endoscopy has revolutionized the way many surgical procedures are performed, and has spawned a whole industry of instrument manufacturers that, in conjunc-


International Symposium on Experimental Robotics | 2000

Cooperative Human and Machine Perception in Teleoperated Assembly

Thomas Debus; Jeffrey A. Stoll; Robert D. Howe; Pierre E. Dupont

This paper presents results on a teleoperator expert assistant — a system that in cooperation with a human operator estimates properties of remote environment objects in order to improve task performance. Specifically, an undersea connector-mating task is investigated in the laboratory using a PHANToM master and WAM remote manipulator. Estimates of socket orientation are obtained during task performance and conveyed to the operator through a graphical display. Task performance, measured by completion time and peak insertion force, is compared for operators using combinations of video images, the graphical display and a shared control mode in which the connector automatically rotates to the estimated socket orientation. The graphical display and automatic orientation controller reduce task completion times and contact forces by over one-third for inclined sockets when the video signal is noisy, e.g., due to water turbidity.


Medical Image Computing and Computer-Assisted Intervention | 2006

GPU Based Real-Time Instrument Tracking with Three Dimensional Ultrasound

Paul M. Novotny; Jeffrey A. Stoll; Nikolay V. Vasilyev; Pedro J. del Nido; Pierre E. Dupont; Robert D. Howe

Real-time three-dimensional ultrasound enables new intracardiac surgical procedures, but the distorted appearance of instruments in ultrasound poses a challenge to surgeons. This paper presents a detection technique that identifies the position of the instrument within the ultrasound volume. The algorithm uses a form of the generalized Radon transform to search for long straight objects in the ultrasound image, a feature characteristic of instruments and not found in cardiac tissue. When combined with passive markers placed on the instrument shaft, the full position and orientation of the instrument is found in 3D space. This detection technique is amenable to rapid execution on the current generation of personal computer graphics processor units (GPU). Our GPU implementation detected a surgical instrument in 31 ms, sufficient for real-time tracking at the 25 volumes per second rate of the ultrasound machine. A water tank experiment found instrument orientation errors of 1.1 degrees and tip position errors of less than 1.8mm. Finally, an in vivo study demonstrated successful instrument tracking inside a beating porcine heart.

Collaboration

Top co-authors of Jeffrey A. Stoll and their affiliations:

Pierre E. Dupont (Boston Children's Hospital)
Pedro J. del Nido (Boston Children's Hospital)
Gerald R. Marx (Boston Children's Hospital)
Jeremy W. Cannon (Massachusetts Institute of Technology)
John K. Triedman (Boston Children's Hospital)
Kirby G. Vosburgh (Brigham and Women's Hospital)