Publications


Featured research published by Timothy D. Soper.


Journal of Biophotonics | 2010

Scanning fiber endoscopy with highly flexible, 1-mm catheterscopes for wide-field, full-color imaging

Cameron M. Lee; Christoph J. Engelbrecht; Timothy D. Soper; Fritjof Helmchen; Eric J. Seibel

In modern endoscopy, wide field of view and full color are considered necessary for navigating inside the body, inspecting tissue for disease and guiding interventions such as biopsy or surgery. Current flexible endoscope technologies suffer from reduced resolution when device diameter shrinks. Endoscopic procedures today, using coherent fiber-bundle technology on the scale of 1 mm, are performed with such poor image quality that the clinician's vision meets the criteria for legal blindness. Here, we review a new and versatile scanning fiber-imaging technology and describe its implementation for ultrathin and flexible endoscopy. This scanning fiber endoscope (SFE) or catheterscope enables high-quality, laser-based, video imaging for ultrathin clinical applications, while also providing new options for in vivo biological research of subsurface tissue and high-resolution fluorescence imaging.


IEEE Transactions on Biomedical Engineering | 2010

In Vivo Validation of a Hybrid Tracking System for Navigation of an Ultrathin Bronchoscope Within Peripheral Airways

Timothy D. Soper; David R. Haynor; Robb W. Glenny; Eric J. Seibel

Transbronchial biopsy of peripheral lung nodules is hindered by the inability to access lesions endoluminally due to the large diameter of conventional bronchoscopes. An ultrathin scanning fiber bronchoscope has recently been developed to advance image-guided biopsy several branching generations deeper into the peripheral airways. However, navigating a potentially complex 3-D path to the region of interest presents a challenge to the bronchoscopist. An accompanying guidance system has also been developed to track the bronchoscope through the airways and show its position and intended path on a virtual display. Intraoperative localization of the bronchoscope was achieved by combining electromagnetic tracking (EMT) and image-based tracking (IBT). An error-state Kalman filter was used to model the disagreement between the two tracking sources. The positional tracking error was reduced from 14.22 and 14.92 mm by independent EMT and IBT, respectively, to 6.74 mm using the hybrid approach. Hybrid tracking of the scope orientation and respiratory motion compensation further improved tracking accuracy and stability, resulting in an average tracking error of 3.33 mm and 10.01°.
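
The hybrid approach described above blends two noisy position sources with an error-state Kalman filter. The sketch below is a minimal illustration of that idea, not the authors' implementation: it fuses two simulated 1-D position streams (stand-ins for EMT and IBT) with a plain linear Kalman filter, and all noise values are invented for the example.

```python
import numpy as np

# Minimal sketch: fusing two noisy 1-D position streams (stand-ins for
# EMT and image-based tracking) with a linear Kalman filter. All values
# are illustrative, not taken from the paper.
rng = np.random.default_rng(0)
truth = np.cumsum(rng.normal(0.0, 0.5, size=200))   # simulated scope path (mm)
emt = truth + rng.normal(0.0, 3.0, size=200)        # EMT-like measurement
ibt = truth + rng.normal(0.0, 3.0, size=200)        # image-based measurement

x, p = 0.0, 10.0                   # state estimate and its variance
q, r_emt, r_ibt = 0.3, 9.0, 9.0    # process and measurement noise variances
fused = []
for z_emt, z_ibt in zip(emt, ibt):
    p += q                                         # predict (random-walk motion model)
    for z, r in ((z_emt, r_emt), (z_ibt, r_ibt)):  # sequential measurement updates
        k = p / (p + r)                            # Kalman gain
        x += k * (z - x)
        p *= (1.0 - k)
    fused.append(x)

print("RMS error, EMT only :", np.sqrt(np.mean((emt - truth) ** 2)))
print("RMS error, fused    :", np.sqrt(np.mean((np.array(fused) - truth) ** 2)))
```

Running this typically shows the fused estimate with lower RMS error than either source alone, mirroring the qualitative behavior reported above.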


IEEE Transactions on Biomedical Engineering | 2012

Surface Mosaics of the Bladder Reconstructed From Endoscopic Video for Automated Surveillance

Timothy D. Soper; Michael P. Porter; Eric J. Seibel

Flexible cystoscopy is frequently performed for recurrent bladder cancer surveillance, making bladder cancer the most expensive cancer to treat over the patient's lifetime. An automated bladder surveillance system is being developed to robotically scan the bladder surface using an ultrathin and highly flexible endoscope. Such a system would allow cystoscopic procedures to be overseen by technical staff while urologists could review cystoscopic video postoperatively. In this paper, we demonstrate a method for reconstructing the surface of the whole bladder from endoscopic video using structure from motion. Video is acquired from a custom ultrathin and highly flexible endoscope that can retroflex to image the entire internal surface of the bladder. Selected frames are subsequently stitched into a mosaic and mapped to a reconstructed surface, creating a 3-D surface model of the bladder that can be expediently reviewed. Our software was tested on endoscopic video of an excised pig bladder. The resulting reconstruction possessed a projection error of 1.66 pixels on average and covered 99.6% of the bladder surface area.
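
As a rough illustration of the structure-from-motion step described above (not the authors' pipeline), the sketch below performs a single two-view reconstruction with OpenCV: ORB feature matching, essential-matrix estimation, relative pose recovery, and triangulation. It assumes grayscale frames and a known intrinsic matrix `K` from a calibrated endoscope.

```python
import cv2
import numpy as np

def two_view_reconstruction(img1, img2, K):
    """Minimal two-view structure-from-motion step (illustrative only):
    match ORB features, estimate the essential matrix, recover the
    relative camera pose, and triangulate sparse 3-D surface points."""
    orb = cv2.ORB_create(2000)
    k1, d1 = orb.detectAndCompute(img1, None)
    k2, d2 = orb.detectAndCompute(img2, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(d1, d2), key=lambda m: m.distance)[:500]
    pts1 = np.float32([k1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([k2[m.trainIdx].pt for m in matches])

    # Robustly estimate the essential matrix and the relative pose.
    E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
    _, R, t, mask = cv2.recoverPose(E, pts1, pts2, K, mask=mask)

    # Triangulate matched points into 3-D using both camera projections.
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = K @ np.hstack([R, t])
    pts4d = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
    return (pts4d[:3] / pts4d[3]).T   # N x 3 sparse surface points
```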


Proceedings of SPIE | 2011

Constructing spherical panoramas of a bladder phantom from endoscopic video using bundle adjustment

Timothy D. Soper; John E. Chandler; Michael P. Porter; Eric J. Seibel

The high recurrence rate of bladder cancer requires patients to undergo frequent surveillance screenings over their lifetime following initial diagnosis and resection. Our laboratory is developing panoramic stitching software that would compile several minutes of cystoscopic video into a single panoramic image, covering the entire bladder, for review by a urologist at a later time or remote location. Global alignment of video frames is achieved by using a bundle adjuster that simultaneously recovers both the 3D structure of the bladder and the scope motion using only the video frames as input. The result of the algorithm is a complete 360° spherical panorama of the outer surface. The details of the software algorithms are presented here along with results from both a virtual cystoscopy as well as from real endoscopic imaging of a bladder phantom. The software successfully stitched several hundred video frames into a single panorama with subpixel accuracy and with no knowledge of the intrinsic camera properties, such as focal length and radial distortion. In the discussion, we outline future work in development of the software as well as identifying factors pertinent to clinical translation of this technology.
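
Bundle adjustment of the kind described above jointly refines camera poses and scene structure by minimizing reprojection error. The toy sketch below shows the core idea under simplifying assumptions: a pinhole camera with no lens distortion, axis-angle pose parameterization, and hypothetical observation arrays `cam_idx`, `pt_idx`, and `obs_2d`. It uses `scipy.optimize.least_squares` with a dense Jacobian; practical systems exploit the problem's sparsity.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def project(points, cam_params, K):
    """Pinhole projection of 3-D points, one camera per observation.
    cam_params rows: [rx, ry, rz, tx, ty, tz] (axis-angle + translation)."""
    R = Rotation.from_rotvec(cam_params[:, :3])
    p_cam = R.apply(points) + cam_params[:, 3:6]
    p = p_cam @ K.T
    return p[:, :2] / p[:, 2:3]

def residuals(params, n_cams, n_pts, cam_idx, pt_idx, obs_2d, K):
    # Unpack stacked camera and point parameters, then compute
    # reprojection error for every observation.
    cams = params[: n_cams * 6].reshape(n_cams, 6)
    pts = params[n_cams * 6:].reshape(n_pts, 3)
    proj = project(pts[pt_idx], cams[cam_idx], K)
    return (proj - obs_2d).ravel()

def bundle_adjust(cams0, pts0, cam_idx, pt_idx, obs_2d, K):
    """Jointly refine camera poses and 3-D points by minimizing
    reprojection error (toy dense-Jacobian version)."""
    x0 = np.hstack([cams0.ravel(), pts0.ravel()])
    res = least_squares(residuals, x0, method="trf",
                        args=(len(cams0), len(pts0), cam_idx, pt_idx, obs_2d, K))
    n = len(cams0) * 6
    return res.x[:n].reshape(-1, 6), res.x[n:].reshape(-1, 3)
```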


IEEE/ASME Transactions on Mechatronics | 2014

Controlling the Trajectory of a Flexible Ultrathin Endoscope for Fully Automated Bladder Surveillance

Matthew R. Burkhardt; Timothy D. Soper; Woon Jong Yoon; Eric J. Seibel

During cystoscopy, the urologist manually steers a cystoscope inside a patient's bladder to visually inspect the inner surface. Cystoscopies are performed as part of surveillance for bladder cancer, making it the most expensive cancer to treat over a patient's lifetime. An automated bladder scanning system has been devised to reduce workload and cost by relieving the urologist from performing surveillance. Presented here is a proof-of-concept apparatus that controls the motion of a miniature flexible endoscope. Image-based feedback is used to adjust the endoscope's movement so that captured images overlap with one another, ensuring that the entire inner surface of the bladder is imaged. Within a bladder phantom, the apparatus adaptively created and followed a spherical scan pattern comprising 13 individual latitudes and 508 captured images, while accepting between 60% and 90% image overlap between adjacent images. The elapsed time and number of captured images were sensitive to the apparatus's placement within the phantom and the acceptable image overlap percentage range. A mosaic of captured images was generated to validate comprehensive surveillance. Overall, a robotically controlled endoscope used in conjunction with image-based feedback may permit fully automated and comprehensive bladder surveillance to be conducted without direct clinician oversight.
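
A minimal sketch of overlap-driven scan control is given below. It is illustrative only: `capture_frame`, `move_scope`, and `estimate_overlap` are hypothetical stand-ins for the imaging, tip-bending actuation, and image-registration components, and the 60% to 90% overlap band mirrors the range quoted above.

```python
def scan_latitude(capture_frame, move_scope, estimate_overlap,
                  step=1.0, lo=0.60, hi=0.90, max_moves=200):
    """Advance the scope tip along one latitude of the scan pattern,
    adapting the angular step so that consecutive captured images overlap
    by a fraction between `lo` and `hi` (illustrative sketch only)."""
    frames = [capture_frame()]
    for _ in range(max_moves):
        move_scope(step)
        frame = capture_frame()
        overlap = estimate_overlap(frames[-1], frame)  # e.g. from feature matching
        if overlap < lo:
            # Moved too far: back up and retry with a smaller step.
            move_scope(-step)
            step *= 0.5
            continue
        if overlap > hi:
            # Too much redundancy: lengthen the step for the next move.
            step *= 1.5
        frames.append(frame)
    return frames
```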


Proceedings of SPIE | 2009

Validation of CT-video registration for guiding a novel ultrathin bronchoscope to peripheral lung nodules using electromagnetic tracking

Timothy D. Soper; David R. Haynor; Robb W. Glenny; Eric J. Seibel

The development of an ultrathin scanning fiber bronchoscope (SFB) at the University of Washington permits bronchoscopic examination of small peripheral airways inaccessible to conventional bronchoscopes. Due to the extensive branching in higher generation airways, a form of bronchoscopic guidance is needed. For accurate intraoperative localization of the SFB, we propose a hybrid approach, using electromagnetic tracking (EMT) and 2D/3D registration of bronchoscopic video images to a preoperative CT scan. Three similarity metrics were evaluated for CT-video registration, including normalized mutual information (NMI), dark-weighted NMI (dw-NMI), and a surface gradient matching (SGM) strategy. Across four bronchoscopic sessions, CT-video registration using SGM proved to be more robust than the NMI-based metrics, averaging 320 frames of tracking before failure, as compared with 100- and 160-frame averages for the NMI and dw-NMI metrics, respectively. In the hybrid configuration, EMT and CT-video registration were blended using a Kalman filter to recursively refine the registration error between the EMT system and airway anatomy. As part of the implementation, respiratory motion compensation (RMC) was implemented by adaptively estimating respiratory phase-dependent deformation. With the addition of RMC, average hybrid tracking disagreement with a set of manually registered key frames was 3.36 mm, as compared with 6.30 mm when RMC was not used. In peripheral airway regions that undergo larger respiratory-induced deformation, disagreement was only 2.01 mm with RMC on average, as compared with 8.65 mm otherwise.
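
Of the similarity metrics mentioned above, normalized mutual information is straightforward to sketch. The function below computes NMI between two grayscale images, for example a rendered virtual-bronchoscopy view and a video frame; it is illustrative only and does not include the dark-weighted or surface-gradient variants evaluated in the paper.

```python
import numpy as np

def normalized_mutual_information(img_a, img_b, bins=32):
    """Normalized mutual information NMI = (H(A) + H(B)) / H(A, B) between
    two grayscale images, a common similarity metric for registering a
    rendered CT (virtual bronchoscopy) view against a real video frame."""
    # Joint histogram of intensities, normalized to a joint probability table.
    hist_2d, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    p_ab = hist_2d / hist_2d.sum()
    p_a = p_ab.sum(axis=1)   # marginal distribution of image A
    p_b = p_ab.sum(axis=0)   # marginal distribution of image B

    def entropy(p):
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    return (entropy(p_a) + entropy(p_b)) / entropy(p_ab.ravel())
```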


Medical Imaging 2007: Visualization and Image-Guided Procedures | 2007

An Interactive 3D User Interface for Guided Bronchoscopy

Indriyati Atmosukarto; Timothy D. Soper; Robb W. Glenny; Eric J. Seibel; Linda G. Shapiro

Recent studies have shown that more than 5 million bronchoscopy procedures are performed each year worldwide. The procedure usually involves biopsy of possibly cancerous tissue from the lung. Standard bronchoscopes are too large to reach into the peripheral lung, where cancerous nodules are often found. The University of Washington has developed an ultrathin and flexible scanning fiber endoscope that is able to advance into the periphery of the human lungs without sacrificing image quality. To accompany the novel endoscope, we have developed a user interface that serves as a navigation guide for doctors when performing a bronchoscopy. The navigation system consists of a virtual surface mesh of the airways extracted from a computed-tomography (CT) scan and an electromagnetic tracking system (EMTS). The complete system can be viewed as a global positioning system for the lung that provides pre-procedural planning functionalities, virtual bronchoscopy navigation, and real-time tracking of the endoscope inside the lung. The real-time virtual navigation is complemented by a particle filter algorithm to compensate for registration errors and outliers, and to prevent the tracked endoscope position from passing through surfaces of the virtual lung model. The particle filter method tracks the endoscope tip based on real-time tracking data and attaches the virtual endoscopic view to the skeleton that runs inside the virtual airway surface. Experimental results on a dried sheep lung show that the particle filter method converges and is able to accurately track the endoscope tip in real time when the endoscope is inserted at both slow and fast speeds.
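
A bare-bones particle-filter cycle along the lines described above might look like the sketch below. It is not the authors' implementation: `inside_airway` is a hypothetical test against the segmented airway mesh, and the motion and measurement models are simplistic placeholders.

```python
import numpy as np

def particle_filter_step(particles, control, measurement, inside_airway,
                         motion_noise=1.0, meas_noise=3.0):
    """One predict/update/resample cycle for tracking the endoscope tip
    (illustrative sketch). `particles` is an (N, 3) array of candidate tip
    positions, `control` the commanded displacement, `measurement` the EM
    tracker reading, and `inside_airway` a hypothetical test that discards
    particles which would pass through the virtual lung surface."""
    rng = np.random.default_rng()
    # Predict: move every particle by the commanded displacement, plus noise.
    particles = particles + control + rng.normal(0.0, motion_noise, particles.shape)
    # Update: weight by agreement with the measurement and the anatomy.
    d2 = np.sum((particles - measurement) ** 2, axis=1)
    weights = np.exp(-0.5 * d2 / meas_noise ** 2)
    weights *= np.array([inside_airway(p) for p in particles], dtype=float)
    weights /= weights.sum()
    # Resample in proportion to the weights.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx]
```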


Proceedings of SPIE | 2013

Detecting fluorescence hot-spots using mosaic maps generated from multimodal endoscope imaging

Chenying Yang; Timothy D. Soper; Eric J. Seibel

Fluorescence-labeled biomarkers can be detected during endoscopy to guide early cancer biopsies, such as for high-grade dysplasia in Barrett's esophagus. To enhance intraoperative visualization of the fluorescence hot-spots, a mosaicking technique was developed to create full anatomical maps of the lower esophagus and associated fluorescent hot-spots. The resultant mosaic map contains overlaid reflectance and fluorescence images and can be used to assist biopsy and document findings. The mosaicking algorithm uses the reflectance images to calculate image registration between successive frames, and applies this registration to the simultaneously acquired fluorescence images. During this mosaicking process, the fluorescence signal is enhanced through multi-frame averaging. Preliminary results showed that the technique promises to improve the detectability of the hot-spots due to the enhanced fluorescence signal.
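
As a rough sketch of the idea (not the authors' algorithm), the function below registers successive reflectance frames with a homography, applies the same warp to the co-acquired fluorescence frame, and accumulates sums and coverage counts so that weak fluorescence signal builds up across frames. A planar motion model and grayscale inputs are assumed.

```python
import cv2
import numpy as np

def add_frame_to_mosaic(refl_prev, refl_curr, fluo_curr,
                        mosaic_sum, mosaic_count, H_prev):
    """Register the current reflectance frame to the previous one and apply
    the same warp to the simultaneously acquired fluorescence frame,
    accumulating overlapping pixels for multi-frame averaging
    (illustrative sketch with a homography motion model)."""
    # Feature-based registration between successive reflectance frames.
    orb = cv2.ORB_create(1000)
    k1, d1 = orb.detectAndCompute(refl_prev, None)
    k2, d2 = orb.detectAndCompute(refl_curr, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(d2, d1)
    src = np.float32([k2[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([k1[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)

    # Chain the frame-to-frame warp into mosaic coordinates and apply it
    # to the fluorescence channel.
    H_total = H_prev @ H
    h, w = mosaic_sum.shape[:2]
    warped = cv2.warpPerspective(fluo_curr.astype(np.float32), H_total, (w, h))
    covered = cv2.warpPerspective(np.ones_like(fluo_curr, np.float32), H_total, (w, h))
    mosaic_sum += warped
    mosaic_count += covered
    return mosaic_sum, mosaic_count, H_total
```

The averaged fluorescence mosaic would then be obtained as `mosaic_sum / np.maximum(mosaic_count, 1)`.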


SPIE Newsroom | 2011

New approaches to bladder-surveillance endoscopy

Timothy D. Soper; Eric J. Seibel; Michael Porter

Bladder cancer is the fifth most common cancer in the United States [1] and has a 50% recurrence rate. Consequently, patients undergo frequent surveillance, where a flexible endoscope is inserted into the bladder to detect recurrent tumors. The exam can be uncomfortable, in part because of the large (5 mm) devices currently employed. While use of smaller endoscopes is desirable, they suffer from reduced resolution and field of view, making examination and detection challenging. Additionally, bladder surveillance constitutes a significant percentage of urologists' time and resources, and is costly. While avenues for improved detection of bladder cancer, such as biomarkers [2], fluorescence imaging [3], and narrow-band imaging [4], are areas of current investigation, conventional endoscopy remains the gold standard. Limitations of current devices have spurred development of mosaicking systems that generate panoramic views of the bladder. Constructed from multiple overlapping images, mosaics provide expanded views and greater visual context for in situ detection and assessment of mucosal changes associated with carcinoma. However, the resulting panoramas are limited to localized regions of the bladder and are unable to provide full, sweeping 360° views. Here, we report our developments toward automated surveillance that uses novel endoscopic technology and image-analysis software to reconstruct full 3D panoramas of the bladder [5]. We developed an ultrathin scanning-fiber endoscope (SFE) whose small diameter (1.5 mm) and superior imaging capabilities make it ideal for endoscopic surveillance (see Figure 1) [6]. In addition to mitigating patient discomfort, we have configured the SFE with an automated tip-bending system that allows machine-controlled surveillance endoscopy [7, 8]. By employing a spiral scan trajectory, we can image the entire internal surface (see Figure 2). This operationally simple procedure could potentially be performed by a nurse or technician, thus freeing up the urologist's time. (Figure 1: Scanning-fiber endoscope image probe.)


Archive | 2004

Catheterscope 3D guidance and interface system

Timothy D. Soper; Robb W. Glenny; Eric J. Seibel

Collaboration


Dive into Timothy D. Soper's collaborations.

Top Co-Authors

Eric J. Seibel, University of Washington
Robb W. Glenny, University of Washington
Matthew R. Burkhardt, California Institute of Technology
Chenying Yang, University of Washington
Woon Jong Yoon, University of Washington
Cameron M. Lee, University of Washington