Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Rahul Khare is active.

Publication


Featured research published by Rahul Khare.


IEEE Transactions on Medical Imaging | 2013

Interactive CT-Video Registration for the Continuous Guidance of Bronchoscopy

Scott A. Merritt; Rahul Khare; Rebecca Bascom; William E. Higgins

Bronchoscopy is a major step in lung cancer staging. To perform bronchoscopy, the physician uses a procedure plan, derived from a patient's 3D computed-tomography (CT) chest scan, to navigate the bronchoscope through the lung airways. Unfortunately, physicians vary greatly in their ability to perform bronchoscopy. As a result, image-guided bronchoscopy systems, drawing upon the concept of CT-based virtual bronchoscopy (VB), have been proposed. These systems attempt to register the bronchoscope's live position within the chest to a CT-based virtual chest space. Recent methods, which register the bronchoscopic video to CT-based endoluminal airway renderings, show promise but do not enable continuous real-time guidance. We present a CT-video registration method inspired by computer-vision innovations in the fields of image alignment and image-based rendering. In particular, motivated by the Lucas-Kanade algorithm, we propose an inverse-compositional framework built around a gradient-based optimization procedure. We next propose an implementation of the framework suitable for image-guided bronchoscopy. Laboratory tests, involving both single frames and continuous video sequences, demonstrate the robustness and accuracy of the method. Benchmark timing tests indicate that the method can run continuously at 300 frames/s, well beyond the real-time bronchoscopic video rate of 30 frames/s. This compares extremely favorably to the ≥1 s/frame speeds of other methods and indicates the method's potential for real-time continuous registration. A human phantom study confirms the method's efficacy for real-time guidance in a controlled setting and, hence, points the way toward the first interactive CT-video registration approach for image-guided bronchoscopy. Along this line, we demonstrate the method's efficacy in a complete guidance system by presenting a clinical study involving lung cancer patients.
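
The inverse-compositional formulation is what makes the reported frame rates plausible: the reference view's gradient and Hessian are computed once, so each video frame costs only one warp and a small linear solve. The sketch below is a minimal, translation-only illustration of that idea in Python/NumPy; it is not the authors' implementation, which works with rendered endoluminal views and a richer pose parameterization. The function name and the pure-translation warp are assumptions for illustration.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def inverse_compositional_translation(template, image, p0=(0.0, 0.0),
                                       n_iters=50, tol=1e-4):
    """Minimal inverse-compositional alignment for a pure-translation warp.

    The template gradient and Hessian are precomputed once, so each
    iteration needs only one image warp and a 2x2 solve -- the property
    that enables very high registration rates.
    """
    template = template.astype(float)
    gy, gx = np.gradient(template)                  # d/dy, d/dx of the template
    J = np.stack([gx.ravel(), gy.ravel()], axis=1)  # steepest-descent images
    H_inv = np.linalg.inv(J.T @ J)                  # fixed 2x2 Hessian

    h, w = template.shape
    ys, xs = np.mgrid[0:h, 0:w]
    p = np.array(p0, dtype=float)                   # p = (tx, ty)

    for _ in range(n_iters):
        # Warp the live frame back onto the template grid at the current p.
        warped = map_coordinates(image.astype(float),
                                 [ys + p[1], xs + p[0]], order=1)
        error = (warped - template).ravel()
        dp = H_inv @ (J.T @ error)
        p -= dp                                     # inverse-compositional update
        if np.linalg.norm(dp) < tol:
            break
    return p
```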


IEEE Transactions on Biomedical Engineering | 2014

Optimal Procedure Planning and Guidance System for Peripheral Bronchoscopy

Jason D. Gibbs; Michael W. Graham; Rebecca Bascom; Duane C. Cornish; Rahul Khare; William E. Higgins

With the development of multidetector computed-tomography (MDCT) scanners and ultrathin bronchoscopes, the use of bronchoscopy for diagnosing peripheral lung-cancer nodules is becoming a viable option. The workflow for assessing lung cancer consists of two phases: 1) 3-D MDCT analysis and 2) live bronchoscopy. Unfortunately, the yield rates for peripheral bronchoscopy have been reported to be as low as 14%, and bronchoscopy performance varies considerably between physicians. Recently proposed image-guided systems have shown promise for assisting with peripheral bronchoscopy. Yet, MDCT-based route planning to target sites has relied on tedious, error-prone techniques. In addition, route planning tends not to incorporate known anatomical, device, and procedural constraints that determine whether a route is feasible. Finally, existing systems do not effectively integrate MDCT-derived route information into the live guidance process. We propose a system that incorporates an automatic optimal route-planning method, which integrates known route constraints. Furthermore, our system offers a natural translation of the MDCT-based route plan into the live guidance strategy via MDCT/video data fusion. An image-based study demonstrates the route-planning method's functionality. Next, we present a prospective lung-cancer patient study in which our system achieved a successful navigation rate of 91% to target sites. Furthermore, when compared to a competing commercial system, our system enabled bronchoscopy over two airways deeper into the airway-tree periphery with a sample time that was nearly 2 min shorter on average. Finally, our system's ability to almost perfectly predict the depth of a bronchoscope's navigable route in advance represents a substantial benefit of optimal route planning.
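
A route planner of this kind must reject airway branches the device physically cannot traverse and then keep the feasible route that ends closest to the target. The snippet below is a hypothetical, simplified illustration of such constrained route selection over an airway tree; the branch attributes (`diameter_mm`, `bend_deg`, `dist_to_target_mm`) and the constraint thresholds are assumptions, not the paper's actual model.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Branch:
    """One airway branch; geometry values are per-branch summaries."""
    name: str
    diameter_mm: float              # minimum inner diameter along the branch
    bend_deg: float                 # bending angle required to enter the branch
    dist_to_target_mm: float        # distance from branch end to the ROI
    children: List["Branch"] = field(default_factory=list)

def plan_route(root: Branch, scope_diameter_mm: float = 2.8,
               max_bend_deg: float = 70.0) -> List[Branch]:
    """Return the feasible route whose endpoint lies closest to the target.

    Depth-first search that prunes branches violating device constraints
    (airway narrower than the scope, or bend sharper than the scope can
    make), then keeps the feasible path ending nearest the ROI.
    """
    best_path: List[Branch] = []
    best_dist = float("inf")

    def dfs(branch: Branch, path: List[Branch]) -> None:
        nonlocal best_path, best_dist
        if branch.diameter_mm < scope_diameter_mm or branch.bend_deg > max_bend_deg:
            return                                  # constraint violated: prune
        path = path + [branch]
        if branch.dist_to_target_mm < best_dist:
            best_dist = branch.dist_to_target_mm
            best_path = path
        for child in branch.children:
            dfs(child, path)

    dfs(root, [])
    return best_path
```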


IEEE Transactions on Biomedical Engineering | 2015

Hands-Free System for Bronchoscopy Planning and Guidance

Rahul Khare; Rebecca Bascom; William E. Higgins

Bronchoscopy is a commonly used minimally invasive procedure for lung-cancer staging. In standard practice, however, physicians differ greatly in their levels of performance. To address this concern, image-guided intervention (IGI) systems have been devised to improve procedure success. Current IGI bronchoscopy systems based on virtual bronchoscopic navigation (VBN), however, require involvement from the attending technician. This lessens physician control and hinders the overall acceptance of such systems. We propose a hands-free VBN system for planning and guiding bronchoscopy. The system introduces two major contributions. First, it incorporates a new procedure-planning method that automatically computes airway navigation plans conforming to the physician's bronchoscopy training and manual dexterity. Second, it incorporates a guidance strategy for bronchoscope navigation that enables user-friendly system control via a foot switch, coupled with a novel position-verification mechanism. Phantom studies verified that the system enables smooth operation under physician control, while also enabling faster navigation than an existing technician-assisted VBN system. In a clinical human study, we noted a 97% bronchoscopy navigation success rate, in line with existing VBN systems, and a mean guidance time of 52 s per diagnostic site. This guidance time is often nearly 3 min faster per diagnostic site than the times reported for other technician-assisted VBN systems. Finally, an ergonomic study further affirms the system's acceptability to the physician and its long-term potential.
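
One way to picture the hands-free control loop is as a small state machine stepped by foot-switch presses: the physician advances through the precomputed plan, and a verification step invokes registration to confirm position before moving on. The sketch below is a hypothetical simplification of that idea; the state names, the `verify_position` hook, and the single-pedal interface are illustrative assumptions, not the system's actual design.

```python
from enum import Enum, auto

class GuidanceState(Enum):
    SHOW_NEXT_MANEUVER = auto()   # display the planned maneuver at this bifurcation
    VERIFY_POSITION = auto()      # run registration to confirm the scope location
    AT_TARGET = auto()

class HandsFreeGuidance:
    """Toy controller: each foot-pedal press advances the guidance loop."""

    def __init__(self, planned_bifurcations, verify_position):
        self.plan = list(planned_bifurcations)   # precomputed navigation plan
        self.verify_position = verify_position   # callable -> True if on route
        self.index = 0
        self.state = GuidanceState.SHOW_NEXT_MANEUVER

    def on_pedal_press(self) -> GuidanceState:
        if self.state is GuidanceState.SHOW_NEXT_MANEUVER:
            # Physician has performed the displayed maneuver; check position next.
            self.state = GuidanceState.VERIFY_POSITION
        elif self.state is GuidanceState.VERIFY_POSITION:
            if self.verify_position():
                self.index += 1
                self.state = (GuidanceState.AT_TARGET
                              if self.index >= len(self.plan)
                              else GuidanceState.SHOW_NEXT_MANEUVER)
            # If verification fails, stay in VERIFY_POSITION so the physician
            # can adjust the scope and press the pedal again.
        return self.state
```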


Proceedings of SPIE | 2010

Toward image-based global registration for bronchoscopy guidance

Rahul Khare; William E. Higgins

Virtual image-based bronchoscopy guidance systems have been found to be useful for carrying out accurate and skill-independent bronchoscopies. A crucial step to the success of these systems during a live procedure is the local registration of the current real bronchoscope position to the virtual bronchoscope of the guidance system. The synchronization between the live and the virtual bronchoscope is generally lost during adverse events such as patient coughing, often severely disrupting guidance. Manual intervention by an assisting technician often helps in recovering from such a disruption, but this results in extra procedure time and some potential uncertainty in the locally registered position. To rectify this difficulty, we present for the first time a global registration algorithm that identifies the bronchoscope position without the need for significant bronchoscope maneuvers or technician intervention. The method involves a fast local registration search over all the branches in a global airway-bifurcation search space, with a weighted normalized sum-of-squares distance metric used for finding the best match. We have achieved a global registration accuracy near 90% in tests over a set of three different virtual bronchoscopic cases and with live guidance in an airway phantom. The method shows considerable potential for enabling global technician-independent guidance of bronchoscopy, without the need for any external device such as an electromagnetic sensor.
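
At its core, global registration scores one real video frame against many candidate virtual views and keeps the best match. The paper's exact weighted normalized sum-of-squares metric is not reproduced here; the function below is one plausible form (intensity-normalized SSD with a per-pixel weight mask), offered purely as an illustrative assumption.

```python
import numpy as np

def weighted_normalized_ssd(rb_frame, vb_view, weights=None):
    """One plausible weighted, intensity-normalized SSD between two views.

    Each view is z-normalized so overall brightness differences between the
    real camera image and the rendered view matter less; `weights` can
    down-weight unreliable pixels (e.g., specular highlights).
    """
    a = (rb_frame - rb_frame.mean()) / (rb_frame.std() + 1e-8)
    b = (vb_view - vb_view.mean()) / (vb_view.std() + 1e-8)
    if weights is None:
        weights = np.ones_like(a)
    return float(np.sum(weights * (a - b) ** 2) / np.sum(weights))

def best_bifurcation_match(rb_frame, candidate_vb_views):
    """Return the index of the candidate virtual view that best matches."""
    scores = [weighted_normalized_ssd(rb_frame, v) for v in candidate_vb_views]
    return int(np.argmin(scores))
```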


Proceedings of SPIE | 2012

Optimization of CT-video registration for image-guided bronchoscopy

Rahul Khare; William E. Higgins

Global registration has been shown to be a potential reality for a bronchoscopy guidance system. Global registration involves establishing the bronchoscope position by comparing a given real bronchoscopic (RB) video view to target virtual bronchoscopic (VB) views derived from a patient's three-dimensional (3D) multi-detector computed tomography (MDCT) chest scan. Registration performance depends significantly on the quality of the computer-generated VB views and the error metric used to compare the VB and RB views. In particular, the quality of the extracted endoluminal surfaces and the lighting model used during rendering are especially important in determining VB view quality. Registration performance is also affected by the positioning of the bronchoscope during acquisition of the RB frame and by the error metric used. We present a study considering the impact of these factors on global registration performance. Results show that using a direct-lighting-based model gives slightly better results than a global illumination model. However, the VB views generated by the global illumination model more closely resemble the RB views when using the weighted normalized sum-of-square error (WNSSE) metric. Results also show that the best global registration results are obtained by using a computer-generated bronchoscope-positioning target with a WNSSE metric and a direct-lighting model. We also identify the best airway surface-extraction method for global registration.
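
Direct-lighting models for endoluminal rendering are commonly approximated as a point light co-located with the virtual camera, with Lambertian reflection and inverse-square falloff. The snippet below is a minimal, assumed version of such a shading model for surface points; it is not the paper's renderer and omits everything a global illumination model would add.

```python
import numpy as np

def direct_lighting_shade(points, normals, camera_pos, albedo=1.0):
    """Shade surface points with a point light at the camera position.

    Lambertian term (cosine of the angle between the surface normal and the
    direction to the light) attenuated by inverse-square distance -- a common
    approximation of a bronchoscope's tip-mounted light source.
    """
    to_light = camera_pos - points                        # (N, 3) vectors
    dist = np.linalg.norm(to_light, axis=1, keepdims=True)
    to_light_unit = to_light / np.maximum(dist, 1e-8)
    n_unit = normals / np.maximum(
        np.linalg.norm(normals, axis=1, keepdims=True), 1e-8)
    cos_theta = np.clip(np.sum(n_unit * to_light_unit, axis=1), 0.0, None)
    return albedo * cos_theta / np.maximum(dist[:, 0] ** 2, 1e-8)
```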


Proceedings of SPIE | 2009

Improved Navigation for Image-Guided Bronchoscopy

Rahul Khare; Kun-Chang Yu; William E. Higgins

Past work has shown that guidance systems help improve both the navigation through airways and the final biopsy of regions of interest via bronchoscopy. We have previously proposed an image-based bronchoscopic guidance system. The system, however, has three issues that arise during navigation: 1) sudden disorienting changes can occur in endoluminal views; 2) limited feedback is provided during navigation; and 3) the system's graphical user interface (GUI) lacks a convenient interface for smooth navigation between bifurcations. To alleviate these issues, we present an improved navigation system. The improvements offer the following: 1) an enhanced visual presentation; 2) smooth navigation; 3) an interface for handling registration errors; and 4) improved bifurcation-point identification. The improved navigation system thus provides significant ergonomic and navigational advantages over the previous system.


Proceedings of SPIE | 2013

Technician-free system for image-guided bronchoscopy

Rahul Khare; Rebecca Bascom; William E. Higgins

Previous studies have shown that guidance systems improve accuracy and reduce skill variation among physicians during bronchoscopy. However, most of these systems suffer from one or more of the following limitations: 1) an attending technician must carefully keep the system position synchronized with the bronchoscope position during the procedure; 2) extra bronchoscope tracking hardware may be required; 3) guidance cannot take place in real time; 4) the guidance system is unable to detect and correct faulty bronchoscope maneuvers; and 5) a resynchronization procedure must be followed after adverse events such as patient cough or dynamic airway collapse. Here, we propose an image-based system for technician-free bronchoscopy guidance that relies on two features. First, our system precomputes a guidance plan that suggests natural bronchoscope maneuvers at every bifurcation leading toward a region of interest (ROI). Second, our system enables bronchoscope position verification that relies on a global-registration algorithm to establish the global bronchoscope position and, thus, provide the physician with updated navigational information during bronchoscopy. The system can handle general navigation to an ROI, as well as adverse events, and is directly controlled by the physician via a foot pedal. Guided bronchoscopy results using airway-tree phantoms and human cases demonstrate the efficacy of the system.


Proceedings of SPIE | 2011

Image-based global registration system for bronchoscopy guidance

Rahul Khare; William E. Higgins

Previous studies have shown that bronchoscopy guidance systems improve accuracy and reduce skill variation among physicians during bronchoscopy. In the past, we presented an image-based bronchoscopy guidance system that has been extensively validated in live bronchoscopic procedures. However, this system cannot actively recover from adverse events, such as patient coughing or dynamic airway collapse. After such events, the bronchoscope position is recovered only by moving back to a previously seen and easily identifiable bifurcation such as the main carina. Furthermore, the system requires an attending technician to closely follow the physician's movement of the bronchoscope to avoid misguidance. Also, when the physician is forced to advance the bronchoscope across multiple bifurcations, the system is not able to detect faulty maneuvers. We propose two system-level solutions. The first solution is a system-level guidance strategy that incorporates a global-registration algorithm to provide the physician with updated navigational and guidance information during bronchoscopy. The system can handle general navigation to a region of interest (ROI), as well as adverse events, and it requires minimal commands so that it can be directly controlled by the physician. The second solution visualizes the global picture of all the bifurcations and their relative orientations in advance and suggests the maneuvers needed by the bronchoscope to approach the ROI. Guided bronchoscopy results using human airway-tree phantoms demonstrate the potential of the two solutions.


Archive | 2011

Image-Based Global Registration System and Method Applicable to Bronchoscopy Guidance

William E. Higgins; Rahul Khare; Scott A. Merritt


Archive | 2012

Global and semi-global registration for image-based bronchoscopy guidance

William E. Higgins; Rahul Khare

Collaboration


Dive into Rahul Khare's collaborations.

Top Co-Authors

William E. Higgins (Pennsylvania State University)
Rebecca Bascom (Pennsylvania State University)
Duane C. Cornish (Pennsylvania State University)
Jason D. Gibbs (Pennsylvania State University)
Michael W. Graham (Pennsylvania State University)
Kun-Chang Yu (Pennsylvania State University)
Scott A. Merritt (Pennsylvania State University)
Mohammed Yeasin (State University of New York Polytechnic Institute)
Pinyo Taeprasartsit (Pennsylvania State University)
Rajeev Sharma (Pennsylvania State University)