Lixin Gong
University of Washington
Publication
Featured research published by Lixin Gong.
IEEE Transactions on Medical Imaging | 2004
Lixin Gong; Sayan D. Pathak; David R. Haynor; Paul S. Cho; Yongmin Kim
Automatic prostate segmentation in ultrasound images is a challenging task due to speckle noise, missing boundary segments, and complex prostate anatomy. One popular approach has been the use of deformable models. For such techniques, prior knowledge of the prostate shape plays an important role in automating model initialization and constraining model evolution. In this paper, we have modeled the prostate shape using deformable superellipses. This model was fitted to 594 manual prostate contours outlined by five experts. We found that the superellipse with simple parametric deformations can efficiently model the prostate shape, with a Hausdorff distance error (model versus manual outline) of 1.32 ± 0.62 mm and a mean absolute distance error of 0.54 ± 0.20 mm. The variability between the manual outlinings and their corresponding fitted deformable superellipses was significantly less than the variability between human experts (p < 0.0001). Based on this deformable superellipse model, we have developed an efficient and robust Bayesian segmentation algorithm. This algorithm was applied to 125 prostate ultrasound images collected from 16 patients. The mean error between the computer-generated boundaries and the manual outlinings was 1.36 ± 0.58 mm, which is significantly less than the manual interobserver distances. The algorithm was also shown to be fairly insensitive to the choice of the initial curve.
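As a rough, illustrative sketch of the superellipse idea (not the paper's exact model, which adds rotation and parametric deformations such as tapering), the following Python snippet fits a plain superellipse |x/a|^ε + |y/b|^ε = 1 to 2-D contour points by nonlinear least squares; the contour data and initial guess are hypothetical.

```python
# Minimal sketch: fit a plain superellipse to a 2-D contour (synthetic data).
# The paper's model adds translation, rotation, and parametric deformations
# (e.g., tapering), which are omitted here for brevity.
import numpy as np
from scipy.optimize import least_squares

def superellipse_residuals(params, pts):
    """Implicit-equation residuals |x/a|^eps + |y/b|^eps - 1 for each point."""
    cx, cy, a, b, eps = params
    x, y = pts[:, 0] - cx, pts[:, 1] - cy
    return np.abs(x / a) ** eps + np.abs(y / b) ** eps - 1.0

# Hypothetical contour: a noisy superellipse with a = 25 mm, b = 20 mm, eps = 2.5
rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 200, endpoint=False)
a_true, b_true, eps_true = 25.0, 20.0, 2.5
x = a_true * np.sign(np.cos(t)) * np.abs(np.cos(t)) ** (2.0 / eps_true)
y = b_true * np.sign(np.sin(t)) * np.abs(np.sin(t)) ** (2.0 / eps_true)
pts = np.column_stack([x, y]) + rng.normal(scale=0.5, size=(t.size, 2))

# Initial guess: centroid, rough radii, and a squareness exponent of 2 (ellipse).
x0 = [pts[:, 0].mean(), pts[:, 1].mean(),
      np.ptp(pts[:, 0]) / 2, np.ptp(pts[:, 1]) / 2, 2.0]
fit = least_squares(superellipse_residuals, x0, args=(pts,),
                    bounds=([-np.inf, -np.inf, 1.0, 1.0, 0.2],
                            [np.inf, np.inf, np.inf, np.inf, 10.0]))
print("fitted (cx, cy, a, b, eps):", np.round(fit.x, 2))
```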
International Journal of Radiation Oncology Biology Physics | 2002
Lixin Gong; Paul S. Cho; Ben H. Han; Kent E. Wallner; Steve G. Sutlief; Sayan D. Pathak; David R. Haynor; Yongmin Kim
PURPOSE To investigate the feasibility of performing postimplant and intraoperative dosimetry for prostate brachytherapy by fusing transrectal ultrasound (TRUS) and fluoroscopic data. METHODS AND MATERIALS Registration of ultrasound (prostate boundary) and fluoroscopic (seed) data requires spatial markers that are detectable by both imaging modalities. In this study, the needle tips were considered as such fiducials. Prostate phantoms were implanted with the seeds, and four localization needles were inserted. In the TRUS frame of reference, the longitudinal coordinate of the needle tip was determined by advancing the needle until the echo from its tip just registered at a known probe depth. The tip's transverse coordinates were determined from the associated TRUS slice. The three-dimensional needle tip positions were also calculated in the fluoroscopic coordinate system using a seed reconstruction method. The transformation between the TRUS and fluoroscopy coordinate systems was established by the least-squares solution using the singular value decomposition. RESULTS With three of the four needle tips as fiducials and the one remaining needle as a test target, the mean fiducial registration error was 0.8 mm and the test target registration error was 2.5 mm. When all four points were used for registration, the errors decreased to 1.1 mm. A comparison between the proposed method and CT-based dosimetry yielded values of the percentage of prostate volume receiving 100% and 150% of the prescribed minimal peripheral dose and of the minimal dose received by 90% of the prostate gland that agreed within 0.4%, 2.7%, and 4.2%, respectively. CONCLUSION The combination of TRUS and fluoroscopy is a feasible alternative to the currently used CT-based postimplant dosimetry. Furthermore, because of online imaging capability, the method lends itself to real-time intraoperative applications.
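The least-squares transformation mentioned above is the classical SVD-based rigid point registration; a minimal sketch follows, with hypothetical needle-tip coordinates standing in for the TRUS and fluoroscopy fiducials.

```python
# Sketch of standard least-squares rigid registration via SVD (Arun/Kabsch
# style), applied to corresponding fiducial points such as needle-tip
# positions in the TRUS and fluoroscopy coordinate frames.
import numpy as np

def rigid_register(src, dst):
    """Return R (3x3) and t (3,) minimizing sum ||R @ src_i + t - dst_i||^2."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)                    # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # no reflection
    R = Vt.T @ D @ U.T
    t = dst_c - R @ src_c
    return R, t

# Hypothetical fiducials (mm): needle-tip coordinates in each frame.
trus_pts = np.array([[10., 20., 5.], [40., 22., 6.], [25., 50., 7.], [30., 35., 30.]])
true_R = np.array([[0., -1., 0.], [1., 0., 0.], [0., 0., 1.]])
fluoro_pts = trus_pts @ true_R.T + np.array([5., -3., 12.])

R, t = rigid_register(trus_pts, fluoro_pts)
fre = np.linalg.norm(trus_pts @ R.T + t - fluoro_pts, axis=1).mean()
print("mean fiducial registration error (mm):", round(fre, 6))
```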
IEEE Transactions on Medical Imaging | 2006
Ismail B. Tutar; Sayan D. Pathak; Lixin Gong; Paul S. Cho; Kent E. Wallner; Yongmin Kim
The prostate brachytherapy quality assessment procedure should be performed while the patient is still on the operating table, since this would enable physicians to implant additional seeds immediately into the prostate if necessary, thus reducing costs and improving patient outcomes. The seed placement procedure is readily performed under fluoroscopy and ultrasound guidance. Therefore, it has been proposed that seed locations be reconstructed from fluoroscopic images and prostate boundaries be identified in ultrasound images to perform dosimetry in the operating room. However, there is a key hurdle that needs to be overcome to perform ultrasound- and fluoroscopy-based dosimetry: it is highly time-consuming for physicians to outline prostate boundaries in ultrasound images manually, and there is no method that enables physicians to identify three-dimensional (3-D) prostate boundaries in postimplant ultrasound images in a fast and robust fashion. In this paper, we propose a new method where the segmentation is defined in an optimization framework as fitting the best surface to the underlying images under shape constraints. To derive these constraints, we modeled the shape of the prostate using spherical harmonics of degree eight and performed statistical analysis on the shape parameters. After user initialization, our algorithm identifies the prostate boundaries on average in 2 min. For algorithm validation, we collected 30 postimplant prostate volume sets, each consisting of axial transrectal ultrasound images acquired at 1-mm increments. For each volume set, three experts outlined the prostate boundaries first manually and then using our algorithm. By treating the average of the manual boundaries as the ground truth, we computed the segmentation error. The overall mean absolute distance error was 1.26 ± 0.41 mm, while the percent volume overlap was 83.5 ± 4.2%. We found the segmentation error to be slightly less than the clinically observed interobserver variability.
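A minimal sketch of a degree-8 spherical-harmonic shape representation is given below; the radial parameterization, real-harmonic basis, and sample points are illustrative assumptions rather than the paper's exact formulation.

```python
# Sketch: represent a star-shaped (radial) surface r(theta, phi) with real
# spherical harmonics up to degree 8 and fit the coefficients by linear
# least squares. Sample points and the radial parameterization are synthetic.
import numpy as np
from scipy.special import sph_harm

DEGREE = 8

def real_sh_basis(theta, phi, degree=DEGREE):
    """Real spherical-harmonic design matrix; theta=azimuth, phi=polar angle."""
    cols = []
    for n in range(degree + 1):
        for m in range(-n, n + 1):
            Y = sph_harm(abs(m), n, theta, phi)
            if m < 0:
                cols.append(np.sqrt(2) * (-1) ** m * Y.imag)
            elif m == 0:
                cols.append(Y.real)
            else:
                cols.append(np.sqrt(2) * (-1) ** m * Y.real)
    return np.column_stack(cols)

# Hypothetical surface samples: a slightly squashed sphere (radii in mm).
rng = np.random.default_rng(0)
theta = rng.uniform(0.0, 2 * np.pi, 2000)
phi = np.arccos(rng.uniform(-1.0, 1.0, 2000))
r = 22.0 - 4.0 * np.cos(phi) ** 2 + rng.normal(scale=0.3, size=2000)

A = real_sh_basis(theta, phi)                   # (2000, 81) for degree 8
coeffs, *_ = np.linalg.lstsq(A, r, rcond=None)  # shape coefficients
rms = np.sqrt(np.mean((r - A @ coeffs) ** 2))
print("coefficients:", coeffs.size, " rms fit error (mm):", round(rms, 3))
```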
Medical Physics | 2008
Ismail B. Tutar; Lixin Gong; Sreeram Narayanan; Sayan D. Pathak; Paul S. Cho; Kent E. Wallner; Yongmin Kim
Prostate brachytherapy is an effective treatment option for early-stage prostate cancer. During a prostate brachytherapy procedure, transrectal ultrasound (TRUS) and fluoroscopy imaging modalities complement each other by providing good visualization of soft tissue and implanted seeds, respectively. Therefore, the registration of these two imaging modalities, which are readily available in the operating room, could facilitate intraoperative dosimetry, thus enabling physicians to implant additional seeds into the underdosed portions of the prostate while the patient is still on the operating table. It is desirable to register TRUS and fluoroscopy images by using the seeds as fiducial markers. Although the locations of all the implanted seeds can be reconstructed from three fluoroscopy images, only a fraction of these seeds can be located in TRUS images. It is challenging to register the TRUS and fluoroscopy images by using the identified seeds, since the correspondence between them is unknown. Furthermore, misdetection of nonseed structures as seeds can lead to the inclusion of spurious points in the data set. We developed a new method called iterative optimal assignment (IOA) to overcome these challenges in TRUS-fluoroscopy registration. By using the Hungarian method in an optimization framework, IOA computes a set of transformation parameters that yield the one-to-one correspondence with minimum cost. We have evaluated our registration method at varying noise levels, seed detection rates, and number of spurious points using data collected from 25 patients. We have found that IOA can perform registration with an average root mean square error of about 0.2 cm even when the seed detection rate is only 10%. We believe that IOA can offer a robust solution to seed-based TRUS-fluoroscopy registration, thus making intraoperative dosimetry possible.
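The correspondence step described above can be illustrated with a toy sketch: for each candidate transform, the Hungarian method matches the few TRUS-detected seeds to the full fluoroscopy-reconstructed set at minimum cost. The grid search over translations below is a simple stand-in for the paper's iterative optimization, and all seed coordinates are synthetic.

```python
# Sketch of the correspondence step in an IOA-like approach: given a candidate
# transform, the Hungarian method finds the minimum-cost one-to-one matching
# between the few seeds visible in TRUS and the full fluoroscopy set.
import numpy as np
from scipy.optimize import linear_sum_assignment
from scipy.spatial.distance import cdist

def matching_cost(trus_seeds, fluoro_seeds, translation):
    """Mean matched distance after applying a candidate translation."""
    cost = cdist(trus_seeds + translation, fluoro_seeds)   # rectangular cost matrix
    rows, cols = linear_sum_assignment(cost)               # Hungarian method
    return cost[rows, cols].mean()

# Hypothetical data (cm): 60 reconstructed seeds, 8 of them visible in TRUS.
rng = np.random.default_rng(1)
fluoro_seeds = rng.uniform(0.0, 4.0, size=(60, 3))
true_shift = np.array([0.6, -0.4, 0.2])
trus_seeds = fluoro_seeds[rng.choice(60, 8, replace=False)] - true_shift
trus_seeds += rng.normal(scale=0.05, size=trus_seeds.shape)  # detection noise

# Toy parameter search: pick the translation whose optimal matching is cheapest.
grid = np.linspace(-1.0, 1.0, 21)
candidates = [np.array([dx, dy, dz]) for dx in grid for dy in grid for dz in grid]
best = min(candidates, key=lambda t: matching_cost(trus_seeds, fluoro_seeds, t))
print("recovered translation (cm):", np.round(best, 2))
```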
Medical Imaging 2005: Image Processing | 2005
Lixin Gong; Lydia Ng; Sayan D. Pathak; Ismail B. Tutar; Paul S. Cho; David R. Haynor; Yongmin Kim
Prostate segmentation in ultrasound images is a clinically important and technically challenging task. Despite several research attempts, few effective methods are available. One problem is the limited algorithmic robustness to common artifacts in clinical data sets. To improve the robustness, we have developed a hybrid level set method, which incorporates shape constraints into a region-based curve evolution process. The online segmentation method alternates between two steps, namely, shape model estimation (ME) and curve evolution (CE). The prior shape information is encoded in an implicit parametric model derived offline from manually outlined training data. Utilizing this prior shape information, the ME step tries to compute the maximum a posteriori estimate of the model parameters. The estimated shape is then used to guide the CE step, which in turn provides a new model initialization for the ME step. The process stops automatically when the curve locks onto the specific prostate shape. The ME and the CE steps complement each other to capture both global and local shape details. With shape guidance, this algorithm is less sensitive to initial contour placement and more robust even in the presence of large boundary gaps and strong clutter. Promising results are demonstrated on both synthetic and real prostate ultrasound images.
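A much-simplified sketch of the alternation idea follows: here the "model estimation" step fits a plain ellipse via image moments (in place of the paper's learned parametric shape prior), and the "curve evolution" step is a crude Chan-Vese-style region update on a synthetic image.

```python
# Much-simplified sketch of the ME/CE alternation on a synthetic image.
import numpy as np
from scipy import ndimage as ndi

def estimate_shape(mask):
    """ME step stand-in: fit an ellipse to the mask via second-order moments."""
    ys, xs = np.nonzero(mask)
    cy, cx = ys.mean(), xs.mean()
    cov = np.cov(np.vstack([ys - cy, xs - cx]))
    return cy, cx, cov

def shape_mask(shape, cy, cx, cov, scale=2.0):
    """Binary mask of the fitted ellipse (Mahalanobis distance <= scale)."""
    yy, xx = np.mgrid[:shape[0], :shape[1]]
    d = np.vstack([yy.ravel() - cy, xx.ravel() - cx])
    m2 = np.einsum('ij,ji->i', d.T @ np.linalg.inv(cov), d)
    return (m2 <= scale ** 2).reshape(shape)

def evolve_curve(img, mask, prior, iters=5):
    """CE step stand-in: Chan-Vese-style update, kept near the shape prior."""
    for _ in range(iters):
        c_in, c_out = img[mask].mean(), img[~mask].mean()
        better_in = (img - c_in) ** 2 < (img - c_out) ** 2
        mask = ndi.binary_opening(better_in, iterations=1)
        mask &= ndi.binary_dilation(prior, iterations=5)   # shape constraint
    return mask

# Synthetic "ultrasound" image: bright elliptical object, heavy noise, a gap.
rng = np.random.default_rng(2)
yy, xx = np.mgrid[:128, :128]
obj = ((yy - 64) / 30.0) ** 2 + ((xx - 64) / 45.0) ** 2 <= 1.0
img = 0.6 * obj + rng.normal(scale=0.25, size=obj.shape)
img[60:68, 100:128] = rng.normal(scale=0.25, size=(8, 28))   # missing boundary

mask = (yy - 64) ** 2 + (xx - 64) ** 2 <= 20 ** 2            # rough initial curve
for _ in range(12):                                          # alternate ME and CE
    cy, cx, cov = estimate_shape(mask)                       # ME step
    prior = shape_mask(img.shape, cy, cx, cov)
    mask = evolve_curve(img, mask, prior)                    # CE step
print("segmented area vs. true area:", mask.sum(), obj.sum())
```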
Computerized Medical Imaging and Graphics | 2006
Lixin Gong; Sayan D. Pathak; Adam M. Alessio; Paul E. Kinahan
Positron emission tomography (PET) imaging is rapidly expanding its role in clinical practice for cancer management. The high sensitivity of PET for functional abnormalities associated with cancer can be confounded by the minimal anatomical information it provides for cancer localization. Computed tomography (CT) provides detailed anatomical information but is less sensitive to pathologies than PET. Thus, combining (i.e., registering) PET and CT images would enable both accurate and sensitive cancer localization with respect to detailed patient anatomy. An additional application area of registration is to align CT-CT scans from serial studies on a patient on a PET/CT scanner to facilitate accurate assessment of therapeutic response from the co-aligned PET images. To facilitate image fusion, we are developing a deformable registration software system using mutual information and a B-spline model of the deformation. When applying deformable registration to whole body images, one of the obstacles is that the arms are present in PET images but not in CT images or are in different positions in serial CT images. This feature mismatch requires a preprocessing step to remove the arms where present and thus adds a manual step in an otherwise automatic algorithm. In this paper, we present a simple yet effective method for automatic arm removal. We demonstrate the efficiency and robustness of this algorithm on both clinical PET and CT images. By streamlining the entire registration process, we expect that the fusion technology will soon find its way into clinics, greatly benefiting cancer diagnosis, staging, therapy planning and treatment monitoring.
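The paper's specific arm-removal method is not reproduced here; the sketch below shows one plausible, simplified realization of the same preprocessing step: threshold the CT to a body mask and keep only the largest connected component on each axial slice, discarding the smaller components produced by the arms.

```python
# One plausible, simplified arm-removal preprocessing step (not necessarily
# the method described in the paper).
import numpy as np
from scipy import ndimage as ndi

def remove_arms(ct_volume, air_threshold_hu=-300):
    """Return a torso-only mask for an axial CT volume of shape (z, y, x)."""
    body = ct_volume > air_threshold_hu
    torso = np.zeros_like(body)
    for z in range(body.shape[0]):
        labels, n = ndi.label(body[z])
        if n == 0:
            continue
        sizes = ndi.sum(body[z], labels, index=range(1, n + 1))
        torso[z] = labels == (np.argmax(sizes) + 1)      # largest 2-D component
        torso[z] = ndi.binary_fill_holes(torso[z])       # keep lungs/bowel gas
    return torso

# Hypothetical usage on a CT array in Hounsfield units (loading elided):
# ct = ...                                   # (z, y, x) array, assumed
# mask = remove_arms(ct)
# ct_no_arms = np.where(mask, ct, -1000)     # set everything else to air
```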
Medical Imaging 2005: Visualization, Image-Guided Procedures, and Display | 2005
Christopher Lau; Sayan D. Pathak; Lixin Gong; Paul E. Kinahan; Phillip M. Cheng; Lydia Ng
Cancer management using positron emission tomography (PET) imaging is rapidly expanding its role in clinical practice. The high sensitivity of PET to locate cancer can be confounded by the minimal anatomical information it provides. Additional anatomical information would greatly benefit diagnosis, staging, therapy planning and treatment monitoring. Computed tomography (CT) provides detailed anatomical information but is less sensitive towards cancer localization than PET. Combining PET and CT images would enable accurate localization of the functional information with respect to detailed patient anatomy. We have developed a software platform to facilitate efficient visualization of PET/CT image studies. We used a deformable registration algorithm using mutual information and a B-spline model of the deformation. Several useful visualization modes were implemented with an efficient and robust method for switching between modes and handling large datasets. Processing of several studies can be queued and the results browsed. The software has been validated with clinical data.
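As an illustration of the general recipe (a mutual-information metric driving a B-spline deformation), the sketch below uses SimpleITK; this library choice and the file names are assumptions, not the software platform described in the paper.

```python
# SimpleITK is an assumption here; the sketch only shows the generic
# mutual-information + B-spline registration recipe named in the abstract.
import SimpleITK as sitk

def deformable_register(fixed, moving, mesh_size=(8, 8, 8)):
    """Mutual-information B-spline registration; returns the fitted transform."""
    tx = sitk.BSplineTransformInitializer(fixed, list(mesh_size))

    reg = sitk.ImageRegistrationMethod()
    reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
    reg.SetMetricSamplingStrategy(reg.RANDOM)
    reg.SetMetricSamplingPercentage(0.1)
    reg.SetInterpolator(sitk.sitkLinear)
    reg.SetOptimizerAsLBFGSB(gradientConvergenceTolerance=1e-5,
                             numberOfIterations=100)
    reg.SetInitialTransform(tx, inPlace=True)
    return reg.Execute(fixed, moving)

# Hypothetical usage on a CT/PET pair (file names are placeholders):
# fixed = sitk.ReadImage("ct.nii.gz", sitk.sitkFloat32)
# moving = sitk.ReadImage("pet.nii.gz", sitk.sitkFloat32)
# tx = deformable_register(fixed, moving)
# fused = sitk.Resample(moving, fixed, tx, sitk.sitkLinear, 0.0)
```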
Medical Imaging 2003: Visualization, Image-Guided Procedures, and Display | 2003
Lixin Gong; Sayan D. Pathak; David R. Haynor; Paul S. Cho; Yongmin Kim
Automated prostate segmentation in ultrasound images is a challenging task due to speckle noise, missing edge segments, and complex prostate peripheral anatomy. In this paper, a Bayesian prostate segmentation algorithm is presented. It combines both prior shape and image information for robust segmentation. In this study, the prostate shape was efficiently modeled using a deformable superellipse. A flexible graphical user interface has been developed to facilitate the validation of our algorithm in a clinical setting. This algorithm was applied to 66 ultrasound images collected from 8 patients. The resulting mean error between the computer-generated boundaries and the manually outlined boundaries was 1.39 ± 0.60 mm, which is significantly less than the variability between human experts.
Medical Imaging 2001: Visualization, Display, and Image-Guided Procedures | 2001
Lixin Gong; Sayan D. Pathak; Yongmin Kim
Ultrasound image segmentation is challenging due to speckle, depth-dependent signal attenuation, low signal-to-noise ratio, and direction-dependent edge contrast. In addition, transrectal ultrasound (TRUS) prostate images are often corrupted by acoustic shadowing caused by calcifications, bowel gas, protein deposit artifacts, etc., making segmentation difficult. In such cases, traditional edge detection algorithms without adequate preprocessing have limited success. The original sticks algorithm reduces speckle while enhancing contrast. It assumes that, in a pixel neighborhood, reflectors of different orientations with respect to the incident ultrasound beam are equally likely, which is not the case in practice. Even though some variations of the original sticks algorithm estimate prior probabilities from the image or from the imaging process, no high-level information about the geometry of the object of interest is utilized. As a result, both non-prostate structures and the true boundaries are equally enhanced. This paper presents an extension to the original sticks algorithm, which incorporates high-level knowledge of the prostate shape to selectively enhance the prostate edge contrast while suppressing non-prostate structures. Results show that this extension preserves the prostate boundaries while providing superior noise reduction, especially in the interior prostate region, which can lead to more accurate segmentation of the prostate.
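A sketch of the basic, unweighted sticks filter is given below: thin line kernels at several orientations are convolved with the image, and the maximum response is kept at each pixel. The paper's extension, which additionally weights stick orientations using prior knowledge of the prostate shape, is not reproduced.

```python
# Basic (unweighted) sticks filter: convolve with line kernels at several
# orientations and keep the maximum response per pixel.
import numpy as np
from scipy import ndimage as ndi

def stick_kernel(length, angle_rad):
    """Binary line kernel of the given length and orientation, normalized."""
    k = np.zeros((length, length))
    c = (length - 1) / 2.0
    for s in np.linspace(-c, c, 4 * length):          # densely sample the line
        r = int(round(c + s * np.sin(angle_rad)))
        q = int(round(c + s * np.cos(angle_rad)))
        k[r, q] = 1.0
    return k / k.sum()

def sticks_filter(img, length=15, n_angles=16):
    """Maximum of the mean intensity along sticks of all orientations."""
    responses = [ndi.convolve(img, stick_kernel(length, a), mode='nearest')
                 for a in np.linspace(0.0, np.pi, n_angles, endpoint=False)]
    return np.max(responses, axis=0)

# Hypothetical usage on a speckled ultrasound frame (array assumed):
# enhanced = sticks_filter(ultrasound_image.astype(float))
```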
Medical Physics | 2007
Paul S. Cho; Ismail B. Tutar; Sreeram Narayanan; S Lam; Lixin Gong; Sayan D. Pathak; Yongmin Kim; Kent E. Wallner
Purpose: (1) To develop an intra-operative dose evaluation and optimization system for prostate brachytherapy using complementary imaging modalities (fluoroscopy for seed localization and ultrasound for prostate delineation). (2) To evaluate the usefulness of the intra-operative dosimetry system. Method and Materials: Several algorithms were developed to realize the concept of fluoroscopy-ultrasound based dosimetry. These include automated seed detection, 3D seed reconstruction, patient motion correction, computer-assisted prostate contouring, and seeds-prostate registration algorithms. In order to accelerate the seed reconstruction from multiple x-ray projections, the dimensionality of the seed matching process was reduced by strategically forming search restriction sub-images. Problems of incomplete data due to clustering and superposition of projected seeds were solved by a pseudo-matching technique. Patient movements between fluoroscopic image acquisitions were corrected through analysis of epipolar planes. Registration of seeds and prostate volume was accomplished by establishing a sparse (small subset of seeds detected on ultrasound) to full (seeds reconstructed from fluoroscopy) data set correspondence. Results: Intra-operative dosimetry was performed on 25 patients implanted with Pd-103 seeds. In nine patients, no additional seeds were implanted after intra-operative dose evaluation. In 16 patients, remedial seeds were added based on the intra-operative evaluation. Subsequently, V100 values improved from 86±8% to 93±4% (p=0.005). For all 25 patients, the post-implant dosimetry results using our system were compared with the Day 0 CT study. The V100 and D90 values from the fluoroscopy-ultrasound based dosimetry were 95±4% and 120±24%, while the CT-based dosimetry yielded 95±4% and 122±24%, respectively. Conclusion: We have developed and implemented an intra-operative dosimetry system that combines the strengths of fluoroscopy and ultrasound. A clinical study has successfully demonstrated its intended utility in the intra-operative setting by significantly improving the quality of seed implants in prostate brachytherapy.
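For reference, the V100 and D90 figures quoted above can be computed from a per-voxel dose grid and a prostate mask as in the minimal sketch below; the dose array, mask, and prescription value are hypothetical.

```python
# Minimal sketch of the quoted dose-volume metrics (V100 and D90), computed
# from a per-voxel dose array and a prostate mask; all inputs are synthetic.
import numpy as np

def v100_d90(dose, prostate_mask, prescribed_dose):
    """V100: % of prostate volume receiving >= 100% of the prescription.
       D90: dose covering 90% of the prostate, as % of the prescription."""
    d = dose[prostate_mask]
    v100 = 100.0 * np.mean(d >= prescribed_dose)
    d90 = 100.0 * np.percentile(d, 10) / prescribed_dose   # 90% coverage dose
    return v100, d90

# Hypothetical example: 125 Gy prescription, synthetic dose distribution.
rng = np.random.default_rng(3)
dose = rng.gamma(shape=8.0, scale=20.0, size=(64, 64, 32))
mask = np.zeros(dose.shape, dtype=bool)
mask[16:48, 16:48, 8:24] = True
print("V100 = %.1f%%, D90 = %.1f%%" % v100_d90(dose, mask, 125.0))
```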