
Publication


Featured research published by Dongshan Fu.


Operative Neurosurgery | 2007

A Study of the Accuracy of CyberKnife Spinal Radiosurgery Using Skeletal Structure Tracking

Anthony Ho; Dongshan Fu; Cristian Cotrutz; Steven L. Hancock; Steven D. Chang; Iris C. Gibbs; Calvin R. Maurer; John R. Adler

OBJECTIVE New technology has enabled the increasing use of radiosurgery to ablate spinal lesions. The first generation of the CyberKnife (Accuray, Inc., Sunnyvale, CA) image-guided radiosurgery system required implanted radiopaque markers (fiducials) to localize spinal targets. A recently developed and now commercially available spine tracking technology called Xsight (Accuray, Inc.) tracks skeletal structures and eliminates the need for implanted fiducials. The Xsight system localizes spinal targets by direct reference to the adjacent vertebral elements. This study sought to measure the accuracy of Xsight spine tracking and provide a qualitative assessment of overall system performance. METHODS Total system error, which is defined as the distance between the centroids of the planned and delivered dose distributions and represents all possible treatment planning and delivery errors, was measured using a realistic, anthropomorphic head-and-neck phantom. The Xsight tracking system error component of total system error was also computed by retrospectively analyzing image data obtained from eleven patients with a total of 44 implanted fiducials who underwent CyberKnife spinal radiosurgery. RESULTS The total system error of the Xsight targeting technology was measured to be 0.61 mm. The tracking system error component was found to be 0.49 mm. CONCLUSION The Xsight spine tracking system is practically important because it is accurate and eliminates the use of implanted fiducials. Experience has shown this technology to be robust under a wide range of clinical circumstances.
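
Total system error here is the distance between the centroids of the planned and delivered dose distributions. As an illustration of that definition only, not of Accuray's measurement procedure, the sketch below computes it from two 3D dose arrays assumed to lie on a common voxel grid; the function names and the spacing argument are hypothetical.

import numpy as np

def dose_centroid(dose, spacing):
    # Dose-weighted centroid (in mm) of a 3D dose array with voxel spacing in mm.
    idx = np.indices(dose.shape).reshape(3, -1).T          # (N, 3) voxel indices
    w = dose.ravel().astype(np.float64)
    return (idx * w[:, None]).sum(axis=0) / w.sum() * np.asarray(spacing, dtype=np.float64)

def total_system_error(planned, delivered, spacing=(1.0, 1.0, 1.0)):
    # Euclidean distance (mm) between the planned and delivered dose centroids.
    return float(np.linalg.norm(dose_centroid(planned, spacing) - dose_centroid(delivered, spacing)))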


Medical Physics | 2008

A fast, accurate, and automatic 2D-3D image registration for image-guided cranial radiosurgery.

Dongshan Fu; Gopinath Kuduvalli

The authors developed a fast and accurate two-dimensional (2D)-three-dimensional (3D) image registration method to perform precise initial patient setup and frequent detection and correction of patient movement during image-guided cranial radiosurgery treatment. In this method, an approximate geometric relationship is first established to decompose a 3D rigid transformation in the 3D patient coordinate system into in-plane transformations and out-of-plane rotations in two orthogonal 2D projections. Digitally reconstructed radiographs are generated offline from a preoperative computed tomography volume prior to treatment and used as the reference for patient position. A multiphase framework is designed to register the digitally reconstructed radiographs with the x-ray images periodically acquired during patient setup and treatment. The registration in each projection is performed independently; the results in the two projections are then combined and converted to a 3D rigid transformation by 2D-3D geometric backprojection. The in-plane transformation and the out-of-plane rotation are estimated using different search methods, including multiresolution matching, steepest descent minimization, and one-dimensional search. Two similarity measures, optimized pattern intensity and sum of squared difference, are applied at different registration phases to optimize accuracy and computation speed. Various experiments on an anthropomorphic head-and-neck phantom showed that, using fiducial registration as a gold standard, the registration errors were 0.33 +/- 0.16 mm (s.d.) in overall translation and 0.29 +/- 0.11 degrees (s.d.) in overall rotation. The total targeting errors were 0.34 +/- 0.16 mm (s.d.), 0.40 +/- 0.2 mm (s.d.), and 0.51 +/- 0.26 mm (s.d.) for targets at distances of 2, 6, and 10 cm from the rotation center, respectively. The computation time was less than 3 s on a computer with an Intel Pentium 3.0 GHz dual processor.
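
The method above combines several search strategies and two similarity measures; the sketch below illustrates only the simplest ingredients, a sum-of-squared-difference measure and an exhaustive in-plane translation search between a DRR and an acquired X-ray image, assuming both are equally sized 2D arrays. The function names are hypothetical, and the multiresolution, steepest-descent, and pattern-intensity components of the published method are not reproduced.

import numpy as np

def ssd(a, b):
    # Sum of squared differences between two equally sized 2D images.
    d = a.astype(np.float64) - b.astype(np.float64)
    return float(np.sum(d * d))

def best_inplane_shift(drr, xray, search=10):
    # Exhaustive search for the (row, col) shift of the X-ray image that best
    # matches the DRR under SSD, within +/- `search` pixels.  Wrap-around at
    # the borders via np.roll is accepted for brevity.
    best, best_score = (0, 0), np.inf
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            shifted = np.roll(np.roll(xray, dy, axis=0), dx, axis=1)
            score = ssd(drr, shifted)
            if score < best_score:
                best, best_score = (dy, dx), score
    return best, best_score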


Archive | 2007

Xsight Lung Tracking System: A Fiducial-Less Method for Respiratory Motion Tracking

Dongshan Fu; Robert Kahn; Bai Wang; Hongwu Wang; Zhiping Mu; Jong Park; Gopinath Kuduvalli; Calvin R. Maurer

The CyberKnife® Robotic Radiosurgery System (Accuray Incorporated, Sunnyvale, CA) can treat targets that move with respiration using the Synchrony® Respiratory Tracking System (Accuray Incorporated, Sunnyvale, CA). Alignment of each treatment beam with the moving target is maintained in real time by moving the beam dynamically with the target. The Synchrony system requires fiducials that are placed in or near the tumor to target the lesion and track it as it moves with respiration. The Xsight™ (Accuray Incorporated, Sunnyvale, CA) Lung Tracking System, which recently became available for the CyberKnife system, is a direct soft tissue tracking method for respiratory motion tracking of lung lesions that eliminates invasive fiducial implantation procedures, thereby decreasing the time to treatment and eliminating the risk of pneumothorax and other fiducial placement complications. This chapter presents the concepts, methods, and some experimental results of the Xsight Lung Tracking System, which is fully integrated with the Synchrony Respiratory Tracking System. Observation and analysis of clinical image data for patients previously treated with the CyberKnife indicates that many reasonably large tumors (larger than 15 mm) located in the peripheral and apex lung regions are visible in orthogonal X-ray images acquired by the CyberKnife system. Direct tumor tracking can be performed for such visible tumors by registration of the tumor region in digitally reconstructed radiographs (DRRs), generated from the planning CT image, to the corresponding region in the treatment X-ray images. Image processing is used to enhance the visibility of the lung tumor in the DRRs and X-ray images. Experiments with an anthropomorphic motion phantom and retrospective analysis of clinical image data obtained from patients who underwent CyberKnife treatment for lung lesions using implanted fiducial markers show that the accuracy of Xsight Lung tracking is better than 1.5 mm.
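
Direct tumor tracking of the kind described above amounts to matching a tumor region taken from the enhanced DRR against the treatment X-ray image. The sketch below is a rough illustration only, assuming a pre-extracted DRR tumor patch and using normalized cross-correlation in place of whatever matching criterion the product actually uses (which the chapter does not fully specify); the function names are hypothetical.

import numpy as np

def ncc(a, b):
    # Normalized cross-correlation between two equally sized patches.
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def track_tumor(drr_roi, xray, center, search=20):
    # Slide the DRR tumor patch over the X-ray image around `center` (row, col)
    # and return the offset with the highest NCC score.
    h, w = drr_roi.shape
    r0, c0 = center[0] - h // 2, center[1] - w // 2
    best, best_score = (0, 0), -np.inf
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            r, c = r0 + dy, c0 + dx
            if r < 0 or c < 0 or r + h > xray.shape[0] or c + w > xray.shape[1]:
                continue
            s = ncc(drr_roi.astype(np.float64), xray[r:r + h, c:c + w].astype(np.float64))
            if s > best_score:
                best, best_score = (dy, dx), s
    return best, best_score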


Medical Imaging 2005: Visualization, Image-Guided Procedures, and Display | 2005

Automated skull tracking for the CyberKnife image-guided radiosurgery system

Dongshan Fu; Gopinath Kuduvalli; Vladimir Mitrovic; William Main; Larry Thomson

We have developed an automated skull tracking method to perform near real-time patient alignment and position correction during CyberKnife image-guided intracranial radiosurgery. Digitally reconstructed radiographs (DRRs) are first generated offline from a CT study before treatment, and are used as reference images for the patient position. Two orthogonal projection X-ray images are then acquired at the time of patient alignment or treatment. Multi-phase registration is used to register the DRRs with the X-ray images. The registration in each projection is carried out independently; the results are then combined and converted to a 3-D rigid transformation. The in-plane transformation and the out-of-plane rotations are estimated using different search methods including multi-resolution matching, steepest descent minimization and one-dimensional search. Two similarity measures, optimized pattern intensity and sum of squared difference (SSD), are applied at different search phases to optimize both accuracy and computation speed. Experiments on an anthropomorphic skull phantom showed that the tracking accuracy (RMS error) is better than 0.3 mm for each translation and better than 0.3 degrees for each rotation, and the targeting accuracy (clinically relevant accuracy) tested with the CyberKnife system is better than 1 mm. The computation time required for the tracking algorithm is within a few seconds.
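
One of the two similarity measures named above, pattern intensity, rewards a locally flat difference image between the X-ray image and the DRR. Below is a minimal, unoptimized sketch of the classic pattern-intensity formulation; the radius and sigma constants and the wrap-around boundary handling are illustrative assumptions, not the "optimized pattern intensity" variant used in the paper.

import numpy as np

def pattern_intensity(xray, drr, radius=3, sigma=10.0):
    # Pattern intensity of the difference image: for every pixel and every
    # neighbour within `radius`, accumulate sigma^2 / (sigma^2 + (difference of
    # difference-image values)^2).  Larger values mean the DRR explains more of
    # the structure in the X-ray image.  np.roll wraps at the borders, which is
    # acceptable only for this illustration.
    diff = xray.astype(np.float64) - drr.astype(np.float64)
    s2 = sigma * sigma
    total = 0.0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            if dy == 0 and dx == 0:
                continue
            if dy * dy + dx * dx > radius * radius:
                continue
            shifted = np.roll(np.roll(diff, dy, axis=0), dx, axis=1)
            total += float(np.sum(s2 / (s2 + (diff - shifted) ** 2)))
    return total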


computer vision and pattern recognition | 2006

Multiple Fiducial Identification Using the Hidden Markov Model in Image Guided Radiosurgery

Zhiping Mu; Dongshan Fu; Gopinath R. Kuduvalli

A multi-fiducial identification method for image guided radiotherapy and radiosurgery is presented. A modified hidden Markov model is adopted to incorporate context information. A novel algorithm, modified from the Viterbi algorithm, is introduced to identify fiducials concurrently in two orthogonal projections. This method is robust and efficient, requires a small number of control parameters, exhibits a large search range, and can accommodate deformations. A simple implementation is presented as an example to verify the efficacy of the framework. Experiments were carried out using clinical images acquired during patient treatments, and very promising results were achieved. The algorithm successfully identified fiducials even in very difficult cases, demonstrating the effectiveness and robustness of the proposed probabilistic framework and the concurrent search algorithm.
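
The identification step builds on Viterbi decoding for hidden Markov models; the paper's concurrent-search-with-association modification is not reproduced here. For reference only, a minimal log-domain Viterbi decoder is sketched below with generic array names. Mapping fiducials to observations and candidate image detections to states is an assumption about how such a framework could be set up, not the paper's exact formulation.

import numpy as np

def viterbi(log_init, log_trans, log_emit):
    # Standard Viterbi decoding.  log_init: (S,), log_trans: (S, S) with entry
    # [i, j] = log P(state j | state i), log_emit: (T, S) with entry
    # [t, s] = log P(observation t | state s).  Returns the most likely path.
    T, S = log_emit.shape
    score = np.full((T, S), -np.inf)
    back = np.zeros((T, S), dtype=int)
    score[0] = log_init + log_emit[0]
    for t in range(1, T):
        cand = score[t - 1][:, None] + log_trans          # (prev state, cur state)
        back[t] = np.argmax(cand, axis=0)
        score[t] = cand[back[t], np.arange(S)] + log_emit[t]
    path = [int(np.argmax(score[-1]))]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]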


Medical Imaging 2007: Visualization and Image-Guided Procedures | 2007

Fiducial-less 2D-3D spine image registration using spine region segmented in CT image

Dongshan Fu; Hongwu Wang; Calvin R. Maurer; Gopinath Kuduvalli

The target pose (position and orientation) of a spinal lesion can be determined using image registration of a pair of two-dimensional (2D) x-ray projection images and a pre-treatment three-dimensional (3D) CT image. This is useful for detecting, tracking and correcting for patient movement during image-guided spinal radiotherapy and radiosurgery. We recently developed a fiducial-less 2D-3D spine image registration method that localizes spinal targets by directly tracking adjacent skeletal structures and thereby eliminates the need for implanted fiducials. Experience has shown this method to be robust under a wide range of clinical circumstances. However, image artifacts in digitally reconstructed radiographs (DRRs), which can be introduced by breathing during CT scanning or by other surrounding structures such as ribs, have negative effects on image registration performance. Therefore, we present an approach to eliminate these image artifacts in DRRs for a more robust registration. The spinal structures in the CT volume are approximately segmented in a semi-automatic way and saved as a volume of interest (VOI). The DRRs are then generated within the spine VOI for two orthogonal projections. During radiation treatment delivery, two X-ray images are acquired simultaneously in near real time. Each X-ray image is then registered with the corresponding DRR image to obtain 2D local displacements of skeletal structures. The 3D tumor position is calculated from the 2D displacements by 2D-to-3D back-projection and geometric transformation. Experiments on clinical data were conducted to evaluate the performance of the improved registration. The results showed that spine segmentation substantially improves image registration performance.
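
Two pieces of the pipeline above lend themselves to a compact sketch: generating DRRs restricted to the spine VOI, and combining the 2D displacements from the two orthogonal projections into a 3D translation. Both functions below are heavily simplified illustrations with hypothetical names; the real system uses perspective projection along the actual imager geometry rather than the parallel ray sum and idealized orthogonal axes assumed here.

import numpy as np

def drr_within_voi(ct, voi_mask, axis=0):
    # Parallel-beam ray sum of CT attenuation restricted to the spine volume of
    # interest (VOI); voxels outside the VOI contribute nothing to the DRR.
    masked = np.where(voi_mask, ct.astype(np.float64), 0.0)
    return masked.sum(axis=axis)

def backproject_2d_to_3d(disp_a, disp_b):
    # Combine in-plane displacements from two orthogonal projections into a 3D
    # translation, assuming projection A views the x-z plane and reports
    # (dx, dz_a) while projection B views the y-z plane and reports (dy, dz_b).
    # The shared axis is averaged.
    dx, dz_a = disp_a
    dy, dz_b = disp_b
    return np.array([dx, dy, 0.5 * (dz_a + dz_b)])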


Medical Imaging 2007: Visualization and Image-Guided Procedures | 2007

Evaluation of a robust fiducial tracking algorithm for image-guided radiosurgery

Serkan Hatipoglu; Zhiping Mu; Dongshan Fu; Gopinath R. Kuduvalli

Fiducial tracking is a widely used method in image guided procedures such as image guided radiosurgery and radiotherapy. Our group has developed a new fiducial identification algorithm, the concurrent Viterbi with association (CVA) algorithm, based on a modified hidden Markov model (HMM), and reported our initial results previously. In this paper, we present an extensive performance evaluation of this novel algorithm using phantom testing and clinical images acquired during patient treatment. For a common three-fiducial case, the algorithm execution time is less than two seconds. Testing with a collection of images from more than 35 patient treatments, with a total of more than 10000 image pairs, we find that the success rate of the new algorithm is better than 99%. In the tracking test using a phantom, the phantom is moved to a variety of positions with translations up to 8 mm and rotations up to 4 degrees. The new algorithm correctly tracks the phantom motion, with an average translation error of less than 0.5 mm and an average rotation error of less than 0.5 degrees. These results demonstrate that the new algorithm is very efficient, robust, easy to use, and capable of tracking fiducials in a large region of interest (ROI) at a very high success rate with high accuracy.
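
The phantom part of the evaluation compares the motion reported by the tracking algorithm against the motion actually applied to the phantom. A minimal sketch of that bookkeeping follows, with hypothetical array names and layout.

import numpy as np

def phantom_tracking_errors(reported, applied):
    # `reported` and `applied` are (N, 6) arrays of (tx, ty, tz, rx, ry, rz) in
    # mm and degrees for N phantom positions.  Returns the mean absolute
    # translation and rotation errors.
    err = np.abs(np.asarray(reported, dtype=np.float64) - np.asarray(applied, dtype=np.float64))
    return {
        "mean_translation_error_mm": float(err[:, :3].mean()),
        "mean_rotation_error_deg": float(err[:, 3:].mean()),
    }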


Medical Physics | 2009

TH‐C‐303A‐02: Clinical Data Evaluation of Fiducial‐Free Spine Tracking for CyberKnife Radiosurgery

Dongshan Fu; H Zhang; H Wang; B Wang; Gopinath Kuduvalli; Calvin R. Maurer

Purpose: To evaluate the accuracy of the Xsight® Spine Tracking (XST) System in the CyberKnife® Robotic Radiosurgery System (Accuray Incorporated, Sunnyvale, CA) using retrospective analysis of clinical data. Method and Materials: The XST System performs patient alignment and frequent intra-fractional tracking for spine radiosurgery. The 3 translations and 3 rotations of the 3D transformation are computed and used for treatment couch correction in patient setup and radiation beam position compensation during treatment delivery. The XST System eliminates the need for fiducials by using 2D-3D spine registration of two orthogonal X-ray images and the planning CT image. Analysis was performed using image data acquired for 26 patients previously treated using fiducial tracking. The data consists of a CT image for each patient plus 4,480 X-ray image pairs acquired during treatment. The cases cover the entire spinal column (3 cervical, 13 thoracic, 7 lumbar and 3 sacral). Each patient had 4-6 metal fiducials implanted in vertebrae adjacent to the spine lesion being treated. Fiducial and XST tracking were performed for all X-ray image pairs. The XST transformation errors were calculated by using the fiducial tracking results as the reference gold standard. Results: The XST and fiducial tracking results agreed closely for most image pairs; in the patients with >1 mm translation error and the 3 patients with >1° rotation error, the fiducials are far from the center of the region of interest, which degrades the fiducial tracking accuracy. Conclusion: The XST System robustly tracks all spine regions and accurately computes both translations and rotations. Research sponsored by Accuray Incorporated.
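
The evaluation treats the fiducial tracking result for each X-ray image pair as the reference and examines the per-component differences of the XST result. The sketch below illustrates that comparison under the assumption that both results are stored as (tx, ty, tz, rx, ry, rz) rows; the function name, tolerances, and output layout are hypothetical.

import numpy as np

def xst_error_summary(xst, fiducial, trans_tol_mm=1.0, rot_tol_deg=1.0):
    # `xst` and `fiducial` are (N, 6) arrays of (tx, ty, tz, rx, ry, rz) in mm
    # and degrees for N X-ray image pairs; the fiducial result is the reference.
    # Reports per-component mean and s.d. of the absolute error, plus the
    # fraction of pairs whose worst component stays within the tolerances.
    diff = np.abs(np.asarray(xst, dtype=np.float64) - np.asarray(fiducial, dtype=np.float64))
    return {
        "mean_error_per_component": diff.mean(axis=0),
        "sd_error_per_component": diff.std(axis=0),
        "fraction_within_translation_tol": float((diff[:, :3].max(axis=1) <= trans_tol_mm).mean()),
        "fraction_within_rotation_tol": float((diff[:, 3:].max(axis=1) <= rot_tol_deg).mean()),
    }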


Archive | 2004

Fiducial-less tracking with non-rigid image registration

Dongshan Fu; Gopinath Kuduvalli


Archive | 2004

DRR generation using a non-linear attenuation model

Dongshan Fu; Gopinath Kuduvalli
