Network


Latest external collaboration at the country level. Dive into details by clicking on the dots.

Hotspot


Dive into the research topics where Mingyue Ding is active.

Publication


Featured research published by Mingyue Ding.


Medical Physics | 2003

A real‐time biopsy needle segmentation technique using Hough Transform

Mingyue Ding; Aaron Fenster

Real-time needle segmentation and tracking is very important in image-guided surgery, biopsy, and therapy. Due to its robustness to extraneous noise, the Hough Transform is one of the most powerful line-detection techniques available and has been widely used in many areas. Unfortunately, its high computational cost often prevents it from being applied in real-time applications without specially designed hardware. To solve this problem, a variety of fast implementation algorithms have been proposed; however, none of them runs in real time on an affordable computer. In this paper, we describe a fast implementation of the Hough Transform based on a coarse-fine search and the determination of the optimal image resolution. Compared to conventional techniques, our approach decreases the time for needle segmentation by an order of magnitude. Experiments with agar phantom and patient breast biopsy ultrasound (US) image sequences showed that our approach can segment the biopsy needle in real time (i.e., in less than 33 ms) on an affordable PC without specially designed hardware, with an angular rms error of about 1 degree and a position rms error of about 0.5 mm.
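The coarse-fine idea can be sketched as follows. This is an illustrative NumPy reconstruction, not the authors' implementation: the grid sizes, refinement window, and the line parameterization rho = x*cos(theta) + y*sin(theta) are our assumptions.

```python
import numpy as np

def hough_peak(points, thetas, rhos):
    """Vote edge points into a (theta, rho) accumulator; return the peak cell."""
    acc = np.zeros((len(thetas), len(rhos)), dtype=int)
    cos_t, sin_t = np.cos(thetas), np.sin(thetas)
    for x, y in points:
        r = x * cos_t + y * sin_t                        # rho at every theta
        idx = np.clip(np.searchsorted(rhos, r), 0, len(rhos) - 1)
        acc[np.arange(len(thetas)), idx] += 1
    ti, ri = np.unravel_index(acc.argmax(), acc.shape)
    return thetas[ti], rhos[ri]

def coarse_fine_hough(points, rho_max, coarse_n=32, fine_n=64, window=0.2):
    """Coarse pass over the full parameter range, then a fine pass
    restricted to a small window around the coarse peak."""
    thetas = np.linspace(0.0, np.pi, coarse_n, endpoint=False)
    rhos = np.linspace(-rho_max, rho_max, coarse_n)
    t0, r0 = hough_peak(points, thetas, rhos)            # coarse estimate
    thetas = np.linspace(t0 - window, t0 + window, fine_n)
    rhos = np.linspace(r0 - window * rho_max, r0 + window * rho_max, fine_n)
    return hough_peak(points, thetas, rhos)              # refined estimate
```

The speedup comes from replacing one large accumulator with two small ones: the coarse pass localizes the line cheaply, and the fine pass spends its resolution only where the peak can be.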


Medical Physics | 2003

Automatic needle segmentation in three-dimensional ultrasound images using two orthogonal two-dimensional image projections

Mingyue Ding; H. Neale Cardinal; Aaron Fenster

In this paper, we describe an algorithm to segment a needle from a three-dimensional (3D) ultrasound image by using two orthogonal two-dimensional (2D) image projections. Not only is the needle more conspicuous in a projected (volume-rendered) image, but its direction in 3D lies in the plane defined by the projection direction and the needle direction in the projected 2D image. Hence, using two such projections, the 3D vector describing the needle direction lies along the intersection of the two corresponding planes. Thus, the task of 3D needle segmentation is reduced to two 2D needle segmentations. For improved accuracy and robustness, we use orthogonal projection directions (both orthogonal to a given a priori estimate of the needle direction), and use volume cropping and Gaussian transfer functions to remove complex background from the 2D projection images. To evaluate our algorithm, we tested it with 3D ultrasound images of agar and turkey breast phantoms. Using a 500 MHz personal computer equipped with a commercial volume-rendering card, we found that our 3D needle segmentation algorithm performed in near real time (about 10 fps) with a root-mean-square accuracy in needle length and endpoint coordinates of better than 0.8 mm, and about 0.5 mm on average, for needle lengths in the 3D image from 4.0 mm to 36.7 mm.
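The geometric core of the method, recovering the 3D direction as the intersection of two planes, can be sketched in a few lines (an illustrative NumPy reconstruction; the vector names are ours):

```python
import numpy as np

def needle_direction(proj_dir_a, dir_in_proj_a, proj_dir_b, dir_in_proj_b):
    """Each projection constrains the needle to the plane spanned by the
    projection direction and the needle direction seen in that projection.
    The 3D needle direction lies along the intersection of the two planes,
    i.e. along the cross product of their normals."""
    n_a = np.cross(proj_dir_a, dir_in_proj_a)   # normal of plane A
    n_b = np.cross(proj_dir_b, dir_in_proj_b)   # normal of plane B
    d = np.cross(n_a, n_b)                      # line of intersection
    return d / np.linalg.norm(d)
```

For example, a needle along (1, 2, 2)/3 viewed along the x axis appears as (0, 2, 2) in that projection, and as (1, 0, 2) when viewed along the y axis; the two planes intersect exactly along the needle.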


medical image computing and computer assisted intervention | 2005

3D TRUS guided robot assisted prostate brachytherapy

Zhouping Wei; Mingyue Ding; Donal B. Downey; Aaron Fenster

This paper describes a system for dynamic intraoperative prostate brachytherapy using 3D ultrasound guidance with robot assistance. The system consists of 3D transrectal ultrasound (TRUS) imaging, a robot and software for prostate segmentation, 3D dose planning, oblique needle segmentation and tracking, seed segmentation, and dynamic re-planning and verification. The needle targeting accuracy of the system was 0.79 mm +/- 0.32 mm in a phantom study.


Medical Imaging 2003: Visualization, Image-Guided Procedures, and Display | 2003

Prostate segmentation in 3D US images using the cardinal-spline-based discrete dynamic contour

Mingyue Ding; Congjin Chen; Yunqiu Wang; Igor Gyacskov; Aaron Fenster

Our slice-based 3D prostate segmentation method comprises three steps. 1) Initialization. First, we chose more than three points on the boundary of the prostate along one direction and used a cardinal spline to interpolate an initial prostate boundary, which was then divided into vertices. 2) Boundary deformation. At each vertex, the internal and external forces were calculated; these forces drove the evolving contour to the true boundary of the prostate. 3) 3D prostate segmentation. We propagated the final contour in the initial slice to adjacent slices and refined them until the prostate boundaries in all slices were segmented. Finally, we calculated the volume of the prostate from a 3D mesh surface of the prostate. Experiments with the 3D US images of six patient prostates demonstrated that our method efficiently avoided being trapped in local minima, with an average percentage error of 4.8%. In 3D prostate segmentation, the average percentage error in measuring the prostate volume was less than 5% with respect to manual planimetry.
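The cardinal-spline initialization step can be sketched as follows. A cardinal spline passes through every control point, with a tension parameter scaling the tangents (tension 0 below gives the Catmull-Rom spline); the tension and sampling density here are illustrative choices, not the paper's.

```python
import numpy as np

def cardinal_spline(ctrl, tension=0.0, samples_per_seg=16):
    """Interpolate a closed 2D contour through the control points with a
    cardinal spline (tension 0 is the Catmull-Rom spline)."""
    pts = np.asarray(ctrl, dtype=float)
    n = len(pts)
    s = (1.0 - tension) / 2.0                      # tangent scale
    t = np.linspace(0.0, 1.0, samples_per_seg, endpoint=False)[:, None]
    # Cubic Hermite basis functions.
    h00 = 2 * t**3 - 3 * t**2 + 1
    h10 = t**3 - 2 * t**2 + t
    h01 = -2 * t**3 + 3 * t**2
    h11 = t**3 - t**2
    segs = []
    for i in range(n):                             # wrap indices: closed contour
        p0, p1 = pts[i], pts[(i + 1) % n]
        m0 = s * (pts[(i + 1) % n] - pts[i - 1])   # tangent at p0
        m1 = s * (pts[(i + 2) % n] - pts[i])       # tangent at p1
        segs.append(h00 * p0 + h10 * m0 + h01 * p1 + h11 * m1)
    return np.vstack(segs)
```

The sampled curve vertices then serve as the initial discrete dynamic contour on which the internal and external forces act.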


Medical Imaging 2002: Visualization, Image-Guided Procedures, and Display | 2002

Automatic needle segmentation in 3D ultrasound images

Mingyue Ding; H. Neale Cardinal; Weiguang Guan; Aaron Fenster

In this paper, we propose to use 2D image projections to automatically segment a needle in a 3D ultrasound image. This approach is motivated by the twin observations that the needle is more conspicuous in a projected image, and its projected area is a minimum when the rays are cast parallel to the needle direction. To avoid the computational burden of an exhaustive 2D search for the needle direction, a faster 1D search procedure is proposed. First, a plane which contains the needle direction is determined by the initial projection direction and the (estimated) direction of the needle in the corresponding projection image. Subsequently, an adaptive 1D search technique is used to adjust the projection direction iteratively until the projected needle area is minimized. In order to remove noise and complex background structure from the projection images, a priori information about the needle position and orientation is used to crop the 3D volume, and the cropped volume is rendered with Gaussian transfer functions. We have evaluated this approach experimentally using agar and turkey breast phantoms. The results show that it can find the 3D needle orientation within 1 degree, in about 1 to 3 seconds on a 500 MHz computer.
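The adaptive 1D search can be sketched as a generic step-halving descent over the projection angle; the `projected_area` function below is a hypothetical stand-in for rendering the volume along a candidate direction and measuring the needle's projected area.

```python
def adaptive_1d_search(f, x0, step=10.0, tol=0.1, max_iter=200):
    """Minimize f over one variable: move to whichever neighbor improves,
    halving the step whenever neither does (a sketch of the adaptive 1D
    search for the area-minimizing projection angle)."""
    x, fx = x0, f(x0)
    for _ in range(max_iter):
        if step < tol:
            break
        for cand in (x - step, x + step):
            fc = f(cand)
            if fc < fx:
                x, fx = cand, fc
                break
        else:
            step *= 0.5                  # no improvement: refine the step
    return x

# Hypothetical stand-in: projected needle area as a function of the in-plane
# angle, minimized when the rays are cast parallel to the needle (17 deg here).
projected_area = lambda angle: (angle - 17.0) ** 2 + 3.0
```

Because the search is restricted to the plane already known to contain the needle, one scalar angle suffices, which is what makes the 1D search so much cheaper than an exhaustive 2D sweep of directions.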


international conference of the ieee engineering in medicine and biology society | 2005

Slice-Based Prostate Segmentation in 3D US Images Using Continuity Constraint

Mingyue Ding; Igor Gyacskov; Xiaping Yuan; Maria Drangova; Donal B. Downey; Aaron Fenster

In the diagnosis and therapy of prostate cancer, it is critical to measure the volume of the prostate and locate its boundary. Three-dimensional transrectal ultrasound (3D TRUS) imaging has been demonstrated to be a useful technique for performing such a task. Due to image speckle as well as low contrast in ultrasound images, segmentation of the prostate in 3D US images is challenging. In this paper, we report on the development of an improved slice-based 3D prostate segmentation method. First, we imposed a continuity constraint on the end points of the prostate boundaries in a cross-sectional plane so that a smooth prostate boundary in 2D is obtained. Then, in each 2D slice, we inserted the end points into the vertex list of the initial contour to obtain a new contour, which forces the evolving contour to be driven to the boundary of the prostate. Evaluation demonstrated that our method could segment the prostate in 3D TRUS images more quickly and accurately.


medical image computing and computer assisted intervention | 2003

Projection-Based Needle Segmentation in 3D Ultrasound Images

Mingyue Ding; Aaron Fenster

Needles are used extensively in interventional procedures such as biopsy and brachytherapy. To deliver radioactive seeds to pre-planned positions or sample lesions from the region that may contain cancer cells, the 3D position of the needle must be determined accurately and quickly. Three-dimensional ultrasound (US) image guidance is an efficient technique used to perform this task. In this paper, we describe the development of a projection-based needle segmentation method comprising three steps. First, the 3D image is projected along an initial direction perpendicular to the approximate needle direction determined from the 3D imaging system. The needle is then segmented in a projected 2D image. Using the projection direction and the detected 2D needle direction, a plane containing the needle--called the needle plane--is determined. Secondly, the 3D image is re-projected in the direction perpendicular to the normal of the needle plane and step 1 is repeated. If the needle direction in the projected 2D image is horizontal, the needle plane is correct; otherwise, steps 1 and 2 are repeated until a correct needle plane is found. Thirdly, the 3D image is projected along the normal direction of the needle plane and the needle endpoints in the projected 2D image are determined. Using the relationship between the 3D projection and the 3D volume coordinate systems, the coordinates of the endpoints of the needle in the 3D US coordinate system are determined. Experiments with agar and turkey phantom 3D US images demonstrated that our method could segment the needle from 3D US images with an average accuracy of 0.7 mm in position and 1.2 degrees in orientation with a speed of 13 fps on a 1.3-GHz PC. In addition, experiments illustrated that our method is robust to variations in the initial estimated needle direction, the size of the cropped volume, and the ray-casting transfer function parameters used in pre-processing.
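The final endpoint-recovery step relies on the projection being an orthographic view along the needle-plane normal, so each 2D image point maps back to a unique 3D point on that plane. A minimal sketch of that coordinate mapping (the axis and origin names are ours, not the paper's):

```python
import numpy as np

def backproject(u, v, ex, ey, origin):
    """Map a 2D point (u, v) in the projection image to the volume frame.
    ex and ey are the image axes and origin the image origin, all expressed
    in volume coordinates; since the projection is along the needle-plane
    normal, points on the needle plane are recovered exactly."""
    return (np.asarray(origin, float)
            + u * np.asarray(ex, float)
            + v * np.asarray(ey, float))
```

Applying this to the two detected needle endpoints in the projection image yields their 3D US coordinates directly.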


Archive | 2005

Visualization and Segmentation Techniques in 3D Ultrasound Images

Aaron Fenster; Mingyue Ding; Ning Hu; Hanif M. Ladak; Guokuan Li; Neale Cardinal; Donal B. Downey

Although ultrasonography is an important cost-effective imaging modality, technical improvements are needed before its full potential is realized for accurate and quantitative monitoring of disease progression or regression. 2D viewing of 3D anatomy using conventional ultrasonography limits our ability to quantify and visualize pathology and is partly responsible for the reported variability in diagnosis and monitoring of disease progression. Investigators have focused on overcoming these deficiencies by developing 3D ultrasound imaging techniques using existing conventional ultrasound systems, reconstructing the information into 3D images, and then allowing interactive viewing of the 3D images on inexpensive desktop computers. In addition, the availability of 3D ultrasound images has allowed the development of automated and semi-automated segmentation techniques to quantify organ and pathology volume for monitoring of disease. In this chapter, we introduce the basic principles of 3D ultrasound imaging as well as its visualization techniques. Then, we describe the use of 3D ultrasound in interventional procedures and discuss applications of 3D segmentation techniques to the prostates, needles, and seeds used in prostate brachytherapy.


Medical Imaging 2004: Image Processing | 2004

Evaluation of algorithms for segmentation of the prostate boundary from 3D ultrasound images

Hanif M. Ladak; Mingyue Ding; Yunqiu Wang; Ning Hu; Donal B. Downey; Aaron Fenster

We evaluated three algorithms for prostate boundary segmentation from 3D ultrasound images. In the parallel segmentation method, the 3D image was sliced into parallel, contiguous 2D images, whereas in the rotational method, the image was sliced in a rotational manner. Using either method, four points were selected on a central slice and used to initiate a 2D deformable model. The segmented contour was propagated to adjacent slices until the entire prostate was segmented. In the volume-based method, the 3D image was segmented directly without slicing it. Each segmentation algorithm was applied to four 3D images, and the results were compared to manual segmentation. Average volume errors of -8.58%, -1.95% and -5.01% were estimated for the parallel, rotational and volume-based methods, respectively. Approximately 20% of the slices required editing in the parallel method, whereas 13% required editing in the rotational method. Although only one surface segmented using the volume-based method needed editing, manual editing was difficult in 3D. Segmentation times, including editing, ranged from 42 to 82 seconds for the parallel method, from 27 to 52 seconds for the rotational method, and up to 55 seconds for the volume-based method. Based on these results, we recommend the rotational segmentation method.


Archive | 2005

Apparatus and computing device for performing brachytherapy and methods of imaging using the same

Aaron Fenster; Lori Gardi; Donal B. Downey; Chandima Edirisinghe; Mingyue Ding

Collaboration


Dive into Mingyue Ding's collaboration.

Top Co-Authors

Aaron Fenster, University of Western Ontario
Donal B. Downey, Robarts Research Institute
Zhouping Wei, Robarts Research Institute
H. Neale Cardinal, Robarts Research Institute
Hanif M. Ladak, Robarts Research Institute
Igor Gyacskov, Robarts Research Institute
Ning Hu, Robarts Research Institute
Yunqiu Wang, Robarts Research Institute
Congjin Chen, Robarts Research Institute