Publication


Featured research published by Derek W. Cool.


Medical Physics | 2008

Mechanically assisted 3D ultrasound guided prostate biopsy system

Jeffrey Bax; Derek W. Cool; Lori Gardi; Kerry Knight; David Smith; Jacques Montreuil; Shi Sherebrin; Cesare Romagnoli; Aaron Fenster

There are currently limitations associated with the prostate biopsy procedure, which is the most commonly used method for a definitive diagnosis of prostate cancer. With the use of two-dimensional (2D) transrectal ultrasound (TRUS) for needle guidance in this procedure, the physician has restricted anatomical reference points for guiding the needle to target sites. Further, any motion of the physician's hand during the procedure may cause the prostate to move or deform to a prohibitive extent. These variations make it difficult to establish a consistent reference frame for guiding a needle. We have developed a 3D navigation system for prostate biopsy, which addresses these shortcomings. This system is composed of a 3D US imaging subsystem and a passive mechanical arm to minimize prostate motion. To validate our prototype, a series of experiments was performed on prostate phantoms. The 3D scan of the string phantom produced minimal geometric distortions, and the geometric error of the 3D imaging subsystem was 0.37 mm. The accuracy of 3D prostate segmentation was determined by comparing the known volume of a certified phantom to a reconstructed volume generated by our system, and the volume was estimated with less than 5% error. Biopsy needle guidance accuracy tests in agar prostate phantoms showed that the mean error was 2.1 mm, and the 3D location of the biopsy core was recorded with a mean error of 1.8 mm. In this paper, we describe the mechanical design and validation of the prototype system using an in vitro prostate phantom. Preliminary results from an ongoing clinical trial show that prostate motion is small, with an in-plane displacement of less than 1 mm during the biopsy procedure.
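
For illustration, a minimal sketch of the phantom-based validation arithmetic described above: a mean geometric error computed from paired string-phantom points and a percent volume error against a certified phantom. The coordinates and volumes below are hypothetical placeholders, not data from the paper.

```python
# Illustrative sketch (not the authors' code): validating a 3D US imaging
# subsystem against a string phantom and a certified volume phantom.
import numpy as np

# Hypothetical known 3D positions of string-phantom intersections (mm) and
# the positions recovered from the reconstructed 3D US image.
known_points = np.array([[10.0, 10.0, 10.0], [20.0, 10.0, 10.0], [10.0, 20.0, 15.0]])
measured_points = np.array([[10.3, 9.8, 10.1], [20.2, 10.4, 9.7], [9.6, 20.3, 15.2]])

# Mean geometric error: average Euclidean distance between paired points.
geometric_error = np.linalg.norm(measured_points - known_points, axis=1).mean()

# Volume error: percent difference between the segmented prostate volume
# and the certified phantom volume (both values hypothetical).
certified_volume_cc = 55.0
segmented_volume_cc = 53.1
volume_error_pct = 100.0 * abs(segmented_volume_cc - certified_volume_cc) / certified_volume_cc

print(f"mean geometric error: {geometric_error:.2f} mm")
print(f"volume error: {volume_error_pct:.1f} %")
```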


Medical Physics | 2010

Assessment of image registration accuracy in three-dimensional transrectal ultrasound guided prostate biopsy

Vaishali Karnik; Aaron Fenster; Jeffrey Bax; Derek W. Cool; Lori Gardi; I. Gyacskov; Cesare Romagnoli; Aaron D. Ward

PURPOSE Prostate biopsy, performed using two-dimensional (2D) transrectal ultrasound (TRUS) guidance, is the clinical standard for a definitive diagnosis of prostate cancer. Histological analysis of the biopsies can reveal cancerous, noncancerous, or suspicious, possibly precancerous, tissue. During subsequent biopsy sessions, noncancerous regions should be avoided, and suspicious regions should be precisely rebiopsied, requiring accurate needle guidance. It is challenging to precisely guide a needle using 2D TRUS due to the limited anatomic information provided, and a three-dimensional (3D) record of biopsy locations for use in subsequent biopsy procedures cannot be collected. Our tracked, 3D TRUS-guided prostate biopsy system provides additional anatomic context and permits a 3D record of biopsies. However, targets determined based on a previous biopsy procedure must be transformed during the procedure to compensate for intraprocedure prostate shifting due to patient motion and prostate deformation due to transducer probe pressure. Thus, registration is a critically important step required to determine these transformations so that correspondence is maintained between the prebiopsied image and the real-time image. Registration must not only be performed accurately, but also quickly, since correction for prostate motion and deformation must be carried out during the biopsy procedure. The authors evaluated the accuracy, variability, and speed of several surface-based and image-based intrasession 3D-to-3D TRUS image registration techniques, for both rigid and nonrigid cases, to find the required transformations. METHODS Our surface-based rigid and nonrigid registrations of the prostate were performed using the iterative-closest-point algorithm and a thin-plate spline algorithm, respectively. For image-based rigid registration, the authors used a block matching approach, and for nonrigid registration, the authors defined the moving image deformation using a regular, 3D grid of B-spline control points. The authors measured the target registration error (TRE) as the postregistration misalignment of 60 manually marked, corresponding intrinsic fiducials. The authors also measured the fiducial localization error (FLE), the effect of segmentation variability, and the effect of fiducial distance from the transducer probe tip. Lastly, the authors performed 3D principal component analysis (PCA) on the x, y, and z components of the TREs to examine the 95% confidence ellipsoids describing the errors for each registration method. RESULTS Using surface-based registration, the authors found mean TREs of 2.13 +/- 0.80 and 2.09 +/- 0.77 mm for rigid and nonrigid techniques, respectively. Using image-based rigid and nonrigid registration, the authors found mean TREs of 1.74 +/- 0.84 and 1.50 +/- 0.83 mm, respectively. Our FLE was 0.21 mm and did not dominate the overall TRE. However, segmentation variability contributed substantially (approximately 50%) to the TRE of the surface-based techniques. PCA showed that the 95% confidence ellipsoid encompassing fiducial distances between the source and target registration images was reduced from 3.05 cm3 to 0.14 and 0.05 cm3 for the surface-based and image-based techniques, respectively. The run times for both registration methods were comparable at less than 60 s.
CONCLUSIONS Our results compare favorably with a clinical need for a TRE of less than 2.5 mm, and suggest that image-based registration is superior to surface-based registration for 3D TRUS-guided prostate biopsies, since it does not require segmentation.
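
As a sketch of the evaluation described above, the snippet below computes per-fiducial target registration error and the volume of a 95% confidence ellipsoid from the x, y, z error components via their covariance. The fiducial coordinates are synthetic, and this is an assumed reading of the PCA analysis, not the authors' code.

```python
# Illustrative sketch: TRE from corresponding intrinsic fiducials and the
# 95% confidence ellipsoid volume of the 3D error components.
import numpy as np
from scipy.stats import chi2

def tre(fixed_fiducials, registered_fiducials):
    """Euclidean distance between corresponding fiducials after registration (mm)."""
    return np.linalg.norm(registered_fiducials - fixed_fiducials, axis=1)

def confidence_ellipsoid_volume(error_vectors, confidence=0.95):
    """Volume of the confidence ellipsoid fitted to 3D error vectors."""
    cov = np.cov(error_vectors, rowvar=False)      # 3x3 covariance of x, y, z errors
    eigvals = np.linalg.eigvalsh(cov)               # principal-axis variances
    k = chi2.ppf(confidence, df=3)                  # 95% quantile, 3 degrees of freedom
    semi_axes = np.sqrt(k * eigvals)                # ellipsoid semi-axis lengths
    return 4.0 / 3.0 * np.pi * np.prod(semi_axes)

# Hypothetical example: 60 fiducial pairs with roughly 1.5 mm residual misalignment.
rng = np.random.default_rng(0)
fixed = rng.uniform(0, 40, size=(60, 3))
registered = fixed + rng.normal(0, 0.9, size=(60, 3))
errors = registered - fixed
print(f"mean TRE: {tre(fixed, registered).mean():.2f} mm")
print(f"95% ellipsoid volume: {confidence_ellipsoid_volume(errors) / 1000:.3f} cm^3")
```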


Medical Physics | 2008

Design and evaluation of a 3D transrectal ultrasound prostate biopsy system.

Derek W. Cool; Shi Sherebrin; Jonathan I. Izawa; Joseph L. Chin; Aaron Fenster

Biopsy of the prostate using ultrasound guidance is the clinical gold standard for diagnosis of prostate adenocarcinoma. The current prostate biopsy procedure is limited to using 2D transrectal ultrasound (TRUS) images to target biopsy sites and record biopsy core locations for postbiopsy confirmation. Localization of the 2D image in its actual 3D position is ambiguous and limits procedural accuracy and reproducibility. We have developed a 3D TRUS prostate biopsy system that provides 3D intrabiopsy information for needle guidance and biopsy location recording. The system conforms to the workflow and imaging technology of the current biopsy procedure, making it easier for clinical integration. In this paper, we describe the system design and validate the system accuracy by performing mock biopsies on US/CT multimodal patient-specific prostate phantoms. Our biopsy system generated 3D patient-specific models of the prostate with volume errors less than 3.5% and mean boundary errors of less than 1 mm. Using the 3D biopsy system, needles were guided to within 2.3 +/- 1.0 mm of 3D targets and with a high probability of biopsying clinically significant tumors. The positions of the actual biopsy sites were accurately localized to within 1.5 +/- 0.8 mm.
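
A minimal sketch of one of the validation measures mentioned above, the mean boundary error between a segmented and a ground-truth prostate surface, here approximated as a symmetric nearest-neighbour point-cloud distance. The surfaces are synthetic and this particular distance definition is an assumption, not necessarily the paper's exact metric.

```python
# Illustrative sketch (hypothetical data, not the authors' implementation):
# mean boundary error between two surfaces sampled as 3D point clouds.
import numpy as np
from scipy.spatial import cKDTree

def mean_boundary_error(surface_a, surface_b):
    """Symmetric mean nearest-neighbour distance between two point clouds (mm)."""
    d_ab = cKDTree(surface_b).query(surface_a)[0]   # A -> B distances
    d_ba = cKDTree(surface_a).query(surface_b)[0]   # B -> A distances
    return 0.5 * (d_ab.mean() + d_ba.mean())

# Hypothetical ground-truth and segmented boundary points (mm).
rng = np.random.default_rng(1)
truth = rng.uniform(0, 40, size=(2000, 3))
segmented = truth + rng.normal(0, 0.5, size=truth.shape)
print(f"mean boundary error: {mean_boundary_error(segmented, truth):.2f} mm")
```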


American Journal of Roentgenology | 2015

Evaluation of MRI-TRUS Fusion Versus Cognitive Registration Accuracy for MRI-Targeted, TRUS-Guided Prostate Biopsy

Derek W. Cool; Xuli Zhang; Cesare Romagnoli; Jonathan I. Izawa; Walter Romano; Aaron Fenster

OBJECTIVE The purpose of this article is to compare transrectal ultrasound (TRUS) biopsy accuracies of operators with different levels of prostate MRI experience using cognitive registration versus MRI-TRUS fusion to assess the preferred method of TRUS prostate biopsy for MRI-identified lesions. SUBJECTS AND METHODS One hundred patients from a prospective prostate MRI-TRUS fusion biopsy study were reviewed to identify all patients with clinically significant prostate adenocarcinoma (PCA) detected on MRI-targeted biopsy. Twenty-five PCA tumors were incorporated into a validated TRUS prostate biopsy simulator. Three prostate biopsy experts, each with a different level of experience in prostate MRI and MRI-TRUS fusion biopsy, performed a total of 225 simulated targeted biopsies on the MRI lesions as well as regional biopsy targets. Simulated biopsies performed using cognitive registration with 2D TRUS and 3D TRUS were compared with biopsies performed under MRI-TRUS fusion. RESULTS Two-dimensional and 3D TRUS sampled only 48% and 45% of clinically significant PCA MRI lesions, respectively, compared with 100% with MRI-TRUS fusion. Lesion sampling accuracy did not vary significantly according to operator experience or tumor volume. MRI-TRUS fusion-naïve operators showed consistent errors in targeting of the apex, midgland, and anterior targets, suggesting that there is biased error in cognitive registration. The MRI-TRUS fusion expert correctly targeted the prostate apex; however, his midgland and anterior mistargeting was similar to that of the less-experienced operators. CONCLUSION MRI-targeted TRUS-guided prostate biopsy using cognitive registration appears to be inferior to MRI-TRUS fusion, with fewer than 50% of clinically significant PCA lesions successfully sampled. No statistically significant difference in biopsy accuracy was seen according to operator experience with prostate MRI or MRI-TRUS fusion.


Medical Physics | 2013

2D-3D rigid registration to compensate for prostate motion during 3D TRUS-guided biopsy.

Tharindu De Silva; Aaron Fenster; Derek W. Cool; Lori Gardi; Cesare Romagnoli; Jagath Samarabandu; Aaron D. Ward

PURPOSE Three-dimensional (3D) transrectal ultrasound (TRUS)-guided systems have been developed to improve targeting accuracy during prostate biopsy. However, prostate motion during the procedure is a potential source of error that can cause target misalignments. The authors present an image-based registration technique to compensate for prostate motion by registering the live two-dimensional (2D) TRUS images acquired during the biopsy procedure to a preacquired 3D TRUS image. The registration must be performed both accurately and quickly in order to be useful during the clinical procedure. METHODS The authors implemented an intensity-based 2D-3D rigid registration algorithm optimizing the normalized cross-correlation (NCC) metric using Powell's method. The 2D TRUS images acquired during the procedure prior to biopsy gun firing were registered to the baseline 3D TRUS image acquired at the beginning of the procedure. The accuracy was measured by calculating the target registration error (TRE) using manually identified fiducials within the prostate; these fiducials were used for validation only and were not provided as inputs to the registration algorithm. The authors also evaluated the accuracy when the registrations were performed continuously throughout the biopsy by acquiring and registering live 2D TRUS images every second. This measured the improvement in accuracy resulting from performing the registration continuously, compensating for motion during the procedure. To further validate the method using a more challenging data set, registrations were performed using 3D TRUS images acquired by intentionally exerting different levels of ultrasound probe pressure in order to measure the performance of the algorithm when the prostate tissue was intentionally deformed. In this data set, biopsy scenarios were simulated by extracting 2D frames from the 3D TRUS images and registering them to the baseline 3D image. A graphics processing unit (GPU)-based implementation was used to improve the registration speed. The authors also studied the correlation between NCC and TREs. RESULTS The root-mean-square (RMS) TRE of registrations performed prior to biopsy gun firing was found to be 1.87 ± 0.81 mm. This was an improvement over 4.75 ± 2.62 mm before registration. When the registrations were performed every second during the biopsy, the RMS TRE was reduced to 1.63 ± 0.51 mm. For 3D data sets acquired under different probe pressures, the RMS TRE was found to be 3.18 ± 1.6 mm. This was an improvement from 6.89 ± 4.1 mm before registration. With the GPU-based implementation, the registrations were performed with a mean time of 1.1 s. The TRE showed a weak correlation with the similarity metric. However, the authors measured a generally convex shape of the metric around the ground truth, which may explain the rapid convergence of their algorithm to accurate results. CONCLUSIONS Registration to compensate for prostate motion during 3D TRUS-guided biopsy can be performed with a measured accuracy of less than 2 mm and a speed of 1.1 s, which is an important step toward improving the targeting accuracy of a 3D TRUS-guided biopsy system.
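
To make the registration idea concrete, below is a much-simplified, hypothetical sketch: a translation-only 2D-to-3D registration that maximizes normalized cross-correlation with Powell's method. It stands in for the full rigid (rotation plus translation) search and the GPU implementation described in the paper.

```python
# Simplified sketch of the intensity-based 2D-3D idea (not the authors' code):
# register a live 2D frame to a 3D volume by maximizing NCC over a translation
# using Powell's method. A full implementation would also optimize rotations.
import numpy as np
from scipy import ndimage
from scipy.optimize import minimize

def ncc(a, b):
    """Normalized cross-correlation between two equally sized images."""
    a = (a - a.mean()) / (a.std() + 1e-8)
    b = (b - b.mean()) / (b.std() + 1e-8)
    return float((a * b).mean())

def reslice(volume, tx, ty, tz):
    """Extract an axial 2D slice from the volume at a translated position."""
    shifted = ndimage.shift(volume, (tz, ty, tx), order=1, mode="nearest")
    return shifted[volume.shape[0] // 2]

def register(live_2d, volume):
    cost = lambda p: -ncc(live_2d, reslice(volume, *p))   # minimize negative NCC
    result = minimize(cost, x0=np.zeros(3), method="Powell")
    return result.x, -result.fun

# Hypothetical data: a smooth synthetic volume and a "live" frame extracted
# at a known translation, which the optimizer should approximately recover.
rng = np.random.default_rng(2)
volume = ndimage.gaussian_filter(rng.random((32, 64, 64)), 2)
live = reslice(volume, 1.5, -2.0, 0.5)
params, score = register(live, volume)
print("recovered translation (x, y, z):", np.round(params, 2), "NCC:", round(score, 3))
```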


Medical Physics | 2013

A 3D ultrasound scanning system for image guided liver interventions

Hamid Neshat; Derek W. Cool; Kevin Barker; Lori Gardi; Nirmal Kakani; Aaron Fenster

PURPOSE Two-dimensional ultrasound (2D US) imaging is commonly used for diagnostic and intraoperative guidance of interventional liver procedures; however, 2D US lacks volumetric information that may benefit interventional procedures. Over the past decade, three-dimensional ultrasound (3D US) has been developed to provide the missing spatial information. 3D US image acquisition is mainly based on mechanical, electromagnetic, and freehand tracking of conventional 2D US transducers, or 2D array transducers available on high-end machines. These approaches share many problems during clinical use for interventional liver imaging due to lack of flexibility and compatibility with interventional equipment, limited field-of-view (FOV), and significant capital cost compared to the benefits they introduce. In this paper, a novel system for mechanical 3D US scanning is introduced to address these issues. METHODS The authors have developed a handheld mechanical 3D US system that incorporates mechanical translation and tilt sector sweeping of any standard 2D US transducer to acquire 3D images. Each mechanical scanning function can be operated independently or may be combined to allow for a hybrid wide-FOV acquisition. The hybrid motion mode facilitates registration of other modalities (e.g., CT or MRI) to the intraoperative 3D US images by providing a larger FOV in which to acquire anatomical information. The tilting mechanism of the developed mover allows image acquisition in the intercostal rib space to avoid acoustic shadowing from bone. The geometric and volumetric scanning validity of the 3D US system was evaluated on tissue-mimicking US phantoms for different modes of operation. Identical experiments were performed on a commercially available 3D US system for direct comparison. To replicate a clinical scenario, the authors evaluated their 3D US system by comparing it to CT for measurement of the angle and distance between interventional needles in different configurations, similar to those used for percutaneous ablation of liver tumors. RESULTS The mean geometrical hybrid 3D reconstruction error measured from scanning of a known string phantom was less than 1 mm in two directions and 2.5 mm in the scanning direction, which was comparable to or better than the same measurements obtained from a commercially available 3D US system. The error in volume measurements of spherical phantom models depended on the depth of the object. For a 20 cm(3) model at a depth of 15 cm, a standard depth for liver imaging, the mean error was 3.6% ± 4.5%, comparable to the 2.3% ± 1.8% error for the 3D US commercial system. The error in 3D US measurement of the tip distance and angle between two microwave ablation antennas inserted into the phantom was 0.9 ± 0.5 mm and 1.1° ± 0.7°, respectively. CONCLUSIONS A 3D US system with hybrid scanning motions for large field-of-view 3D abdominal imaging has been developed and validated. The superior spatial information provided by 3D US might enhance image guidance for percutaneous interventional treatment of liver malignancies. The system has potential to be integrated with other liver procedures and has application in other abdominal organs such as the kidneys, spleen, or adrenal glands.
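
The core geometric step of mechanical 3D reconstruction, mapping pixels of a 2D frame acquired at a known tilt angle into scanner coordinates, can be sketched as follows. The pivot geometry and axis conventions here are illustrative assumptions, not the device's calibration.

```python
# Minimal geometry sketch (assumptions, not the device's software): mapping
# in-plane frame coordinates acquired during a tilt sweep into 3D coordinates.
import numpy as np

def frame_to_world(u_mm, v_mm, tilt_deg, pivot_offset_mm=0.0):
    """Map frame coordinates (u: lateral, v: depth) at a given tilt angle about
    the probe's lateral axis into (x, y, z) world coordinates."""
    t = np.deg2rad(tilt_deg)
    r = v_mm + pivot_offset_mm            # distance from tilt pivot along the beam
    x = u_mm                              # lateral axis is unchanged by the tilt
    y = r * np.sin(t)                     # elevational displacement
    z = r * np.cos(t)                     # depth after rotation
    return np.array([x, y, z])

# Example: a pixel 30 mm deep and 10 mm lateral, in frames at -15, 0, +15 degrees.
for angle in (-15.0, 0.0, 15.0):
    print(angle, np.round(frame_to_world(10.0, 30.0, angle), 2))
```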


MICCAI'11 Proceedings of the 2011 International Conference on Prostate Cancer Imaging: Image Analysis and Image-Guided Interventions | 2011

Fusion of MRI to 3D TRUS for mechanically-assisted targeted prostate biopsy: system design and initial clinical experience

Derek W. Cool; Jeff Bax; Cesare Romagnoli; Aaron D. Ward; Lori Gardi; Vaishali Karnik; Jonathan I. Izawa; Joseph L. Chin; Aaron Fenster

This paper presents a mechanically-assisted 3D transrectal ultrasound (TRUS) biopsy system with MRI-3D TRUS fusion. The 3D TRUS system employs a 4 degree-of-freedom linkage for real-time TRUS probe tracking. MRI-TRUS fusion is achieved using a surface-based nonlinear registration incorporating thin-plate splines to provide real-time overlays of suspicious MRI lesions on 3D TRUS for intrabiopsy targeted needle guidance. Clinical use of the system is demonstrated on a prospective cohort study of 25 patients with clinical findings concerning for prostate adenocarcinoma (PCa). The MRI-3D TRUS registration accuracy is quantified and compared with alternative algorithms for optimal performance. Results of the clinical study demonstrated a significantly higher rate of positive biopsy cores and a higher Gleason score cancer grading for targeted biopsies using the MRI-3D TRUS fusion as compared to the standard 12-core sextant biopsy distribution. Lesion targeted biopsy cores that were positive for PCa contained a significantly higher percentage of tumor within each biopsy sample compared to the sextant cores and in some patients resulted in identifying higher risk disease.
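
A hedged sketch of the surface-based thin-plate-spline idea: given corresponding MRI and 3D TRUS prostate surface points, build a smooth displacement field and use it to map an MRI lesion into TRUS space. This uses SciPy's generic RBF interpolator with a thin-plate-spline kernel as a stand-in for the authors' implementation, and all coordinates are hypothetical.

```python
# Illustrative thin-plate-spline warp between corresponding surface points.
import numpy as np
from scipy.interpolate import RBFInterpolator   # SciPy >= 1.7

def tps_warp(mri_surface_pts, trus_surface_pts):
    """Return a function mapping MRI-space points into TRUS space, built from
    corresponding surface point pairs."""
    displacement = trus_surface_pts - mri_surface_pts
    interp = RBFInterpolator(mri_surface_pts, displacement, kernel="thin_plate_spline")
    return lambda pts: pts + interp(pts)

# Hypothetical corresponding surface points (mm) and an MRI lesion centroid.
rng = np.random.default_rng(3)
mri_pts = rng.uniform(0, 40, size=(200, 3))
trus_pts = mri_pts + np.array([1.0, -2.0, 0.5]) + rng.normal(0, 0.3, size=mri_pts.shape)
warp = tps_warp(mri_pts, trus_pts)
lesion_mri = np.array([[20.0, 18.0, 22.0]])
print("lesion mapped into TRUS space:", np.round(warp(lesion_mri), 2))
```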


Radiology | 2010

Repeat Prostate Biopsy Accuracy: Simulator-based Comparison of Two- and Three-dimensional Transrectal US Modalities

Derek W. Cool; Michael J. Connolly; Shi Sherebrin; Roy Eagleson; Jonathan I. Izawa; Justin Amann; Cesare Romagnoli; Walter Romano; Aaron Fenster

PURPOSE To compare the accuracy of biopsy with two-dimensional (2D) transrectal ultrasonography (US) with that of biopsy with conventional three-dimensional (3D) transrectal US and biopsy with guided 3D transrectal US in the guidance of repeat prostate biopsy procedures in a prostate biopsy simulator. MATERIALS AND METHODS The institutional review board approved this retrospective study. Five residents and five experts performed repeat biopsies with a biopsy simulator that contained the transrectal US prostate images of 10 patients who had undergone biopsy. Simulated repeat biopsies were performed with 2D transrectal US, conventional 3D transrectal US, and guided 3D transrectal US (an extension of 3D transrectal US that enables active display of biopsy targets). The modalities were compared on the basis of time per biopsy and how accurately simulated repeat biopsies could be guided to specific targets. The probability for successful biopsy of a repeat target was calculated for each modality. RESULTS Guided 3D transrectal US was significantly (P < .01) more accurate for simulated biopsy of repeat targets than was 2D or 3D transrectal US, with a biopsy accuracy of 0.86 mm +/- 0.47 (standard deviation), 3.68 mm +/- 2.60, and 3.60 mm +/- 2.57, respectively. Experts had a 70% probability of sampling a prior biopsy target volume of 0.5 cm(3) with 2D transrectal US; however, the probability approached 100% with guided 3D transrectal US. Biopsy accuracy was not significantly different between experts and residents for any modality; however, experts were significantly (P < .05) faster than residents with each modality. CONCLUSION Repeat biopsy of the prostate with 2D transrectal US has limited accuracy. Compared with 2D transrectal US, the biopsy accuracy of both experts and residents improved with guided 3D transrectal US but did not improve with conventional 3D transrectal US.
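
The notion of a probability of sampling a prior target volume can be illustrated with a back-of-the-envelope Monte Carlo model: assume an isotropic Gaussian aiming error perpendicular to the needle and count how often the core passes through a sphere of the stated volume. The error magnitudes below are loosely inspired by the reported accuracies, and the model is an assumption, not the study's simulator.

```python
# Back-of-the-envelope sketch (my assumptions, not the study's method).
import numpy as np

def hit_probability(target_volume_cc, aiming_error_std_mm, n_trials=200_000, seed=0):
    radius_mm = (3.0 * target_volume_cc * 1000.0 / (4.0 * np.pi)) ** (1.0 / 3.0)
    rng = np.random.default_rng(seed)
    # Aiming error in the two directions perpendicular to the needle axis;
    # the core is assumed long enough to pass through the target plane.
    offsets = rng.normal(0.0, aiming_error_std_mm, size=(n_trials, 2))
    return float((np.linalg.norm(offsets, axis=1) <= radius_mm).mean())

# Hypothetical aiming errors loosely inspired by the reported accuracies.
print("2D TRUS-like error (~3.7 mm):", round(hit_probability(0.5, 3.7), 2))
print("guided 3D TRUS-like error (~0.9 mm):", round(hit_probability(0.5, 0.9), 2))
```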


Medical Physics | 2010

Temporal‐based needle segmentation algorithm for transrectal ultrasound prostate biopsy procedures

Derek W. Cool; Lori Gardi; Cesare Romagnoli; Manale Saikaly; Jonathan I. Izawa; Aaron Fenster

PURPOSE Automatic identification of the biopsy-core tissue location during a prostate biopsy procedure would provide verification that targets were adequately sampled and would allow for appropriate intraprocedure biopsy target modification. Localization of the biopsy core requires accurate segmentation of the biopsy needle and needle tip from transrectal ultrasound (TRUS) biopsy images. A temporal-based TRUS needle segmentation algorithm was developed specifically for the prostate biopsy procedure to automatically identify the TRUS image containing the biopsy needle from a collection of 2D TRUS images and to segment the biopsy-core location from the 2D TRUS image. METHODS The temporal-based segmentation algorithm performs a temporal analysis on a series of biopsy TRUS images collected throughout needle insertion and withdrawal. Following the identification of points of needle insertion and retraction, the needle axis is segmented using a Hough transform-based algorithm, which is followed by a temporospectral TRUS analysis to identify the biopsy-needle tip. Validation of the temporal-based algorithm is performed on 108 TRUS biopsy sequences collected from the procedures of ten patients. The success of the temporal search to identify the proper images was manually assessed, while the accuracies of the needle-axis and needle-tip segmentations were quantitatively compared to implementations of two other needle segmentation algorithms within the literature. RESULTS The needle segmentation algorithm demonstrated a >99% accuracy in identifying the TRUS image at the moment of needle insertion from the collection of real-time TRUS images throughout the insertion and withdrawal of the biopsy needle. The segmented biopsy-needle axes were accurate to within 2.3 +/- 2.0 degrees and 0.48 +/- 0.42 mm of the gold standard. Identification of the needle tip to within half of the biopsy-core length (<10 mm) was 95% successful with a mean error of 2.4 +/- 4.0 mm. Needle-tip detection using the temporal-based algorithm was significantly more accurate (p < 0.001) than the other two algorithms tested, while the segmentation of the needle axis was not significantly different between the three algorithms. CONCLUSIONS The temporal-based needle segmentation algorithm accurately segments the location of the biopsy core from 2D TRUS images of clinical prostate biopsy procedures. The results for needle-tip localization demonstrated that the temporal-based algorithm is significantly more accurate than implementations of some existing needle segmentation algorithms within the literature.
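
As an illustration of the Hough-transform building block mentioned for needle-axis segmentation, the sketch below detects the strongest straight line in a synthetic 2D frame. It is not the published temporal-based algorithm, and the edge-detection and peak-selection choices are assumptions.

```python
# Illustrative sketch: detect a straight, bright needle axis in a 2D frame
# with a Hough transform, one building block of needle-axis segmentation.
import numpy as np
from skimage.feature import canny
from skimage.transform import hough_line, hough_line_peaks

def needle_axis(image_2d):
    """Return (angle_rad, distance_px) of the strongest straight line in the frame."""
    edges = canny(image_2d, sigma=2.0)                      # edge map of the frame
    h, theta, d = hough_line(edges)                         # accumulate line votes
    _, angles, dists = hough_line_peaks(h, theta, d, num_peaks=1)
    return angles[0], dists[0]

# Synthetic frame with a bright diagonal "needle" plus speckle-like noise.
rng = np.random.default_rng(4)
frame = rng.random((200, 200)) * 0.2
rr = np.arange(40, 160)
for off in (9, 10, 11):
    frame[rr, rr + off] = 1.0                               # hypothetical needle track
angle, dist = needle_axis(frame)
print(f"needle angle: {np.degrees(angle):.1f} deg, distance: {dist:.1f} px")
```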


International Conference of the IEEE Engineering in Medicine and Biology Society | 2014

3D ultrasound imaging in image-guided intervention.

Aaron Fenster; Jeff Bax; Hamid Neshat; Derek W. Cool; Nirmal Kakani; Cesare Romagnoli

Ultrasound imaging is used extensively in diagnosis and image-guidance for interventions of human diseases. However, conventional 2D ultrasound suffers from limitations since it can only provide 2D images of 3-dimensional structures in the body. Thus, measurement of organ size is variable, and guidance of interventions is limited, as the physician is required to mentally reconstruct the 3-dimensional anatomy using 2D views. Over the past 20 years, a number of 3-dimensional ultrasound imaging approaches have been developed. We have developed an approach that is based on a mechanical mechanism to move any conventional ultrasound transducer while 2D images are collected rapidly and reconstructed into a 3D image. In this presentation, 3D ultrasound imaging approaches will be described for use in image-guided interventions.

Collaboration


Derek W. Cool's top co-authors and their affiliations.

Aaron Fenster, University of Western Ontario
Aaron D. Ward, University of Western Ontario
Cesare Romagnoli, University of Western Ontario
Lori Gardi, Robarts Research Institute
Jonathan I. Izawa, University of Western Ontario
Joseph L. Chin, University of Western Ontario
Eli Gibson, University College London
Shi Sherebrin, Robarts Research Institute
Jeffrey Bax, University of Western Ontario