Publication


Featured research published by Senthil Periaswamy.


Medical Physics | 2009

Registration of prone and supine CT colonography scans using correlation optimized warping and canonical correlation analysis

Shijun Wang; Jianhua Yao; Jiamin Liu; Nicholas Petrick; Robert L. Van Uitert; Senthil Periaswamy; Ronald M. Summers

PURPOSE In computed tomographic colonography (CTC), a patient is scanned twice, once supine and once prone, to improve the sensitivity for polyp detection. To assist radiologists in CTC reading, in this paper we propose an automated method for colon registration from supine and prone CTC scans. METHODS We propose a new colon centerline registration method for prone and supine CTC scans using correlation optimized warping (COW) and canonical correlation analysis (CCA) based on the anatomical structure of the colon. Four salient anatomical points on the colon are first automatically identified. Correlation optimized warping is then applied to the segments defined by these anatomical landmarks to improve the global registration based on the local correlation of segments. The COW method was modified by embedding canonical correlation analysis to allow multiple features along the colon centerline to be used in our implementation. RESULTS We tested the COW algorithm on a CTC data set of 39 patients with 39 polyps (19 training and 20 test cases) to verify the effectiveness of the proposed registration method. Experimental results on the test set show that the COW method significantly reduces the average polyp localization error between supine and prone scans by 67.6%, from 46.27 ± 52.97 mm to 14.98 ± 11.41 mm, compared to the normalized distance along the colon centerline algorithm (p < 0.01). CONCLUSIONS The proposed COW algorithm is more accurate for colon centerline registration than the normalized distance along the colon centerline method and the dynamic time warping method. Comparison results showed that the feature combination of z-coordinate and curvature achieved the lowest registration error among the feature combinations used by COW. The proposed method is tolerant to centerline errors because the anatomical landmarks help prevent the propagation of errors across the entire colon centerline.
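As a rough illustration of the canonical correlation analysis step named in the abstract, the sketch below (hypothetical data and feature choices, using scikit-learn's CCA) projects two per-point centerline feature sets, such as z-coordinate and curvature, onto maximally correlated components that a warping method like COW could then align. It is not the authors' implementation.

```python
# Illustrative sketch only: projects multiple per-point centerline features
# (e.g., z-coordinate and curvature, as mentioned in the abstract) from the
# supine and prone scans onto maximally correlated components with CCA.
# The arrays and feature choices here are hypothetical, not the authors' data.
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(0)
n_points = 200                                   # resampled centerline points per scan

# Hypothetical feature matrices: columns are [z-coordinate, curvature].
supine_features = rng.standard_normal((n_points, 2))
prone_features = 0.8 * supine_features + 0.2 * rng.standard_normal((n_points, 2))

cca = CCA(n_components=1)
supine_proj, prone_proj = cca.fit_transform(supine_features, prone_features)

# The 1-D projections could then be aligned with a warping method such as COW.
r = np.corrcoef(supine_proj.ravel(), prone_proj.ravel())[0, 1]
print(f"canonical correlation of first component: {r:.3f}")
```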


American Journal of Roentgenology | 2013

Fully Automated Prostate Segmentation on MRI: Comparison With Manual Segmentation Methods and Specimen Volumes

Baris Turkbey; Sergei V. Fotin; Robert Huang; Yin Yin; Dagane Daar; Omer Aras; Marcelino Bernardo; Brian Garvey; Juanita Weaver; Hrishikesh Haldankar; Naira Muradyan; Maria J. Merino; Peter A. Pinto; Senthil Periaswamy; Peter L. Choyke

OBJECTIVE The objective of our study was to compare calculated prostate volumes derived from tridimensional MR measurements (ellipsoid formula), manual segmentation, and a fully automated segmentation system, as validated by actual prostatectomy specimens. MATERIALS AND METHODS Ninety-eight consecutive patients (median age, 60.6 years; median prostate-specific antigen [PSA] value, 6.85 ng/mL) underwent triplane T2-weighted MRI on a 3-T magnet with an endorectal coil while undergoing diagnostic workup for prostate cancer. Prostate volume estimates were determined using the formula for ellipsoid volume based on tridimensional measurements, manual segmentation of triplane MRI, and automated segmentation based on normalized gradient fields cross-correlation and graph-search refinement. Estimates of prostate volume based on the ellipsoid formula, manual segmentation, and automated segmentation were compared with prostatectomy specimen volumes. Prostate volume estimates were compared using the Pearson correlation coefficient and linear regression analysis. The Dice similarity coefficient was used to quantify spatial agreement between manual segmentation and automated segmentation. RESULTS The Pearson correlation coefficient revealed a strong positive correlation between prostatectomy specimen volume and prostate volume estimates derived from manual segmentation (R = 0.89-0.91, p < 0.0001) and automated segmentation (R = 0.88-0.91, p < 0.0001). No difference was observed between manual segmentation and automated segmentation. Mean partial and full Dice similarity coefficients of 0.92 and 0.89, respectively, were achieved for axial automated segmentation. CONCLUSION A fully automated 3D segmentation tool based on normalized gradient fields cross-correlation and graph-search refinement yields highly accurate prostate volume estimates in a clinically relevant time of 10 seconds. This tool will assist in developing a broad range of applications including routine prostate volume estimation, image registration, biopsy guidance, and decision support systems.
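For readers unfamiliar with the two agreement measures used above, the following sketch (with hypothetical masks and volumes) computes a Dice similarity coefficient between two binary segmentations and a Pearson correlation between volume estimates; it is illustrative only, not the study's analysis code.

```python
# Minimal sketch of the agreement measures named in the abstract: the Dice
# similarity coefficient between two binary segmentation masks and the Pearson
# correlation between volume estimates. All inputs below are hypothetical.
import numpy as np
from scipy.stats import pearsonr

def dice_coefficient(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    """Dice = 2|A ∩ B| / (|A| + |B|) for boolean masks of equal shape."""
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

# Hypothetical example: manual vs. automated masks and volume estimates.
manual = np.zeros((64, 64, 32), dtype=bool)
manual[16:48, 16:48, 8:24] = True
auto = np.zeros_like(manual)
auto[18:50, 16:48, 8:24] = True
print("Dice:", round(dice_coefficient(manual, auto), 3))

specimen_volumes = np.array([35.2, 48.1, 60.5, 41.7, 55.0])   # mL, hypothetical
estimated_volumes = np.array([33.9, 50.3, 58.8, 43.2, 53.6])  # mL, hypothetical
r, p = pearsonr(specimen_volumes, estimated_volumes)
print(f"Pearson r = {r:.2f}, p = {p:.3g}")
```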


Diagnostic and Interventional Radiology | 2014

Clinical value of prostate segmentation and volume determination on MRI in benign prostatic hyperplasia.

Brian Garvey; Baris Turkbey; Hong Truong; Marcelino Bernardo; Senthil Periaswamy; Peter L. Choyke

Benign prostatic hyperplasia (BPH) is a nonmalignant pathological enlargement of the prostate, which occurs primarily in the transitional zone. BPH is highly prevalent and is a major cause of lower urinary tract symptoms in aging males, although there is no direct relationship between prostate volume and symptom severity. The progression of BPH can be quantified by measuring the volumes of the whole prostate and its zones, based on image segmentation on magnetic resonance imaging. Prostate volume determination via segmentation is a useful measure for patients undergoing therapy for BPH. However, prostate segmentation is not widely used due to the excessive time required for even experts to manually map the margins of the prostate. Here, we review and compare new methods of prostate volume segmentation using both manual and automated methods, including the ellipsoid formula, manual planimetry, and semiautomated and fully automated segmentation approaches. We highlight the utility of prostate segmentation in the clinical context of assessing BPH.
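The ellipsoid formula mentioned above is the standard prolate ellipsoid approximation, volume ≈ π/6 × length × width × height, applied to three orthogonal prostate diameters measured on MRI. A minimal sketch with hypothetical measurements:

```python
# Sketch of the ellipsoid formula referred to in the review:
# volume ≈ π/6 × length × width × height, using the three orthogonal prostate
# diameters measured on triplane MRI. The measurements below are hypothetical.
import math

def ellipsoid_volume(length_cm: float, width_cm: float, height_cm: float) -> float:
    """Prostate volume estimate in mL (1 cm^3 == 1 mL)."""
    return math.pi / 6.0 * length_cm * width_cm * height_cm

print(round(ellipsoid_volume(5.0, 4.4, 3.8), 1), "mL")  # ≈ 43.8 mL
```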


Proceedings of SPIE | 2012

Fully automated 3D prostate central gland segmentation in MR images: a LOGISMOS based approach

Yin Yin; Sergei V. Fotin; Senthil Periaswamy; Justin Kunz; Hrishikesh Haldankar; Naira Muradyan; Baris Turkbey; Peter L. Choyke

One widely accepted classification of the prostate divides it into a central gland (CG) and a peripheral zone (PZ). In some clinical applications, separating the CG and PZ within the whole prostate is useful. For instance, in prostate cancer detection, radiologists want to know in which zone a cancer occurs. Another application is multiparametric MR tissue characterization. In prostate T2-weighted MR images, the high intensity variation between the CG and PZ makes automated differentiation of the two zones difficult. Previously, we developed an automated prostate boundary segmentation system, which was tested on large datasets and showed good performance. Using the results of the pre-segmented prostate boundary, in this paper we propose an automated CG segmentation algorithm based on Layered Optimal Graph Image Segmentation of Multiple Objects and Surfaces (LOGISMOS). The LOGISMOS model incorporates both shape and topology information during deformation. We generated the graph cost by training classifiers and used a coarse-to-fine search. The LOGISMOS framework guarantees an optimal solution with respect to the cost and shape constraints. A five-fold cross-validation approach was applied to a training dataset of 261 images to optimize system performance and to compare against a voxel-classification-based reference approach. After the best parameter settings were found, the system was tested on a dataset of another 261 images. The mean DSC of 0.81 on the test set indicates that our approach is promising for automated CG segmentation. Running time for the system is about 15 seconds.
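LOGISMOS itself is a layered, multi-surface 3D graph optimization; as a greatly simplified illustration of the underlying graph-search idea, the toy sketch below finds a minimum-cost boundary in a 2D polar cost map under a smoothness constraint. It is an assumption-laden teaching example, not the authors' implementation.

```python
# Greatly simplified illustration of graph-search boundary refinement: for each
# angular column of a polar cost map, pick the radius that minimizes total
# boundary cost subject to a smoothness constraint between neighboring columns.
# This is a toy 2-D dynamic program, not the layered 3-D LOGISMOS optimization.
import numpy as np

def optimal_boundary(cost: np.ndarray, max_jump: int = 2) -> np.ndarray:
    """cost[angle, radius] -> per-angle radius index of the minimum-cost path."""
    n_angles, n_radii = cost.shape
    dp = np.full((n_angles, n_radii), np.inf)
    back = np.zeros((n_angles, n_radii), dtype=int)
    dp[0] = cost[0]
    for i in range(1, n_angles):
        for r in range(n_radii):
            lo, hi = max(0, r - max_jump), min(n_radii, r + max_jump + 1)
            prev = int(np.argmin(dp[i - 1, lo:hi])) + lo   # best smooth predecessor
            dp[i, r] = dp[i - 1, prev] + cost[i, r]
            back[i, r] = prev
    # Trace back the optimal radius per angle.
    radii = np.zeros(n_angles, dtype=int)
    radii[-1] = int(np.argmin(dp[-1]))
    for i in range(n_angles - 1, 0, -1):
        radii[i - 1] = back[i, radii[i]]
    return radii

cost_map = np.random.default_rng(1).random((36, 20))  # hypothetical boundary costs
print(optimal_boundary(cost_map))
```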


Proceedings of SPIE | 2012

Fully automated prostate segmentation in 3D MR based on normalized gradient fields cross-correlation initialization and LOGISMOS refinement

Yin Yin; Sergei V. Fotin; Senthil Periaswamy; Justin Kunz; Hrishikesh Haldankar; Naira Muradyan; F. Cornud; Baris Turkbey; Peter L. Choyke

Manual delineation of the prostate is a challenging task for a clinician due to its complex and irregular shape. Furthermore, the need for precise delineation of the prostate boundary continues to grow. Planning for radiation therapy, MR-ultrasound fusion for image-guided biopsy, multiparametric MRI tissue characterization, and context-based organ retrieval are examples where accurate prostate delineation can play a critical role in a successful patient outcome. Therefore, a robust automated whole-prostate segmentation system is desired. In this paper, we present an automated prostate segmentation system for 3D MR images. In this system, the prostate is segmented in two steps: the prostate position and size are first detected, and then the boundary is refined by a shape model. The detection approach is based on normalized gradient fields cross-correlation. This approach is fast, robust to intensity variation, and accurate enough to initialize a prostate mean shape model. The refinement is based on a graph-search framework that incorporates both shape and topology information during deformation. We generated the graph cost using trained classifiers and used a coarse-to-fine search with region-specific classifier training. The proposed algorithm was developed using 261 training images and tested on another 290 cases. A mean DSC ranging from 0.89 to 0.91, depending on the evaluation subset, demonstrates state-of-the-art performance. Running time for the system is about 20 to 40 seconds depending on image size and resolution.
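The detection step above is described as normalized gradient fields (NGF) cross-correlation. The sketch below is a minimal illustration of an NGF-style similarity, gradients normalized to unit length and compared by their alignment, under stated assumptions; it is not the paper's actual detector.

```python
# Minimal sketch of a normalized-gradient-fields (NGF) similarity, the kind of
# intensity-invariant measure the detection step is based on: gradients are
# normalized to (near) unit length and compared by how well they align.
# Array shapes and the epsilon value are illustrative assumptions.
import numpy as np

def normalized_gradient_field(volume: np.ndarray, eps: float = 1e-2) -> np.ndarray:
    """Return the per-voxel (near-)unit gradient direction of a 3-D volume."""
    grads = np.stack(np.gradient(volume.astype(float)), axis=-1)   # (z, y, x, 3)
    norms = np.sqrt((grads ** 2).sum(axis=-1, keepdims=True) + eps ** 2)
    return grads / norms

def ngf_similarity(fixed: np.ndarray, moving: np.ndarray) -> float:
    """Mean squared alignment of gradient directions; 1.0 means identical fields."""
    f, m = normalized_gradient_field(fixed), normalized_gradient_field(moving)
    return float(((f * m).sum(axis=-1) ** 2).mean())

# Hypothetical usage: score how well a mean-shape template matches a subvolume.
rng = np.random.default_rng(0)
subvolume = rng.random((32, 32, 32))
template = subvolume + 0.05 * rng.random((32, 32, 32))
print(f"NGF similarity: {ngf_similarity(subvolume, template):.3f}")
```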


Proceedings of SPIE | 2016

Detection of soft tissue densities from digital breast tomosynthesis: comparison of conventional and deep learning approaches

Sergei V. Fotin; Yin Yin; Hrishikesh Haldankar; Jeffrey W. Hoffmeister; Senthil Periaswamy

Computer-aided detection (CAD) has been used in screening mammography for many years and is likely to be utilized for digital breast tomosynthesis (DBT). Higher detection performance is desirable, as it may have an impact on radiologists' decisions and clinical outcomes. Recently, algorithms based on deep convolutional architectures have been shown to achieve state-of-the-art performance in object classification and detection. Similarly, we trained a deep convolutional neural network directly on patches sampled from two-dimensional mammography and reconstructed DBT volumes and compared its performance to a conventional CAD algorithm based on the computation and classification of hand-engineered features. Detection performance was evaluated on an independent test set of 344 DBT reconstructions (GE SenoClaire 3D, iterative reconstruction algorithm) containing 328 suspicious and 115 malignant soft tissue densities, including masses and architectural distortions. Detection sensitivity was measured on a region of interest (ROI) basis at a rate of five detection marks per volume. Moving from the conventional to the deep learning approach increased ROI sensitivity from 0.832 ± 0.040 to 0.893 ± 0.033 for suspicious ROIs, and from 0.852 ± 0.065 to 0.930 ± 0.046 for malignant ROIs. These results indicate the high utility of deep feature learning in the analysis of DBT data and the high potential of the method for broader medical image analysis tasks.
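As a rough sketch of the kind of patch-level convolutional classifier the abstract describes, the PyTorch snippet below defines a small, hypothetical network; the architecture, patch size, and data are illustrative assumptions, not the authors' model.

```python
# Minimal sketch of a patch-level convolutional classifier of the general kind
# described in the abstract (trained on 2-D patches sampled from mammograms and
# reconstructed DBT slices). Architecture, patch size, and inputs are
# illustrative assumptions only.
import torch
import torch.nn as nn

class PatchClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, 1)    # logit: soft-tissue density vs. background

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).flatten(1))

model = PatchClassifier()
patches = torch.randn(8, 1, 64, 64)           # hypothetical 64x64 grayscale patches
scores = torch.sigmoid(model(patches))        # per-patch suspiciousness scores
print(scores.shape)                           # torch.Size([8, 1])
```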


International Conference of the IEEE Engineering in Medicine and Biology Society | 2012

Matching 3-D Prone and Supine CT Colonography Scans Using Graphs

Shijun Wang; Nicholas Petrick; R. L. Van Uitert; Senthil Periaswamy; Zhuoshi Wei; Ronald M. Summers

In this paper, we propose a new registration method for prone and supine computed tomographic colonography scans using graph matching. We formulate 3-D colon registration as a graph matching problem and propose a new graph matching algorithm based on mean field theory. In the proposed algorithm, we solve the matching problem iteratively. In each step, we use mean field theory to find the matched pair of nodes with the highest probability. During iterative optimization, one-to-one matching constraints are added to the system step by step. Prominent matching pairs found in previous iterations are used to guide subsequent mean field calculations. The proposed method achieved the best performance, with the smallest standard deviation, compared with two baseline algorithms: the normalized distance along the colon centerline (NDACC) method with manual colon centerline correction (p = 0.17) and spectral matching (p < 1e-5). A major advantage of the proposed method is that it is fully automatic and does not require a colon centerline for registration. For the NDACC method, in contrast, user interaction is almost always needed to identify the colon centerlines.


International Conference on Image Processing | 2010

Graph matching based on mean field theory

Shijun Wang; Nicholas Petrick; Robert L. Van Uitert; Senthil Periaswamy; Ronald M. Summers

In this paper, we propose a new graph matching algorithm based on mean field theory. We first convert the original graph matching problem, a quadratic integer programming problem, to a spin model with quadratic interactions by dropping the matching constraints. The matching constraints are then added back to the system iteratively after each round of mean field calculation. Prominent matching pairs found in previous iterations guide the mean field calculation in the next round. Experiments on the CMU house dataset and a CTC dataset show promising matching results.
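To make the iterative scheme concrete, the toy sketch below maintains a soft assignment matrix, updates it with a mean-field step driven by pairwise compatibilities, and progressively freezes the most confident match, mirroring the step-by-step addition of one-to-one constraints described above. All sizes and compatibilities are hypothetical; this is a simplified illustration, not the paper's algorithm.

```python
# Toy sketch of mean-field graph matching in the spirit described above: keep a
# soft assignment matrix, update each node's assignment probabilities from the
# mean field induced by pairwise compatibilities, and progressively freeze the
# most confident match as a one-to-one constraint. Sizes and compatibilities
# below are hypothetical.
import numpy as np

def mean_field_match(W, n_iters=20, beta=5.0):
    """W[i, a, j, b]: compatibility of matching node i->a together with node j->b."""
    n1, n2 = W.shape[0], W.shape[1]
    P = np.full((n1, n2), 1.0 / n2)           # soft assignment probabilities
    frozen = {}                                # node index -> matched target (constraints)
    for _ in range(n_iters):
        field = np.einsum('iajb,jb->ia', W, P)                     # mean field from beliefs
        P = np.exp(beta * (field - field.max(axis=1, keepdims=True)))
        for i, a in frozen.items():            # re-impose constraints from earlier rounds
            P[i, :] = 0.0
            P[i, a] = 1.0
            P[np.arange(n1) != i, a] = 0.0     # no other node may take the same target
        P /= P.sum(axis=1, keepdims=True)
        # Freeze the currently most confident, not-yet-constrained pair.
        masked = P.copy()
        if frozen:
            masked[list(frozen), :] = -1.0
            masked[:, list(frozen.values())] = -1.0
        i, a = np.unravel_index(int(np.argmax(masked)), masked.shape)
        if masked[i, a] > 0:
            frozen[int(i)] = int(a)
    return P

rng = np.random.default_rng(0)
W = rng.random((4, 5, 4, 5))                   # hypothetical pairwise compatibilities
assignment = mean_field_match(W)
print(assignment.argmax(axis=1))               # matched node in the second graph
```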


International Symposium on Biomedical Imaging | 2012

Supine and prone CT colonography registration by matching graphs of teniae coli

Zhuoshi Wei; Shijun Wang; Nicholas Petrick; Jianhua Yao; Senthil Periaswamy; Ronald M. Summers

This paper proposes a registration method for supine and prone CTC scans. The method matches graphs built from the teniae coli, three muscle bands that run the length of the colon. The teniae are visible on CTC and were detected using fully automatic software. Key points on the teniae were then obtained by non-uniform sampling, and graphs were built from these key points. Colon registration was formulated as a graph matching problem, and mean field theory was applied to match the graphs. The proposed method was tested on 10 pairs of supine and prone CTC scans. The average registration error was 2.5 cm (±0.7 cm; 95% CI, [2.1, 2.9] cm), a significant improvement over the baseline graph matching method for CTC registration.


Proceedings of SPIE | 2011

3D supine and prone colon registration for computed tomographic colonography scans based on graph matching

Shijun Wang; Nicholas Petrick; Robert L. Van Uitert; Senthil Periaswamy; Ronald M. Summers

In this paper, we propose a new registration method for supine and prone computed tomographic colonography scans based on graph matching. We formulated 3D colon registration as a graph matching problem and utilized a graph matching algorithm based on mean field theory. During the iterative optimization process, one-to-one matching constraints were added to the system step by step. Prominent matching pairs found in previous iterations are used to guide subsequent mean field calculations. The advantage of the proposed method is that it does not require a colon centerline for registration. We tested the algorithm on a CTC dataset of 19 patients with 19 polyps. The average registration error of the proposed method was 4.0 cm (SD, 2.1 cm), with a 95% confidence interval of [3.0 cm, 5.0 cm]. There was no significant difference between the proposed method and our previous method based on the normalized distance along the colon centerline (p = 0.1).

Collaboration


Dive into Senthil Periaswamy's collaborations.

Top Co-Authors

Ronald M. Summers
National Institutes of Health

Baris Turkbey
National Institutes of Health

Nicholas Petrick
Food and Drug Administration

Peter L. Choyke
National Institutes of Health

Shijun Wang
National Institutes of Health