
Publication


Featured research published by Alberto M. Biancardi.


International Conference of the IEEE Engineering in Medicine and Biology Society | 2009

A public image database to support research in computer aided diagnosis

Anthony P. Reeves; Alberto M. Biancardi; David F. Yankelevitz; Sergei V. Fotin; Brad M. Keller; Artit C. Jirapatnakul; James W. Lee

The Public Lung Database to address drug response (PLD) has been developed to support research in computer-aided diagnosis (CAD). Originally established for applications involving the characterization of pulmonary nodules, the PLD has been augmented to provide initial datasets for CAD research on other diseases. In general, the best performance for a CAD system is achieved when it is trained with a large amount of well-documented data. Such training databases are very expensive to create, and their lack of general availability limits the targets that can be considered for CAD applications and hampers development of the CAD field. The approach taken with the PLD has been to make available small datasets together with both manual and automated documentation. Furthermore, datasets with special properties are provided either to span the range of task complexity or to provide small-change repeat images for direct calibration and evaluation of CAD systems. This resource offers a starting point for other research groups wishing to pursue CAD research in new directions. It also provides an on-line reference for better defining the issues relating to specific CAD tasks.


Proceedings of SPIE | 2009

A multiscale Laplacian of Gaussian filtering approach to automated pulmonary nodule detection from whole-lung low-dose CT scans

Sergei V. Fotin; Anthony P. Reeves; Alberto M. Biancardi; David F. Yankelevitz; Claudia I. Henschke

The primary stage of a pulmonary nodule detection system is typically a candidate generator that efficiently provides the centroid location and size estimate of candidate nodules. A scale-normalized Laplacian of Gaussian (LoG) filtering method presented in this paper has been found to provide high sensitivity along with precise locality and size estimation. This approach involves a computationally efficient algorithm that is designed to identify all solid nodules in a whole-lung anisotropic CT scan. This nodule candidate generator has been evaluated in conjunction with a set of discriminative features that target both isolated and attached nodules. The entire detection system was evaluated with respect to a size-enriched dataset of 656 whole-lung low-dose CT scans containing 459 solid nodules with diameter greater than 4 mm. Using a soft-margin SVM classifier, at a false positive rate of 10 per scan, we obtained a sensitivity of 97% for isolated nodules, 93% for attached nodules, and 89% for both nodule types combined. Furthermore, the LoG filter was shown to have good agreement with the radiologist ground truth for size estimation.
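
The candidate generator described above reduces to a simple recipe: filter the volume with LoG kernels at several scales, multiply each response by sigma squared so responses are comparable across scales, and keep strong local maxima. A minimal sketch of that idea, assuming an isotropically resampled volume in a NumPy array; the scale list and threshold are illustrative assumptions, not the paper's parameters:

```python
# Sketch of scale-normalized LoG candidate generation (illustrative values).
import numpy as np
from scipy import ndimage

def log_nodule_candidates(volume, sigmas=(1.0, 2.0, 4.0, 8.0), threshold=0.1):
    """Return (z, y, x, sigma) tuples where the scale-normalized LoG
    response peaks above `threshold`. Bright blobs give negative LoG
    responses, so the sign is flipped before searching for maxima."""
    best_resp = np.full(volume.shape, -np.inf)
    best_sigma = np.zeros(volume.shape)
    for sigma in sigmas:
        # Multiplying by sigma**2 normalizes the response across scales.
        resp = -(sigma ** 2) * ndimage.gaussian_laplace(volume, sigma)
        update = resp > best_resp
        best_resp[update] = resp[update]
        best_sigma[update] = sigma
    # Keep local maxima of the best response above the threshold.
    maxima = best_resp == ndimage.maximum_filter(best_resp, size=3)
    coords = np.argwhere(maxima & (best_resp > threshold))
    return [(z, y, x, best_sigma[z, y, x]) for z, y, x in coords]
```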


PLOS ONE | 2013

Growth Pattern Analysis of Murine Lung Neoplasms by Advanced Semi-Automated Quantification of Micro-CT Images

Minxing Li; Artit C. Jirapatnakul; Alberto M. Biancardi; Mark L. Riccio; Robert S. Weiss; Anthony P. Reeves

Computed tomography (CT) is a non-invasive imaging modality used to monitor human lung cancers. Typically, tumor volumes are calculated using manual or semi-automated methods that require substantial user input, and an exponential growth model is used to predict tumor growth. However, these measurement methodologies are time-consuming and can lack consistency. In addition, the availability of datasets with sequential images of the same tumor that are needed to characterize in vivo growth patterns for human lung cancers is limited due to treatment interventions and radiation exposure associated with multiple scans. In this paper, we performed micro-CT imaging of mouse lung cancers induced by overexpression of ribonucleotide reductase, a key enzyme in nucleotide biosynthesis, and developed an advanced semi-automated algorithm for efficient and accurate tumor volume measurement. Tumor volumes determined by the algorithm were first validated by comparison with results from manual methods for volume determination as well as direct physical measurements. A longitudinal study was then performed to investigate in vivo murine lung tumor growth patterns. Individual mice were imaged at least three times, with at least three weeks between scans. The tumors analyzed exhibited an exponential growth pattern, with an average doubling time of 57.08 days. The accuracy of the algorithm in the longitudinal study was also confirmed by comparing its output with manual measurements. These results suggest an exponential growth model for lung neoplasms and establish a new advanced semi-automated algorithm to measure lung tumor volume in mice that can aid efforts to improve lung cancer diagnosis and the evaluation of therapeutic responses.
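
The doubling time reported above follows from the exponential growth model V(t) = V0 · 2^(t/Td); given two volume measurements, Td has a closed form. A worked example with made-up numbers:

```python
# Doubling time from the exponential growth model:
# V(t) = V0 * 2**(t / Td), hence Td = t * ln(2) / ln(V2 / V1).
import math

def doubling_time(v1, v2, days_between_scans):
    """Volume doubling time in days from two volume measurements."""
    return days_between_scans * math.log(2) / math.log(v2 / v1)

# E.g., a tumor growing from 2.0 mm^3 to 3.3 mm^3 over 42 days
# (illustrative numbers) doubles roughly every 58 days.
print(doubling_time(2.0, 3.3, 42))
```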


International Conference of the IEEE Engineering in Medicine and Biology Society | 2009

Automated nodule location and size estimation using a multi-scale Laplacian of Gaussian filtering approach

Artit C. Jirapatnakul; Sergei V. Fotin; Anthony P. Reeves; Alberto M. Biancardi; David F. Yankelevitz; Claudia I. Henschke

Estimation of nodule location and size is an important pre-processing step in some nodule segmentation algorithms to determine the size and location of the region of interest. Ideally, such estimation methods will consistently find the same nodule location regardless of where the seed point (provided either manually or by a nodule detection algorithm) is placed relative to the “true” center of the nodule, and the size should be a reasonable estimate of the true nodule size. We developed a method that estimates nodule location and size using multi-scale Laplacian of Gaussian (LoG) filtering. Nodule candidates near a given seed point are found by searching for blob-like regions with high filter response. The candidates are then pruned according to filter response and location, and the remaining candidates are sorted by size and the largest candidate selected. This method was compared to a previously published template-based method. The methods were evaluated on the basis of the stability of the estimated nodule location to changes in the initial seed point and how well the size estimates agreed with volumes determined by a semi-automated nodule segmentation method. The LoG method exhibited better stability to changes in the seed point, with 93% of nodules having the same estimated location even when the seed point was altered, compared to only 52% of nodules for the template-based method. Both methods also showed good agreement with sizes determined by a nodule segmentation method, with an average relative size difference of 5% and −5% for the LoG and template-based methods, respectively.
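
The pruning-and-selection step lends itself to a compact sketch. Assuming candidates produced by a multi-scale LoG pass (as in the SPIE 2009 paper above), and using the standard 3-D blob relation r = √3·σ to convert the responding scale into a size estimate; the distance and response thresholds here are illustrative assumptions:

```python
# Sketch of the candidate pruning and selection described above.
import math

def pick_nodule(candidates, seed, max_dist=10.0, min_resp=0.05):
    """candidates: list of (z, y, x, sigma, response) tuples.
    Returns (location, estimated diameter) of the chosen candidate,
    or None if no candidate survives pruning."""
    near = [c for c in candidates
            if math.dist(c[:3], seed) <= max_dist and c[4] >= min_resp]
    if not near:
        return None
    z, y, x, sigma, _ = max(near, key=lambda c: c[3])  # largest candidate
    # For an ideal 3-D blob, radius = sqrt(3) * sigma at peak response.
    return (z, y, x), 2.0 * math.sqrt(3.0) * sigma
```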


Computer Assisted Radiology and Surgery | 2010

A comparison of ground truth estimation methods

Alberto M. Biancardi; Artit C. Jirapatnakul; Anthony P. Reeves

Purpose: Knowledge of the exact shape of a lesion, or ground truth (GT), is necessary for the development of diagnostic tools by means of algorithm validation, measurement metric analysis, and accurate size estimation. Four methods that estimate GTs from multiple readers’ documentations by considering the spatial location of voxels were compared: thresholded probability map at 0.50 (TPM0.50) and at 0.75 (TPM0.75), simultaneous truth and performance level estimation (STAPLE), and truth estimate from self distances (TESD).

Methods: A subset of the publicly available Lung Image Database Consortium archive was used, selecting pulmonary nodules documented by all four radiologists. The pairwise similarities between the estimated GTs were analyzed by computing the respective Jaccard coefficients. Then, with respect to the readers’ marking volumes, the estimated volumes were ranked and the sign test of the differences between them was performed.

Results: (a) The rank variations among the four methods and the volume differences between STAPLE and TESD are not statistically significant; (b) TPM0.50 estimates are statistically larger; (c) TPM0.75 estimates are statistically smaller; (d) there is some spatial disagreement in the estimates, as the one-sided 90% confidence intervals between TPM0.75 and TPM0.50, TPM0.75 and STAPLE, TPM0.75 and TESD, TPM0.50 and STAPLE, TPM0.50 and TESD, and STAPLE and TESD, respectively, show: [0.67, 1.00], [0.67, 1.00], [0.77, 1.00], [0.93, 1.00], [0.85, 1.00], [0.85, 1.00].

Conclusions: The method used to estimate the GT is important: the differences highlighted that STAPLE and TESD, notwithstanding a few weaknesses, appear to be equally viable as GT estimators, while the increased availability of computing power is decreasing the appeal afforded to TPMs. Ultimately, the choice between the two GT estimation methods depends on the specific characteristics of the marked data with respect to the two elements that differentiate the approaches: the relative reliabilities of the readers and the reliability of the region boundaries.
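
Two pieces of the comparison are simple enough to sketch, assuming each reader's marking is a boolean mask on a common voxel grid: the thresholded probability-map estimator and the Jaccard coefficient used for the pairwise similarity. STAPLE and TESD are not reproduced here:

```python
# Minimal sketch of TPM estimation and Jaccard similarity (assumed layout:
# one boolean NumPy mask per reader, all on the same voxel grid).
import numpy as np

def tpm(reader_masks, threshold=0.50):
    """Voxels marked by at least `threshold` of the readers.
    threshold=0.50 gives TPM0.50; threshold=0.75 gives TPM0.75."""
    prob = np.mean(np.stack(reader_masks).astype(float), axis=0)
    return prob >= threshold

def jaccard(a, b):
    """Intersection over union of two boolean masks."""
    inter = np.logical_and(a, b).sum()
    union = np.logical_or(a, b).sum()
    return inter / union if union else 1.0
```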


International Symposium on Biomedical Imaging | 2009

Semi-automated measurement of pulmonary nodule growth without explicit segmentation

Artit C. Jirapatnakul; Anthony P. Reeves; Alberto M. Biancardi; David F. Yankelevitz; Claudia I. Henschke

Many nodule measurement methods rely on accurate segmentation of the nodule and may fail with complex nodule morphologies; often slight variations in segmentation result in large volume differences. A method, growth analysis from density (GAD), is presented that measures nodule growth without explicit segmentation through the application of a Gaussian weighting function to a region around the nodule, avoiding the drawbacks of segmentation-based methods. The resulting mean density is used as a surrogate for volume when computing growth. A zero-change nodule dataset was used to establish the variability of the method, followed by testing on datasets of stable, malignant, and complex nodules. There was no significant difference in percent volume change between the methods (p=0.55), and while GAD showed measurement variability and discriminative performance similar to those of a segmentation-based method (GAS), it was able to successfully measure growth on complex nodules where GAS failed.
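
The density surrogate at the heart of GAD can be sketched compactly: weight the voxel densities in a region around the nodule with an isotropic Gaussian centered on it and take the weighted mean, with no explicit boundary. The grid handling and choice of sigma below are illustrative assumptions:

```python
# Sketch of a Gaussian-weighted density surrogate for nodule volume.
import numpy as np

def gaussian_weighted_density(volume, center, sigma):
    """Mean density of `volume` under a Gaussian weighting centered at
    `center` (voxel coordinates), without any explicit segmentation."""
    zz, yy, xx = np.indices(volume.shape)
    d2 = ((zz - center[0]) ** 2 + (yy - center[1]) ** 2
          + (xx - center[2]) ** 2)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return float((w * volume).sum() / w.sum())

# Growth between two time points is then computed from the change in this
# surrogate rather than from segmented volumes.
```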


Proceedings of SPIE, the International Society for Optical Engineering | 2008

Characterization of pulmonary nodules: effects of size and feature type on reported performance

Artit C. Jirapatnakul; Anthony P. Reeves; Tatiyana V. Apanasovich; Alberto M. Biancardi; David F. Yankelevitz; Claudia I. Henschke

Differences in the size distribution of malignant and benign pulmonary nodules in databases used for training and testing characterization systems have a significant impact on the measured performance. The magnitude of this effect and methods to provide more relevant performance results are explored in this paper. Two- and three-dimensional features, both including and excluding size, and two classifiers, logistic regression and distance-weighted nearest-neighbors (dwNN), were evaluated on a database of 178 pulmonary nodules. For the full database, the area under the ROC curve (AUC) of the logistic regression classifier for 2D features with and without size was 0.721 and 0.614, respectively, and for 3D features with and without size, 0.773 and 0.737, respectively. In comparison, the performance using a simple size-threshold classifier was 0.675. In the second part of the study, performance was measured on a subset of 46 nodules, selected from the full database to have similar size distributions of malignant and benign nodules. For this subset, the performance of the size-threshold classifier was 0.504. For logistic regression, the performance for 2D features, with and without size, was 0.578 and 0.478, and for 3D features, with and without size, 0.671 and 0.767. Across both the full database and the subset, logistic regression exhibited better performance using 3D features than 2D features. This study suggests that in systems for nodule classification, size is responsible for a large part of the reported performance. To address this, system performance should be reported with respect to the performance of a size-threshold classifier.
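
The size-threshold baseline amounts to scoring each nodule by its size alone and computing the ROC AUC of that single feature. A minimal sketch with scikit-learn and made-up numbers:

```python
# AUC of a size-only "classifier" (illustrative data, not the study's).
from sklearn.metrics import roc_auc_score

sizes = [4.2, 5.1, 8.3, 12.0, 6.7, 9.4]  # nodule diameters in mm
malignant = [0, 0, 1, 1, 0, 0]           # 1 = malignant

# Ranking nodules by size directly yields the size-threshold baseline AUC.
print(roc_auc_score(malignant, sizes))
```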


International Symposium on Biomedical Imaging | 2007

Pulmonary nodule classification: size distribution issues

Artit C. Jirapatnakul; Anthony P. Reeves; Tatiyana V. Apanasovich; Alberto M. Biancardi; David F. Yankelevitz; Claudia I. Henschke

Automated nodule classification systems determine a model based on features extracted from documented databases of nodules. These databases cover a large size range and have an unequal distribution of malignant and benign nodules, leading to a high correlation between malignancy and size. For two recent studies in the literature, analysis of their size distributions suggests that much of the reported performance may derive from size alone. We performed experiments to determine the effect of an unequal size distribution on a nodule classification system's performance. Preliminary results indicate that the performance across the entire dataset (a sensitivity/specificity of 0.85/0.80) does not generalize to a subset of nodules (0.50/0.80), but performance can be improved by specifically training on that subset (0.60/0.80). Additional testing with larger datasets needs to be performed, but results reported in this area are overly optimistic.
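
One way to build a size-controlled subset like the one these experiments rely on is to sample cases so each size bin holds equal numbers of malignant and benign nodules. A sketch of that idea; the binning scheme and data layout are illustrative assumptions, not the paper's procedure:

```python
# Sketch of size-matched subset selection (hypothetical helper, not the
# paper's method): equalize malignant/benign counts within each size bin.
import numpy as np

def size_matched_subset(sizes, labels, bins, seed=0):
    """Return sorted indices of a subset whose malignant and benign
    nodules share the same per-bin counts."""
    rng = np.random.default_rng(seed)
    sizes, labels = np.asarray(sizes), np.asarray(labels)
    keep = []
    for lo, hi in zip(bins[:-1], bins[1:]):
        in_bin = (sizes >= lo) & (sizes < hi)
        mal = np.flatnonzero(in_bin & (labels == 1))
        ben = np.flatnonzero(in_bin & (labels == 0))
        n = min(len(mal), len(ben))  # equalize counts per size bin
        keep.extend(rng.choice(mal, n, replace=False))
        keep.extend(rng.choice(ben, n, replace=False))
    return np.sort(np.array(keep, dtype=int))
```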


Proceedings of SPIE | 2012

Heart region segmentation from low-dose CT scans: an anatomy based approach

Anthony P. Reeves; Alberto M. Biancardi; David Yankelevitz; Matthew D. Cham; Claudia I. Henschke

Cardiovascular disease is a leading cause of death in developed countries. The concurrent detection of heart disease during low-dose whole-lung CT scans (LDCT), typically performed as part of a screening protocol, hinges on the accurate quantification of coronary calcification. Fully automated methods are ideal, as complete manual evaluation is imprecise, operator dependent, time consuming, and thus costly. The technical challenges posed by LDCT scans in this context are mainly twofold. First, there is a high level of image noise arising from the low radiation dose technique. Additionally, there is a variable amount of cardiac motion blurring due to the lack of electrocardiographic gating and the fact that heart rates differ between human subjects. As a consequence, reliable segmentation of the heart, the first stage toward the implementation of morphologic heart abnormality detection, is also quite challenging. An automated computer method based on sequential labeling of major organs and determination of anatomical landmarks has been evaluated on a public database of LDCT images. The novel algorithm builds from a robust segmentation of the bones and airways and embodies a stepwise refinement starting at the top of the lungs, where image noise is at its lowest and where the carina provides a good calibration landmark. The segmentation is completed at the inferior wall of the heart, where extensive image noise is accommodated. This method is based on the geometry of human anatomy and does not involve training through manual markings. Using visual inspection by an expert reader as a gold standard, the algorithm achieved successful heart and major vessel segmentation in 42 of 45 low-dose CT images. In the 3 remaining cases, the cardiac base was over-segmented due to incorrect hemidiaphragm localization.
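
The pipeline starts from a segmentation of the bones and airways; in HU-calibrated CT these are commonly seeded with simple thresholds and connected-component analysis. A hedged sketch of that first step only, with rule-of-thumb thresholds rather than the paper's values:

```python
# Sketch of threshold-based bone/airway seeding in HU-calibrated CT.
# Threshold values are common rules of thumb, not the paper's parameters.
import numpy as np
from scipy import ndimage

def bone_and_airway_masks(hu_volume):
    bone = hu_volume > 300   # dense bone is typically above ~300 HU
    air = hu_volume < -950   # airway lumen is near -1000 HU
    # Keep the largest connected low-attenuation component; in practice,
    # exterior air must be excluded first, which is omitted here.
    labels, n = ndimage.label(air)
    if n:
        sizes = ndimage.sum(air, labels, index=range(1, n + 1))
        air = labels == (np.argmax(sizes) + 1)
    return bone, air
```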


Journal of Medical Imaging | 2014

Three-dimensional DNA image cytometry by optical projection tomographic microscopy for early cancer diagnosis

Nitin Agarwal; Alberto M. Biancardi; Florence W. Patten; Anthony P. Reeves; Eric J. Seibel

Aneuploidy is typically assessed by flow cytometry (FCM) and image cytometry (ICM). We used optical projection tomographic microscopy (OPTM) for assessing cellular DNA content using absorption and fluorescence stains. OPTM combines some of the attributes of both FCM and ICM and generates isometric high-resolution three-dimensional (3-D) images of single cells. Although the depth of field of the microscope objective was in the submicron range, it was extended by scanning the objective’s focal plane. The extended depth-of-field image is similar to a projection in conventional x-ray computed tomography. These projections were later reconstructed using computed tomography methods to form a 3-D image. We also present an automated method for 3-D nuclear segmentation. Nuclei of chicken, trout, and triploid trout erythrocytes were used to calibrate OPTM. Ratios of integrated optical densities extracted from 50 images of each standard were compared to ratios of DNA indices from FCM. Mean square errors were compared across thionin, hematoxylin, Feulgen, and SYTOX green stains. The Feulgen technique was preferred, as it showed the highest stoichiometry, the least variance, and preserved nuclear morphology in 3-D. The addition of this quantitative biomarker could further strengthen existing classifiers and improve early diagnosis of cancer using 3-D microscopy.
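
The DNA-content comparison rests on integrated optical density (IOD): per-voxel optical densities, obtained via the Beer-Lambert relation OD = -log10(I/I0), are summed over a segmented nucleus and compared as ratios between test and reference nuclei. A minimal sketch; the I0 estimate and function names are assumptions:

```python
# Sketch of integrated optical density over a segmented 3-D nucleus.
import numpy as np

def integrated_od(intensity, nucleus_mask, i0):
    """Sum of per-voxel optical densities over a boolean nucleus mask.
    `i0` is the background (unattenuated) intensity estimate."""
    od = -np.log10(np.clip(intensity / i0, 1e-6, 1.0))
    return float(od[nucleus_mask].sum())

# A DNA-index-like ratio between a test nucleus and a reference standard
# (e.g., a chicken erythrocyte) is integrated_od(test) / integrated_od(ref).
```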

Collaboration

Top co-authors include David F. Yankelevitz (Icahn School of Medicine at Mount Sinai).