Tracy L. Faber
University of Texas Southwestern Medical Center
Publications
Featured research published by Tracy L. Faber.
IEEE Transactions on Pattern Analysis and Machine Intelligence | 1988
Tracy L. Faber; E. M. Stokely
A tensor-based moment function method and a principal-axes method were investigated for registering 3-D test images to a standard image using translation, scale, and orientation. These methods were applied at two image resolutions to test discretization effects. At the higher resolution, both methods performed well in cases where the test image could be described as an affine transform of the standard. At low resolutions, however, and when the test image was not an affine transform of the standard, only the principal-axes-based method performed adequately. The problem of quantifying left ventricular function from gated blood pool single-photon emission computed tomographic images is considered.
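The principal-axes idea lends itself to a short sketch. This is an illustrative reconstruction under stated assumptions (binary images, eigenvalue-based per-axis scaling), not the authors' implementation; the function names are invented for the example:

```python
import numpy as np

def principal_axes_pose(image):
    """Estimate translation, scale, and orientation of a binary 3-D object
    from its first- and second-order moments."""
    coords = np.argwhere(image > 0).astype(float)   # voxel coordinates of the object
    centroid = coords.mean(axis=0)                  # translation (first moments)
    centered = coords - centroid
    inertia = centered.T @ centered / len(coords)   # second-moment tensor
    evals, evecs = np.linalg.eigh(inertia)          # principal axes and extents
    return centroid, np.sqrt(evals), evecs          # per-axis spread serves as scale

def register_principal_axes(test_img, std_img):
    """Map test-image voxel coordinates into the standard image's frame by
    aligning centroids, principal axes, and per-axis spreads."""
    c_t, s_t, R_t = principal_axes_pose(test_img)
    c_s, s_s, R_s = principal_axes_pose(std_img)
    pts = np.argwhere(test_img > 0).astype(float)
    # express in the test principal frame, rescale, re-express in the standard frame
    return ((pts - c_t) @ R_t) * (s_s / s_t) @ R_s.T + c_s
```

Registering an image to itself returns the original coordinates, a convenient sanity check; a real pipeline would resample intensities rather than map coordinates.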
IEEE Transactions on Medical Imaging | 1991
Tracy L. Faber; E. M. Stokely; Ronald M. Peshock; James R. Corbett
The authors have developed a general model-based surface detector for finding the four-dimensional (three spatial dimensions plus time) endocardial and epicardial left ventricular boundaries. The model encoded left ventricular (LV) shape, smoothness, and connectivity into the compatibility coefficients of a relaxation labeling algorithm. This surface detection method was applied to gated single photon emission computed tomography (SPECT) perfusion images, tomographic radionuclide ventriculograms, and cardiac rotation magnetic resonance images. Its accuracy was investigated using actual patient data. Global left ventricular volumes correlated well, with a maximum correlation coefficient of 0.98 for magnetic resonance imaging (MRI) endocardial surfaces and a minimum of 0.88 for SPECT epicardial surfaces. The average absolute errors of edge detection were 6.4, 5.6, and 4.6 mm for tomographic radionuclide ventriculograms, gated perfusion SPECT, and magnetic resonance images, respectively.
IEEE Transactions on Medical Imaging | 1984
Tracy L. Faber; M. H. Lewis; James R. Corbett; E. M. Stokely
Most methods that have been proposed for attenuation compensation in single-photon emission computed tomography (SPECT) either rely on simplifying assumptions or use slow iteration to achieve accuracy. Recently, hybrid methods which combine iteration with a simple multiplicative correction have been proposed by Chang and by Moore et al. In this study we evaluated these methods using both simulated and real phantom data from a rotating gamma camera. Of special concern were the effects of assuming constant attenuation distributions for correction and of using only 180° of projection data in the reconstructions. Results were compared by means of image contrast, %RMS error, and a χ² error statistic. Simulations showed the hybrid methods to converge after 1-2 iterations when 360° data were used, and less rapidly for 180° data. The Moore method was more accurate than our modified Chang method for 180° data. Phantom data indicated the importance of using an accurate attenuation map for both methods. The speed of convergence of the hybrid algorithms compared to traditional iterative techniques, and their accuracy in reconstructing photon activity even with 180° data, make them attractive for use in quantitative analysis of SPECT reconstructions.
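The multiplicative step that both hybrid methods refine can be sketched in 2-D. The ray-marching step size, the angular sampling, and the function name below are illustrative assumptions; the hybrid algorithms add iterative refinement on top of this first-order correction:

```python
import numpy as np

def chang_correction_map(mu, n_angles=64):
    """First-order multiplicative Chang correction factors for a 2-D slice.

    For each pixel, average the survival factor exp(-line integral of mu)
    along rays to the image boundary over n_angles directions; the
    correction factor is the reciprocal of that average.
    """
    ny, nx = mu.shape
    factors = np.zeros((ny, nx))
    angles = np.linspace(0.0, 2.0 * np.pi, n_angles, endpoint=False)
    for iy in range(ny):
        for ix in range(nx):
            survival = 0.0
            for th in angles:
                dx, dy = np.cos(th), np.sin(th)
                # march toward the boundary in unit steps, summing mu
                x, y, line_integral = ix + 0.5, iy + 0.5, 0.0
                while 0.0 <= x < nx and 0.0 <= y < ny:
                    line_integral += mu[int(y), int(x)]
                    x += dx
                    y += dy
                survival += np.exp(-line_integral)
            factors[iy, ix] = n_angles / survival
    return factors

# corrected = uncorrected_reconstruction * chang_correction_map(mu_map)
```

With a zero attenuation map every factor is 1 (no correction); a uniform positive map yields factors above 1 that grow toward the center of the slice, where rays traverse the most attenuating material.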
IEEE Transactions on Medical Imaging | 1987
Tracy L. Faber; E. M. Stokely
Three methods for identifying the left ventricular apex in 3-D medical images of the heart called gated blood pool tomograms were investigated. The first method assumed a known orientation and positioning of the entire blood pool. The second and third methods used shapes described by quadratic surfaces, which are invariant to position and orientation. The first method performed best when the blood pool was accurately oriented, but as expected, could not handle blood pools in the wrong orientations. The quadratic surface methods performed well whether or not the blood pool was accurately oriented. The best quadratic surface method predicted the x, y, and z coordinates of the apex with correlations of 0.97, 0.98, and 0.99, respectively.
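Describing a region with a quadratic surface reduces, in its simplest formulation, to a linear least-squares fit of the nine quadric coefficients. This sketch shows only that generic fit, not the paper's invariant descriptors or its apex-selection rule:

```python
import numpy as np

def fit_quadric(pts):
    """Least-squares fit of a general quadric
    a x^2 + b y^2 + c z^2 + d xy + e xz + f yz + g x + h y + i z = 1
    to a cloud of 3-D surface points."""
    x, y, z = pts.T
    design = np.column_stack([x * x, y * y, z * z,
                              x * y, x * z, y * z,
                              x, y, z])
    coeffs, *_ = np.linalg.lstsq(design, np.ones(len(pts)), rcond=None)
    return coeffs
```

As a sanity check, points sampled on a sphere of radius 2 centered at the origin recover a = b = c = 1/4 with the cross and linear terms near zero, since x² + y² + z² = 4 rescales to the fitted form.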
Visual Communications and Image Processing | 1988
Tracy L. Faber; E. M. Stokely; James R. Corbett
A relaxation labelling model was implemented to detect 3-D endo- and epicardial surfaces in ECG-gated single photon emission computed tomographic perfusion studies of the heart. The model was tested using studies from normal volunteers and from patients with coronary artery disease. LV volumes calculated from the abnormal patient studies correlated well with those from contrast ventriculography, r=0.82 and r=0.89 for end systole and end diastole, respectively. The ejection fractions correlated well with those from radionuclide ventriculograms (r=0.93). For normal volunteers, left ventricular endocardial volumes calculated using relaxation labelling correlated with those computed from user-traced surfaces with r=0.98. The correlation between epicardial volumes computed using relaxation labelling and hand-traced edges was also 0.98.
Archive | 1992
Tracy L. Faber; James R. Corbett
Acquisition and quantification methods for tomographic radionuclide ventriculograms (TRVG) are discussed. The approach takes advantage of the three-dimensional data to measure left ventricular (LV) volumes and endocardial motion. The methods were validated using tomograms from normal volunteers; LV volumes and motion from the SPECT studies were compared to values computed from magnetic resonance (MR) images of the same subjects. The clinical usefulness of the methods was evaluated by analyzing the TRVGs of 21 patients with known infarcts. Regional motion was compared to the computed normal values to determine areas of abnormally contracting tissue, and the results were compared to known infarct location. Ejection fractions computed from the TRVGs were compared to values calculated from planar radionuclide ventriculograms (RVG) for these 21 patients. The average error in motion measurements when TRVG and MR values were compared was -5 mm. The correlation between planar and TRVG ejection fractions was 0.94. Automatic analysis found 19 of the 21 abnormalities; the locations of all detected abnormalities and the known infarcted tissue corresponded correctly.
Medical Imaging V: Image Processing | 1991
Prakash Adiseshan; Tracy L. Faber
Classification of tissue types in magnetic resonance (MR) images has received considerable attention in the medical image processing literature. Interpretation of MR images is based on multiple images corresponding to the same anatomy. Relaxation labeling (RL) is a commonly used low-level technique in computer vision. We present a method for classifying tissue types in brain MR images using RL. Information from multiple images is combined to form an initial classification. RL is then used to resolve the ambiguity in that initial classification by incorporating user-specified compatibility coefficients. One problem with RL is the smoothing of borders between tissue types; we include edge information from the original images to overcome this, resulting in a marked improvement in performance. We present results from patient images.
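The iterative update at the heart of such a scheme can be shown on a 1-D toy problem. The compatibility matrix here is illustrative, not the user-specified coefficients from the paper, and the update follows the standard Rosenfeld-Hummel-Zucker form:

```python
import numpy as np

def relax_labels(p, r, n_iter=10):
    """Relaxation labeling on a 1-D chain of sites.

    p : (n_sites, n_labels) initial label probabilities
    r : (n_labels, n_labels) compatibility coefficients in [-1, 1]
    Each site's support comes from its left and right neighbours.
    """
    p = p.copy()
    for _ in range(n_iter):
        q = np.zeros_like(p)                    # q[i, k] = support for label k at site i
        q[1:] += p[:-1] @ r.T                   # support from the left neighbour
        q[:-1] += p[1:] @ r.T                   # support from the right neighbour
        p = p * (1.0 + np.clip(q, -1.0, 1.0))   # keep update factors non-negative
        p /= p.sum(axis=1, keepdims=True)       # renormalize per site
    return p
```

An ambiguous site flanked by confident neighbours is pulled toward the neighbours' label when like labels support each other (positive diagonal of r) and unlike labels inhibit each other, which is the ambiguity-resolving behaviour the abstract describes.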
Medical Imaging VI: Image Processing | 1992
Prakash Adiseshan; Tracy L. Faber; Roderick McColl
We present a method for adaptively smoothing magnetic resonance (MR) images while preserving discontinuities. We assume that the spatial behavior of MR data can be captured by a first-order polynomial defined at every pixel. The formulation itself is similar to Leclerc's work on piecewise-smooth image segmentation, but we use the graduated non-convexity (GNC) algorithm as the optimizing tool for obtaining the solution. This requires initial values for the polynomial coefficients of order greater than zero. These values are obtained using ideas similar to those found in robust statistics. This initial step is also useful in determining the variance of the noise present in the input image. The variance is related to an important parameter, α, required by the GNC algorithm. Firstly, this replaces the heuristic choice of α with a quantity that can be estimated. Secondly, it is especially useful in situations where the variance of the noise is not uniform across the image. We present results on synthetic and MR images. Though the results of this paper are given using first-order polynomials, the formulation can handle higher-order polynomials.
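The abstract does not spell out its robust estimator, so the following is only one standard robust-statistics choice consistent with the description: estimating the noise standard deviation from the median absolute deviation (MAD) of first differences, which edges perturb far less than a sample variance would.

```python
import numpy as np

def noise_sigma_mad(image):
    """Robust noise estimate from horizontal first differences.

    Differencing removes slowly varying structure; the MAD is barely
    affected by the minority of large differences that occur at true
    edges.  The 1.4826 factor makes the MAD consistent with sigma for
    Gaussian noise, and 1/sqrt(2) accounts for differencing two
    independent noisy samples.
    """
    d = np.diff(image, axis=1).ravel()
    mad = np.median(np.abs(d - np.median(d)))
    return 1.4826 * mad / np.sqrt(2.0)
```

An estimate of this kind can set the GNC parameter α from the data rather than by heuristic, and can be computed per region when the noise variance is not uniform across the image.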
The Journal of Nuclear Medicine | 1991
Tracy L. Faber; Marvin S. Akers; James R. Corbett
Radiology | 1991
Tracy L. Faber; Roderick McColl; Roger M. Opperman; James R. Corbett