
Publication


Featured research published by Harrison H. Barrett.


Journal of the Optical Society of America A: Optics, Image Science, and Vision | 1990

Objective assessment of image quality: effects of quantum noise and object variability

Harrison H. Barrett

A number of task-specific approaches to the assessment of image quality are treated. Both estimation and classification tasks are considered, but only linear estimators or classifiers are permitted. Performance on these tasks is limited by both quantum noise and object variability, and the effects of postprocessing or image-reconstruction algorithms are explicitly included. The results are expressed as signal-to-noise ratios (SNRs). The interrelationships among these SNRs are considered, and an SNR for a classification task is expressed as the SNR for a related estimation task times four factors. These factors show the effects of signal size and contrast, conspicuity of the signal, bias in the estimation task, and noise correlation. Ways of choosing and calculating appropriate SNRs for system evaluation and optimization are also discussed.
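The SNR for a linear discriminant described above can be illustrated numerically. This is a toy sketch only, not code from the paper; the function name `hotelling_snr` and the 2-pixel example are illustrative. For a mean difference signal Δs and image covariance K, the optimal linear (Hotelling) discriminant achieves SNR² = Δsᵀ K⁻¹ Δs.

```python
import numpy as np

def hotelling_snr(delta_s, K):
    """SNR of the optimal linear (Hotelling) discriminant:
    SNR^2 = delta_s^T K^{-1} delta_s, where delta_s is the mean
    difference image and K is the image covariance matrix."""
    w = np.linalg.solve(K, delta_s)  # Hotelling template K^{-1} delta_s
    return float(np.sqrt(delta_s @ w))

# Toy example: 2-pixel "images" with correlated noise
delta_s = np.array([1.0, 0.5])
K = np.array([[1.0, 0.3],
              [0.3, 1.0]])
snr = hotelling_snr(delta_s, K)
```

For white noise (K proportional to the identity) this reduces to the familiar matched-filter SNR.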


Journal of the Optical Society of America A: Optics, Image Science, and Vision | 1987

Addition of a channel mechanism to the ideal-observer model

Kyle J. Myers; Harrison H. Barrett

Several authors have measured the detection ability of human observers for objects in correlated (nonwhite) noise. These studies have shown that the human observer has approximately constant efficiency when compared with a nonprewhitening ideal observer. In this paper we add a frequency-selective mechanism to the ideal-observer model, similar to the channel mechanism that has been demonstrated through experiments that measure a subject's ability to detect grating stimuli. For a number of detection and discrimination tasks, the nonprewhitening ideal-observer model and the channelized ideal-observer model yield similar performance predictions. Thus both models seem equally capable of explaining a considerable body of psychophysical data, and it would be difficult to devise an experiment to determine which model is more nearly correct.
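The channelized-observer idea above can be sketched in a few lines. This is an illustrative toy (the cosine channels and AR(1) covariance below are assumptions, not taken from the paper): images are first reduced to a small set of channel outputs v = T g, and the optimal linear discriminant is then formed in channel space.

```python
import numpy as np

def channelized_snr(delta_s, K, T):
    """SNR of a channelized linear observer: images are reduced to
    channel outputs v = T @ g before forming the discriminant.
    T has shape (n_channels, n_pixels)."""
    dv = T @ delta_s                      # channel-space signal
    Kv = T @ K @ T.T                      # channel-space covariance
    return float(np.sqrt(dv @ np.linalg.solve(Kv, dv)))

n = 8
x = np.arange(n)
delta_s = np.exp(-0.5 * (x - 3.5) ** 2)               # Gaussian signal profile
K = 0.5 ** np.abs(x[:, None] - x[None, :])            # correlated (nonwhite) noise
T = np.stack([np.cos(np.pi * k * (x + 0.5) / n)       # 3 frequency-selective channels
              for k in range(3)])
snr_ch = channelized_snr(delta_s, K, T)
snr_full = float(np.sqrt(delta_s @ np.linalg.solve(K, delta_s)))  # unchannelized Hotelling
```

Because the channelized observer is constrained to the span of the channel profiles, its SNR can never exceed that of the full (unchannelized) optimal linear observer.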


Physics in Medicine and Biology | 1994

Noise properties of the EM algorithm. I. Theory

Harrison H. Barrett; Donald W. Wilson; B M W Tsui

The expectation-maximization (EM) algorithm is an important tool for maximum-likelihood (ML) estimation and image reconstruction, especially in medical imaging. It is a non-linear iterative algorithm that attempts to find the ML estimate of the object that produced a data set. The convergence of the algorithm and other deterministic properties are well established, but relatively little is known about how noise in the data influences noise in the final reconstructed image. In this paper we present a detailed treatment of these statistical properties. The specific application we have in mind is image reconstruction in emission tomography, but the results are valid for any application of the EM algorithm in which the data set can be described by Poisson statistics. We show that the probability density function for the grey level at a pixel in the image is well approximated by a log-normal law. An expression is derived for the variance of the grey level and for pixel-to-pixel covariance. The variance increases rapidly with iteration number at first, but eventually saturates as the ML estimate is approached. Moreover, the variance at any iteration number has a factor proportional to the square of the mean image (though other factors may also depend on the mean image), so a map of the standard deviation resembles the object itself. Thus low-intensity regions of the image tend to have low noise. By contrast, linear reconstruction methods, such as filtered back-projection in tomography, show a much more global noise pattern, with high-intensity regions of the object contributing to noise at rather distant low-intensity regions. The theoretical results of this paper depend on two approximations, but in the second paper in this series we demonstrate through Monte Carlo simulation that the approximations are justified over a wide range of conditions in emission tomography. 
The theory can, therefore, be used as a basis for calculation of objective figures of merit for image quality.
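The EM algorithm analyzed above has a compact multiplicative form for Poisson data. The sketch below is a minimal illustration of that iteration on a toy 2-pixel system (the function name `mlem` and the example matrix are not from the paper); it is not a full emission-tomography reconstruction.

```python
import numpy as np

def mlem(H, g, n_iter=500):
    """Maximum-likelihood EM for Poisson data g ~ Poisson(H f):
    f <- (f / s) * H^T (g / (H f)), with sensitivity s = H^T 1.
    The multiplicative update keeps the estimate nonnegative."""
    f = np.ones(H.shape[1])
    s = H.sum(axis=0)
    for _ in range(n_iter):
        f = f / s * (H.T @ (g / (H @ f)))
    return f

# Tiny 2-detector, 2-pixel system with noise-free (consistent) data
H = np.array([[0.8, 0.2],
              [0.2, 0.8]])
f_true = np.array([4.0, 1.0])
g = H @ f_true
f_hat = mlem(H, g)
```

A handy property visible in the update rule: after the first iteration the total sensitivity-weighted counts, sum of s_j f_j, exactly equal the total measured counts.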


Physics Today | 1983

Radiological Imaging: The Theory of Image Formation, Detection, and Processing

Harrison H. Barrett; William Swindell; Robert Stanton

Preface to the Paperback Edition. Preface. List of Important Symbols. Named Functions. The Clinical Setting. Theory of Linear Systems. Theory of Random Processes. Application of Linear Systems Theory to Radiographic Imaging. Detectors. Classical Tomography. Computed Tomography. Multiplex Tomography. Three-Dimensional Imaging. Noise in Radiographic Images. Scattered Radiation. Appendix A: The Dirac Delta Function. Appendix B: The Fourier Transform. Appendix C: Interaction of Photons with Matter. References. Author Index. Subject Index.


Journal of the Optical Society of America A: Optics, Image Science, and Vision | 2001

Human- and model-observer performance in ramp-spectrum noise: effects of regularization and object variability

Craig K. Abbey; Harrison H. Barrett

We consider detection of a nodule signal profile in noisy images meant to roughly simulate the statistical properties of tomographic image reconstructions in nuclear medicine. The images have two sources of variability arising from quantum noise from the imaging process and anatomical variability in the ensemble of objects being imaged. Both of these sources of variability are simulated by a stationary Gaussian random process. Sample images from this process are generated by filtering white-noise images. Human-observer performance in several signal-known-exactly detection tasks is evaluated through psychophysical studies by using the two-alternative forced-choice method. The tasks considered investigate parameters of the images that influence both the signal profile and pixel-to-pixel correlations in the images. The effect of low-pass filtering is investigated as an approximation to regularization implemented by image-reconstruction algorithms. The relative magnitudes of the quantum and the anatomical variability are investigated as an approximation to the effects of exposure time. Finally, we study the effect of the anatomical correlations in the form of an anatomical slope as an approximation to the effects of different tissue types. Human-observer performance is compared with the performance of a number of model observers computed directly from the ensemble statistics of the images used in the experiments for the purpose of finding predictive models. The model observers investigated include a number of nonprewhitening observers, the Hotelling observer (which is equivalent to the ideal observer for these studies), and six implementations of channelized-Hotelling observers. The human observers demonstrate large effects across the experimental parameters investigated. In the regularization study, performance exhibits a mild peak at intermediate levels of regularization before degrading at higher levels. 
The exposure-time study shows that human observers are able to detect ever more subtle lesions at increased exposure times. The anatomical slope study shows that human-observer performance degrades as anatomical variability extends into higher spatial frequencies. Of the observers tested, the channelized-Hotelling observers best capture the features of the human data.
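The two-alternative forced-choice (2AFC) method used in the psychophysical studies above has a simple relation to the detectability index d' under Gaussian assumptions. The sketch below is a generic textbook relation, not code from the paper; `pc_2afc` is an illustrative name.

```python
from math import erf, sqrt

def pc_2afc(d_prime):
    """Proportion correct in a 2AFC task for an observer with
    detectability index d': P_C = Phi(d' / sqrt(2)), where Phi is
    the standard normal CDF. P_C also equals the area under the
    ROC curve (AUC) for the same observer."""
    x = d_prime / sqrt(2)
    return 0.5 * (1.0 + erf(x / sqrt(2)))  # Phi via the error function

# d' = 0 gives chance performance (0.5); performance rises toward 1 with d'
```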


Journal of the Optical Society of America A: Optics, Image Science, and Vision | 1992

Effect of random background inhomogeneity on observer detection performance

J. P. Rolland; Harrison H. Barrett

Many psychophysical studies of the ability of the human observer to detect a signal superimposed upon a uniform background, where both the signal and the background are known exactly, have been reported in the literature. In such cases, the ideal or the Bayesian observer is often used as a mathematical model of human performance since it can be readily calculated and is a good predictor of human performance for the task at hand. If, however, the background is spatially inhomogeneous (lumpy), the ideal observer becomes nonlinear, and its performance becomes difficult to evaluate. Since inhomogeneous backgrounds are commonly encountered in many practical applications, we have investigated the effects of background inhomogeneities on human performance. The task was detection of a two-dimensional Gaussian signal superimposed upon an inhomogeneous background and imaged through a pinhole imaging system. Poisson noise corresponding to a certain exposure time and aperture size was added to the detected image. A six-point rating scale technique was used to measure human performance as a function of the strength of the nonuniformities (lumpiness) in the background, the amount of blur of the imaging system, and the amount of Poisson noise in the image. The results of this study were compared with earlier theoretical predictions by Myers et al. [J. Opt. Soc. Am. A 7, 1279 (1990)] for two observer models: the optimum linear discriminant, also known as the Hotelling observer, and a nonprewhitening matched filter. Although the efficiency of the human observer relative to the Hotelling observer was only approximately 10%, the variation in human performance with respect to varying aperture size and exposure time was well predicted by the Hotelling model. The nonprewhitening model, on the other hand, fails to predict human performance in lumpy backgrounds in this study. 
In particular, this model predicts that performance will saturate with increasing exposure time and drop precipitously with increasing lumpiness; neither effect is observed with human observers.
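The two model observers compared above differ only in their template. A toy numerical sketch (illustrative values, not from the paper): the nonprewhitening (NPW) matched filter uses the signal itself as template and so ignores noise correlations, while the Hotelling observer prewhitens with the inverse covariance.

```python
import numpy as np

def npw_snr(delta_s, K):
    """SNR of the nonprewhitening matched filter, whose template is
    the signal itself: SNR = (ds^T ds) / sqrt(ds^T K ds)."""
    return float((delta_s @ delta_s) / np.sqrt(delta_s @ K @ delta_s))

delta_s = np.array([1.0, 0.5])
K = np.array([[1.0, 0.3],
              [0.3, 1.0]])
snr_npw = npw_snr(delta_s, K)
snr_hot = float(np.sqrt(delta_s @ np.linalg.solve(K, delta_s)))  # Hotelling, for comparison
```

Since the Hotelling observer is the optimal linear discriminant, its SNR is never smaller than the NPW SNR; the gap widens as the noise becomes more strongly correlated (lumpier).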


Journal of the Optical Society of America A: Optics, Image Science, and Vision | 1998

Objective assessment of image quality. III. ROC metrics, ideal observers, and likelihood-generating functions

Harrison H. Barrett; Craig K. Abbey; Eric Clarkson

We continue the theme of previous papers [J. Opt. Soc. Am. A 7, 1266 (1990); 12, 834 (1995)] on objective (task-based) assessment of image quality. We concentrate on signal-detection tasks and figures of merit related to the ROC (receiver operating characteristic) curve. Many different expressions for the area under an ROC curve (AUC) are derived for an arbitrary discriminant function, with different assumptions on what information about the discriminant function is available. In particular, it is shown that AUC can be expressed by a principal-value integral that involves the characteristic functions of the discriminant. Then the discussion is specialized to the ideal observer, defined as one who uses the likelihood ratio (or some monotonic transformation of it, such as its logarithm) as the discriminant function. The properties of the ideal observer are examined from first principles. Several strong constraints on the moments of the likelihood ratio or the log likelihood are derived, and it is shown that the probability density functions for these test statistics are intimately related. In particular, some surprising results are presented for the case in which the log likelihood is normally distributed under one hypothesis. To unify these considerations, a new quantity called the likelihood-generating function is defined. It is shown that all moments of both the likelihood and the log likelihood under both hypotheses can be derived from this one function. Moreover, the AUC can be expressed, to an excellent approximation, in terms of the likelihood-generating function evaluated at the origin. This expression is the leading term in an asymptotic expansion of the AUC; it is exact whenever the likelihood-generating function behaves linearly near the origin. It is also shown that the likelihood-generating function at the origin sets a lower bound on the AUC in all cases.
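Empirically, the AUC discussed above can be estimated from discriminant-function outputs without fitting an ROC curve, via the Wilcoxon-Mann-Whitney statistic. This sketch is a standard estimator, not an implementation from the paper.

```python
import numpy as np

def auc(scores_h1, scores_h0):
    """Area under the ROC curve estimated as the Wilcoxon-Mann-Whitney
    statistic: the probability that a signal-present score exceeds a
    signal-absent score, with ties counted as one half."""
    s1 = np.asarray(scores_h1, dtype=float)[:, None]
    s0 = np.asarray(scores_h0, dtype=float)[None, :]
    return float((s1 > s0).mean() + 0.5 * (s1 == s0).mean())

# Small example with one tie per row
a = auc([2, 3, 4], [1, 2, 3])
```

For an ideal observer with normally distributed log likelihood, this empirical AUC converges to the closed-form expressions derived in the paper.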


Journal of the Optical Society of America A: Optics, Image Science, and Vision | 1995

Objective assessment of image quality. II. Fisher information, Fourier crosstalk, and figures of merit for task performance.

Harrison H. Barrett; J. L. Denny; Robert F. Wagner; Kyle J. Myers

Figures of merit for image quality are derived on the basis of the performance of mathematical observers on specific detection and estimation tasks. The tasks include detection of a known signal superimposed on a known background, detection of a known signal on a random background, estimation of Fourier coefficients of the object, and estimation of the integral of the object over a specified region of interest. The chosen observer for the detection tasks is the ideal linear discriminant, which we call the Hotelling observer. The figures of merit are based on the Fisher information matrix relevant to estimation of the Fourier coefficients and the closely related Fourier crosstalk matrix introduced earlier by Barrett and Gifford [Phys. Med. Biol. 39, 451 (1994)]. A finite submatrix of the infinite Fisher information matrix is used to set Cramer-Rao lower bounds on the variances of the estimates of the first N Fourier coefficients. The figures of merit for detection tasks are shown to be closely related to the concepts of noise-equivalent quanta (NEQ) and generalized NEQ, originally derived for linear, shift-invariant imaging systems and stationary noise. Application of these results to the design of imaging systems is discussed.
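For Poisson data the Fisher information matrix mentioned above has a simple closed form, sketched below on a toy system (illustrative only; the parameterization by pixel values f and the 2×2 example are assumptions for this sketch). The Cramer-Rao bound on any unbiased estimate of f_j is the j-th diagonal element of F⁻¹.

```python
import numpy as np

def poisson_fisher(H, f):
    """Fisher information matrix for Poisson data g ~ Poisson(H f):
    F_jk = sum_i H_ij H_ik / (H f)_i."""
    ybar = H @ f                       # mean data
    return H.T @ (H / ybar[:, None])   # weighted normal-equations form

H = np.array([[0.8, 0.2],
              [0.2, 0.8]])
f = np.array([4.0, 1.0])
F = poisson_fisher(H, f)
crb = np.diag(np.linalg.inv(F))        # Cramer-Rao variance bounds
```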


IEEE Nuclear Science Symposium | 2002

FastSPECT II: a second-generation high-resolution dynamic SPECT imager

Lars R. Furenlid; Donald W. Wilson; Yichun Chen; Hyunki Kim; P.J. Pietraski; M.J. Crawford; Harrison H. Barrett

FastSPECT II is a recently commissioned 16-camera small-animal SPECT imager built with modular scintillation cameras and list-mode data-acquisition electronics. The instrument is housed in a lead-shielded enclosure and has exchangeable aperture assemblies and adjustable camera positions for selection of magnification, pinhole size, and field of view. The calibration of individual cameras and measurement of an overall system imaging matrix (1 mm³ voxels) are supported via a five-axis motion-control system. Details of the system integration and results of characterization and performance measurements are presented along with first tomographic images. The dynamic imaging capabilities of the instrument are explored and discussed.


Journal of the Optical Society of America A: Optics, Image Science, and Vision | 1997

List-mode likelihood

Harrison H. Barrett; Timothy A. White; Lucas C. Parra

As photon-counting imaging systems become more complex, there is a trend toward measuring more attributes of each individual event. In various imaging systems the attributes can include several position variables, time variables, and energies. If more than about four attributes are measured for each event, it is not practical to record the data in an image matrix. Instead it is more efficient to use a simple list where every attribute is stored for every event. It is the purpose of this paper to discuss the concept of likelihood for such list-mode data. We present expressions for list-mode likelihood with an arbitrary number of attributes per photon and for both preset counts and preset time. Maximization of this likelihood can lead to a practical reconstruction algorithm with list-mode data, but that aspect is covered in a separate paper [IEEE Trans. Med. Imaging (to be published)]. An expression for lesion detectability for list-mode data is also derived and compared with the corresponding expression for conventional binned data.
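The preset-time case above leads to a log likelihood with one term per detected event plus a term for the expected total. A minimal discretized sketch (illustrative only: `listmode_loglik` and the per-bin rate model are assumptions, simplifying the paper's continuous-attribute treatment):

```python
import numpy as np

def listmode_loglik(events, rate, T):
    """List-mode Poisson log-likelihood for preset acquisition time T:
    log L = sum_k log rate(a_k) - T * (total rate).
    Here `rate` is a discrete event rate per attribute bin, and
    `events` holds the bin index of each detected photon."""
    return float(np.log(rate[events]).sum() - T * rate.sum())

rate = np.array([2.0, 1.0, 0.5])   # events per unit time in each attribute bin
events = np.array([0, 0, 1, 2])    # attribute-bin index of each detected event
ll = listmode_loglik(events, rate, T=1.0)
```

Each event contributes its own log-rate term, so no attribute information is lost to binning; maximizing this likelihood over an object model underlies the list-mode reconstruction algorithm referenced in the abstract.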

Collaboration


Dive into Harrison H. Barrett's collaborations.

Top Co-Authors

Kyle J. Myers

National Institutes of Health

Brian W. Miller

Pacific Northwest National Laboratory
