William F. Garrett
University of North Carolina at Chapel Hill
Publication
Featured research published by William F. Garrett.
international conference on computer graphics and interactive techniques | 1996
Andrei State; Gentaro Hirota; David T. Chen; William F. Garrett; Mark A. Livingston
Accurate registration between real and virtual objects is crucial for augmented reality applications. Existing tracking methods are individually inadequate: magnetic trackers are inaccurate, mechanical trackers are cumbersome, and vision-based trackers are computationally problematic. We present a hybrid tracking method that combines the accuracy of vision-based tracking with the robustness of magnetic tracking without compromising real-time performance or usability. We demonstrate excellent registration in three sample applications.
international conference on computer graphics and interactive techniques | 1996
Andrei State; Mark A. Livingston; William F. Garrett; Gentaro Hirota; Etta D. Pisano; Henry Fuchs
We present a real-time stereoscopic video-see-through augmented reality (AR) system applied to the medical procedure known as ultrasound-guided needle biopsy of the breast. The AR system was used by a physician during procedures on breast models and during non-invasive examinations of human subjects. The system merges rendered live ultrasound data and geometric elements with stereo images of the patient acquired through head-mounted video cameras and presents these merged images to the physician in a head-mounted display. The physician sees a volume visualization of the ultrasound data directly under the ultrasound probe, properly registered within the patient and with the biopsy needle. Using this system, a physician successfully guided a needle into an artificial tumor within a training phantom of a human breast. We discuss the construction of the AR system and the issues and decisions which led to the system architecture and the design of the video see-through head-mounted display. We designed methods to properly resolve occlusion of the real and synthetic image elements. We developed techniques for real-time volume visualization of time- and position-varying ultrasound data. We devised a hybrid tracking system which achieves improved registration of synthetic and real imagery, and we improved on previous techniques for calibration of a magnetic tracker.
VBC '96 Proceedings of the 4th International Conference on Visualization in Biomedical Computing | 1996
Henry Fuchs; Andrei State; Etta D. Pisano; William F. Garrett; Gentaro Hirota; Mark A. Livingston; Stephen M. Pizer
Augmented reality is applied to ultrasound-guided needle biopsy of the human breast. In a tracked stereoscopic head-mounted display, a physician sees the ultrasound imagery “emanating” from the transducer, properly registered with the patient and the biopsy needle. A physician has successfully used the system to guide a needle into a synthetic tumor within a breast phantom and examine a human patient in preparation for a cyst aspiration.
ieee visualization | 1996
William F. Garrett; Henry Fuchs; Andrei State
We present a method for producing real-time volume visualizations of continuously captured, arbitrarily-oriented 2D arrays (slices) of data. Our system constructs a 3D representation on-the-fly from incoming 2D ultrasound slices by modeling and rendering the slices as planar polygons with translucent surface textures. We use binary space partition (BSP) tree data structures to provide non-intersecting, visibility-ordered primitives for accurate opacity accumulation images. New in our system is a method of using parallel, time-shifted BSP trees to efficiently manage the continuously captured ultrasound data and to decrease the variability in image generation time between output frames. This technique is employed in a functioning real-time augmented reality system that a physician has used to examine human patients prior to breast biopsy procedures. We expect the technique can be used for real-time visualization of any 2D data being collected from a tracked sensor moving along an arbitrary path.
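The visibility-ordered traversal at the heart of the abstract above can be sketched as follows. This is a minimal illustration, not the paper's implementation: a real BSP tree must split polygons that straddle a node's plane, whereas here each slice is classified by its center point only, and the class and function names are assumptions for the sketch.

```python
import numpy as np

class BSPNode:
    """One planar slice, stored as a BSP node keyed on the slice's plane."""

    def __init__(self, center, normal, payload):
        self.center = np.asarray(center, float)
        self.normal = np.asarray(normal, float)
        self.payload = payload
        self.front = None
        self.back = None

    def insert(self, center, normal, payload):
        # Classify the new slice by its center relative to this node's plane.
        side = np.dot(np.asarray(center, float) - self.center, self.normal)
        child = "front" if side >= 0 else "back"
        if getattr(self, child) is None:
            setattr(self, child, BSPNode(center, normal, payload))
        else:
            getattr(self, child).insert(center, normal, payload)

    def back_to_front(self, eye):
        # Visit the half-space NOT containing the eye first, so translucent
        # slices composite in correct order for opacity accumulation.
        eye_side = np.dot(np.asarray(eye, float) - self.center, self.normal)
        near, far = (self.front, self.back) if eye_side >= 0 else (self.back, self.front)
        order = []
        if far is not None:
            order += far.back_to_front(eye)
        order.append(self.payload)
        if near is not None:
            order += near.back_to_front(eye)
        return order
```

Because the traversal order depends only on which side of each plane the eye lies, the same tree yields a correct back-to-front ordering for any viewpoint, which is what makes it suitable for continuously arriving slices.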
Journal of Digital Imaging | 1997
Etta D. Pisano; Jayanthi Chandramouli; Bradley M. Hemminger; Deb Glueck; R. Eugene Johnston; Keith E. Muller; M. Patricia Braeuning; Derek T. Puff; William F. Garrett; Stephen M. Pizer
The purpose of this study was to determine whether intensity windowing (IW) improves detection of simulated masses in dense mammograms. Simulated masses were embedded in dense mammograms digitized at 50 microns/pixel, 12 bits deep. Images were printed with no windowing applied and with nine window width and level combinations applied. A simulated mass was embedded in a realistic background of dense breast tissue, with the position of the mass (against the background) varied. The key variables involved in each trial included the position of the mass, the contrast levels and the IW setting applied to the image. Combining the 10 image processing conditions, 4 contrast levels, and 4 quadrant positions gave 160 combinations. The trials were constructed by pairing 160 combinations of key variables with 160 backgrounds. The entire experiment consisted of 800 trials. Twenty observers were asked to identify the quadrant of the image in which the mass was located. There was a statistically significant improvement in detection performance for masses when the window width was set at 1024 with a level of 3328. IW should be tested in the clinic to determine whether mass detection performance in real mammograms is improved.
Computers & Graphics | 1999
Subodh Kumar; Dinesh Manocha; William F. Garrett; Ming C. Lin
We present a sub-linear algorithm to compute the set of back-facing polygons in a polyhedral model. The algorithm partitions the model into hierarchical clusters based on the orientations and positions of the polygons. As a pre-processing step, the algorithm constructs spatial decompositions with respect to each cluster. For a sequence of back-face computations, the algorithm exploits coherence in view-point movement to efficiently determine whether the view-point is in front of or behind a cluster. Due to this coherence, the algorithm's expected running time is linear in the number of clusters. We have applied this algorithm to speed up the rendering of polyhedral models. On average, we are able to cull about 40% of the polygons. The algorithm accounts for 5% of the total CPU time per frame on an SGI Onyx. The overall frame rate is improved by 40–75% compared to the standard back-face culling implemented in hardware. We also present an extension of our back-face computation algorithm to determine silhouettes of polygonal models. Our technique finds true perspective silhouettes by collecting edges at the common boundaries of back-facing and front-facing clusters.
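The cluster test described above can be illustrated with a cone-of-normals check: if every normal in a cluster lies within angle α of the cluster's average normal, a single comparison classifies the whole cluster as entirely back-facing, entirely front-facing, or mixed. This sketch assumes an orthographic view direction and given cluster assignments; the paper's hierarchical spatial decompositions and view-coherence caching are omitted, and the function name is an assumption.

```python
import numpy as np

def cluster_backface_cull(normals, cluster_ids, view_dir):
    """Cone-of-normals back-face test (orthographic simplification).

    A polygon is back-facing when its normal points away from the eye,
    i.e. dot(normal, view_dir) > 0. Returns a boolean mask per polygon.
    """
    v = np.asarray(view_dir, float)
    v = v / np.linalg.norm(v)
    normals = np.asarray(normals, float)
    normals = normals / np.linalg.norm(normals, axis=1, keepdims=True)
    cluster_ids = np.asarray(cluster_ids)

    back = np.zeros(len(normals), dtype=bool)
    for cid in np.unique(cluster_ids):
        idx = np.where(cluster_ids == cid)[0]
        avg = normals[idx].mean(axis=0)
        avg = avg / np.linalg.norm(avg)
        # Half-angle of the smallest cone around avg containing all normals.
        alpha = np.max(np.arccos(np.clip(normals[idx] @ avg, -1.0, 1.0)))
        theta = np.arccos(np.clip(avg @ v, -1.0, 1.0))
        if theta + alpha < np.pi / 2:      # entire cluster back-facing
            back[idx] = True
        elif theta - alpha > np.pi / 2:    # entire cluster front-facing
            continue
        else:                               # mixed: fall back to per-polygon test
            back[idx] = normals[idx] @ v > 0
    return back
```

The per-cluster test replaces one dot product per polygon with one angle comparison per cluster, which is the source of the sub-linear expected cost when view-point coherence keeps most clusters in the same classification from frame to frame.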
Breast disease | 1998
Etta D. Pisano; Henry Fuchs; Andrei State; Mark A. Livingston; Gentaro Hirota; William F. Garrett
Etta D. Pisano, Henry Fuchs, Andrei State, Mark A. Livingston, Gentaro Hirota, William F. Garrett and Mary C. Whitton; Departments of Radiology and Computer Science, The University of North Carolina at Chapel Hill School of Medicine and College of Arts and Sciences, Chapel Hill, NC, USA
Studies in health technology and informatics | 1996
Henry Fuchs; Andrei State; Mark A. Livingston; William F. Garrett; Gentaro Hirota; Etta D. Pisano
symposium on computational geometry | 1997
Subodh Kumar; Dinesh Manocha; William F. Garrett; Ming C. Lin
Academic Radiology | 1996
Etta D. Pisano; Bradley M. Hemminger; Jayanthi Chandramouli; William F. Garrett; R. E. Johnston; Deb Glueck; Keith E. Muller; M.P. Braeuning; Derek T. Puff; Stephen M. Pizer