Andrei State
University of North Carolina at Chapel Hill
Publications
Featured research published by Andrei State.
international conference on computer graphics and interactive techniques | 1996
Andrei State; Gentaro Hirota; David T. Chen; William F. Garrett; Mark A. Livingston
Accurate registration between real and virtual objects is crucial for augmented reality applications. Existing tracking methods are individually inadequate: magnetic trackers are inaccurate, mechanical trackers are cumbersome, and vision-based trackers are computationally problematic. We present a hybrid tracking method that combines the accuracy of vision-based tracking with the robustness of magnetic tracking without compromising real-time performance or usability. We demonstrate excellent registration in three sample applications.
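The core idea of hybrid tracking can be illustrated with a minimal sketch (function names and the fixed blending weight are illustrative assumptions, not the paper's landmark-based correction algorithm): when the vision tracker has its landmarks in view, its accurate fix pulls the magnetic estimate toward it; when vision fails, the system falls back on the robust magnetic reading alone.

```python
def hybrid_correct(magnetic_pos, vision_pos, vision_valid, alpha=0.9):
    """Blend a robust-but-inaccurate magnetic position reading with an
    accurate vision-based fix.

    alpha is the weight given to the vision measurement when it is
    available; if vision loses its landmarks, return the magnetic
    reading unchanged so tracking never fails outright.
    """
    if not vision_valid:
        # Vision dropout: rely on magnetic tracking alone.
        return list(magnetic_pos)
    # Pull the magnetic estimate toward the vision measurement.
    return [(1.0 - alpha) * m + alpha * v
            for m, v in zip(magnetic_pos, vision_pos)]

# With vision valid, the corrected position sits close to the vision fix.
print(hybrid_correct([1.0, 0.0, 0.0], [1.2, 0.0, 0.0], True))
```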
medical image computing and computer assisted intervention | 1998
Henry Fuchs; Mark A. Livingston; Ramesh Raskar; D`nardo Colucci; Kurtis Keller; Andrei State; Jessica R. Crawford; Paul Rademacher; Samuel Drake; Anthony A. Meyer
We present the design and a prototype implementation of a three-dimensional visualization system to assist with laparoscopic surgical procedures. The system uses 3D visualization, depth extraction from laparoscopic images, and six degree-of-freedom head and laparoscope tracking to display a merged real and synthetic image in the surgeon’s video-see-through head-mounted display. We also introduce a custom design for this display. A digital light projector, a camera, and a conventional laparoscope create a prototype 3D laparoscope that can extract depth and video imagery.
international conference on computer graphics and interactive techniques | 1996
Andrei State; Mark A. Livingston; William F. Garrett; Gentaro Hirota; Etta D. Pisano; Henry Fuchs
We present a real-time stereoscopic video-see-through augmented reality (AR) system applied to the medical procedure known as ultrasound-guided needle biopsy of the breast. The AR system was used by a physician during procedures on breast models and during non-invasive examinations of human subjects. The system merges rendered live ultrasound data and geometric elements with stereo images of the patient acquired through head-mounted video cameras and presents these merged images to the physician in a head-mounted display. The physician sees a volume visualization of the ultrasound data directly under the ultrasound probe, properly registered within the patient and with the biopsy needle. Using this system, a physician successfully guided a needle into an artificial tumor within a training phantom of a human breast. We discuss the construction of the AR system and the issues and decisions which led to the system architecture and the design of the video see-through head-mounted display. We designed methods to properly resolve occlusion of the real and synthetic image elements. We developed techniques for real-time volume visualization of time- and position-varying ultrasound data. We devised a hybrid tracking system which achieves improved registration of synthetic and real imagery, and we improved on previous techniques for calibration of a magnetic tracker.
Medical Image Analysis | 2002
Michael H. Rosenthal; Andrei State; Joohi Lee; Gentaro Hirota; Jeremy D. Ackerman; Kurtis Keller; Etta D. Pisano; Michael R. Jiroutek; Keith E. Muller; Henry Fuchs
We report the results of a randomized, controlled trial to compare the accuracy of standard ultrasound-guided needle biopsy to biopsies performed using a 3D Augmented Reality (AR) guidance system. A board-certified radiologist conducted 50 core biopsies of breast phantoms, with biopsies randomly assigned to one of the methods in blocks of five biopsies each. The raw ultrasound data from each biopsy was recorded. Another board-certified radiologist, blinded to the actual biopsy guidance mechanism, evaluated the ultrasound recordings and determined the distance of the biopsy from the ideal position. A repeated measures analysis of variance indicated that the head-mounted display method led to a statistically significantly smaller mean deviation from the desired target than did the standard display method (2.48 mm for control versus 1.62 mm for augmented reality, p<0.02). This result suggests that AR systems can offer improved accuracy over traditional biopsy guidance methods.
Presence: Teleoperators & Virtual Environments | 1997
Mark A. Livingston; Andrei State
We apply a look-up table technique to calibrate both position and orientation readings from a magnetic tracker for use in virtual environments within a defined working volume. In a test volume of 2.4 cubic meters, the method reduced the tracker's average position error by 79% and its average orientation error by 40%. We test the correction table against the tracker's performance outdoors (a metal-poor environment) and show that readings taken in our lab and corrected by our method exhibit less error than uncorrected readings taken outdoors. We demonstrate that such reduction in position error visibly improves registration in an augmented reality system, whereas the (lesser) reduction in orientation error does not visibly improve registration. We show that the model we used for the orientation error function was incorrect, preventing our method from achieving better correction of orientation error. We discuss future directions for correction of orientation error.
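A look-up-table correction of this kind can be sketched as follows (a simplified illustration under assumed conventions, handling position only): error vectors measured at the nodes of a regular 3D grid spanning the working volume are trilinearly interpolated at the raw reading and added to it.

```python
def lerp(a, b, t):
    return a + (b - a) * t

def correct_position(reading, grid_min, cell, table):
    """Correct a magnetic-tracker position reading via a look-up table.

    table[i][j][k] holds the error vector (true - reported) measured at
    grid node (i, j, k); grid_min is the volume's corner and cell the
    node spacing per axis. The correction between nodes is obtained by
    trilinear interpolation, then added to the raw reading.
    """
    dims = (len(table), len(table[0]), len(table[0][0]))
    idx, frac = [], []
    for a in range(3):
        t = (reading[a] - grid_min[a]) / cell[a]
        i = max(0, min(int(t), dims[a] - 2))  # clamp to the grid
        idx.append(i)
        frac.append(min(max(t - i, 0.0), 1.0))
    i, j, k = idx
    fx, fy, fz = frac
    node = lambda di, dj, dk: table[i + di][j + dj][k + dk]
    corr = []
    for c in range(3):  # interpolate each component of the error vector
        c00 = lerp(node(0, 0, 0)[c], node(1, 0, 0)[c], fx)
        c10 = lerp(node(0, 1, 0)[c], node(1, 1, 0)[c], fx)
        c01 = lerp(node(0, 0, 1)[c], node(1, 0, 1)[c], fx)
        c11 = lerp(node(0, 1, 1)[c], node(1, 1, 1)[c], fx)
        corr.append(lerp(lerp(c00, c10, fy), lerp(c01, c11, fy), fz))
    return [r + d for r, d in zip(reading, corr)]
```

Orientation correction is harder, as the paper notes: rotations do not interpolate component-wise, which is one reason the authors' orientation-error model fell short.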
The Visual Computer | 1999
Arthur D. Gregory; Andrei State; Ming C. Lin; Dinesh Manocha; Mark A. Livingston
We present a new approach for establishing correspondence for morphing between two homeomorphic polyhedral models. The user can specify corresponding feature pairs on the polyhedra with a simple and intuitive interface. Based on these features, our algorithm decomposes the boundary of each polyhedron into the same number of morphing patches. A 2D mapping for each morphing patch is computed in order to merge the topologies of the polyhedra one patch at a time. We create a morph by defining morphing trajectories between the feature pairs and by interpolating them across the merged polyhedron. The user interface provides high-level control, as well as local refinement to improve the morph. The implementation has been applied to several polyhedra composed of thousands of polygons. The system can also handle homeomorphic non-simple polyhedra that are not genus zero (i.e., that have holes).
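Once the two models share a merged polyhedron, the simplest morphing trajectory is a straight line per vertex, as in this minimal sketch (the paper additionally supports user-defined trajectories between feature pairs, which are not modeled here):

```python
def morph_vertices(src, dst, t):
    """Linear morphing trajectories over a merged polyhedron.

    src and dst give each merged vertex's 3D position on the source
    model (t = 0) and the target model (t = 1); each vertex travels in
    a straight line between the two as t sweeps from 0 to 1.
    """
    return [[(1.0 - t) * a + t * b for a, b in zip(p, q)]
            for p, q in zip(src, dst)]

# Halfway through the morph, each vertex sits at the midpoint.
print(morph_vertices([[0.0, 0.0, 0.0]], [[2.0, 2.0, 2.0]], 0.5))
```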
interactive 3d graphics and games | 1997
Marco C. Jacobs; Mark A. Livingston; Andrei State
Managing Latency in Complex Augmented Reality Systems. Registration (or alignment) of the synthetic imagery with the real world is crucial in augmented reality (AR) systems. The data from user-input devices, tracking devices, and imaging devices need to be registered spatially and temporally with the user's view of the surroundings. Each device has an associated delay between its observation of the world and the moment when the AR display presented to the user appears to be affected by a change in the data. We call the differences in delay the relative latencies. Relative latency is a source of misregistration and should be reduced. We give general methods for handling multiple data streams with different latency values associated with them in a working AR system. We measure the latency differences (part of the system-dependent set of calibrations), time-stamp on-host, adjust the moment of sampling, and interpolate or extrapolate data streams. By using these schemes, a more accurate and consistent view is computed and presented to the user.
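The extrapolation step can be sketched minimally (an illustrative linear predictor over one scalar channel, not the paper's full multi-stream pipeline): time-stamp each sample on the host, then predict each stream's value at the common display time so all streams describe the same instant despite their different latencies.

```python
def extrapolate(samples, t_display):
    """Predict a time-stamped stream's value at display time.

    samples: list of (timestamp, value) pairs, oldest first. Uses
    linear extrapolation from the last two samples; falls back to the
    latest value if the timestamps coincide.
    """
    (t0, v0), (t1, v1) = samples[-2], samples[-1]
    if t1 == t0:
        return v1
    rate = (v1 - v0) / (t1 - t0)  # estimated rate of change
    return v1 + rate * (t_display - t1)

# A stream rising 2 units/sec, last sampled at t=1, predicted at t=1.5.
print(extrapolate([(0.0, 0.0), (1.0, 2.0)], 1.5))
```

Extrapolating a stream forward by its measured latency trades a little prediction noise for much smaller relative latency between streams.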
VBC '96 Proceedings of the 4th International Conference on Visualization in Biomedical Computing | 1996
Henry Fuchs; Andrei State; Etta D. Pisano; William F. Garrett; Gentaro Hirota; Mark A. Livingston; Stephen M. Pizer
Augmented reality is applied to ultrasound-guided needle biopsy of the human breast. In a tracked stereoscopic head-mounted display, a physician sees the ultrasound imagery “emanating” from the transducer, properly registered with the patient and the biopsy needle. A physician has successfully used the system to guide a needle into a synthetic tumor within a breast phantom and examine a human patient in preparation for a cyst aspiration.
ieee visualization | 1996
William F. Garrett; Henry Fuchs; Andrei State
We present a method for producing real-time volume visualizations of continuously captured, arbitrarily-oriented 2D arrays (slices) of data. Our system constructs a 3D representation on-the-fly from incoming 2D ultrasound slices by modeling and rendering the slices as planar polygons with translucent surface textures. We use binary space partition (BSP) tree data structures to provide non-intersecting, visibility-ordered primitives for accurate opacity accumulation images. New in our system is a method of using parallel, time-shifted BSP trees to efficiently manage the continuously captured ultrasound data and to decrease the variability in image generation time between output frames. This technique is employed in a functioning real-time augmented reality system that a physician has used to examine human patients prior to breast biopsy procedures. We expect the technique can be used for real-time visualization of any 2D data being collected from a tracked sensor moving along an arbitrary path.
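The visibility ordering that a BSP tree provides can be sketched as follows (a minimal illustration with hypothetical names; slices are reduced to opaque payloads behind their splitting planes, and the paper's parallel time-shifted trees are not modeled): each slice contributes a splitting plane, and a recursive traversal emits slices farthest-first relative to the eye so translucent textures composite with correct opacity accumulation.

```python
class BSPNode:
    """One node per ultrasound slice. plane = (n, d) with n.x = d;
    slices in front of the plane go in `front`, behind it in `back`."""
    def __init__(self, plane, payload, front=None, back=None):
        self.plane, self.payload = plane, payload
        self.front, self.back = front, back

def side(plane, point):
    """Signed distance proxy: positive if point is in front of plane."""
    n, d = plane
    return sum(a * b for a, b in zip(n, point)) - d

def back_to_front(node, eye, out):
    """Emit payloads farthest-first: recurse into the half-space the
    eye is NOT in, emit this node's slice, then recurse into the
    eye's half-space."""
    if node is None:
        return
    if side(node.plane, eye) >= 0:   # eye in front: draw back side first
        back_to_front(node.back, eye, out)
        out.append(node.payload)
        back_to_front(node.front, eye, out)
    else:                            # eye behind: draw front side first
        back_to_front(node.front, eye, out)
        out.append(node.payload)
        back_to_front(node.back, eye, out)

# Two slices perpendicular to the x-axis, at x=1 and x=2.
leaf = BSPNode(((1, 0, 0), 2.0), "slice_x1")
root = BSPNode(((1, 0, 0), 1.0), "slice_x0", front=leaf)
order = []
back_to_front(root, (5.0, 0.0, 0.0), order)
print(order)  # farthest slice first as seen from x=5
```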
Proceedings Computer Animation '98 (Cat. No.98EX169) | 1998
Arthur D. Gregory; Andrei State; Ming C. Lin; Dinesh Manocha; Mark A. Livingston
Presents a new approach for establishing correspondence between two homeomorphic 3D polyhedral models. The user can specify corresponding feature pairs on the polyhedra with a simple and intuitive interface. Based on these features, our algorithm decomposes the boundary of each polyhedron into the same number of morphing patches. A 2D mapping for each morphing patch is computed in order to merge the topologies of the polyhedra one patch at a time. We create a morph by defining morphing trajectories between the feature pairs and by interpolating them across the merged polyhedron. The user interface provides high-level control as well as local refinement to improve the morph. The implementation has been applied to several complex polyhedra composed of thousands of polygons. The system can also handle non-simple polyhedra that have holes.