
Publications


Featured research published by Jeremy D. Ackerman.


Medical Image Analysis | 2002

Augmented reality guidance for needle biopsies: an initial randomized, controlled trial in phantoms.

Michael H. Rosenthal; Andrei State; Joohi Lee; Gentaro Hirota; Jeremy D. Ackerman; Kurtis Keller; Etta D. Pisano; Michael R. Jiroutek; Keith E. Muller; Henry Fuchs

We report the results of a randomized, controlled trial to compare the accuracy of standard ultrasound-guided needle biopsy to biopsies performed using a 3D Augmented Reality (AR) guidance system. A board-certified radiologist conducted 50 core biopsies of breast phantoms, with biopsies randomly assigned to one of the methods in blocks of five biopsies each. The raw ultrasound data from each biopsy was recorded. Another board-certified radiologist, blinded to the actual biopsy guidance mechanism, evaluated the ultrasound recordings and determined the distance of the biopsy from the ideal position. A repeated measures analysis of variance indicated that the head-mounted display method led to a statistically significantly smaller mean deviation from the desired target than did the standard display method (2.48 mm for control versus 1.62 mm for augmented reality, p<0.02). This result suggests that AR systems can offer improved accuracy over traditional biopsy guidance methods.
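The accuracy measure in this trial, the deviation of each biopsy core from the ideal target position, is a per-biopsy Euclidean distance averaged per method. A minimal sketch of that computation (the deviations below are made up, chosen only to reproduce the reported means; the study's actual analysis was a repeated measures ANOVA on blinded readings):

```python
import math

def deviation_mm(biopsy_xyz, target_xyz):
    """Euclidean distance (mm) between the sampled core and the ideal target."""
    return math.dist(biopsy_xyz, target_xyz)

# Hypothetical per-biopsy deviations (mm) for the two guidance methods;
# the real values came from a blinded radiologist's reading of the
# recorded ultrasound.
standard = [2.1, 3.0, 2.6, 2.4, 2.3]
augmented = [1.5, 1.8, 1.4, 1.7, 1.7]

mean_std = sum(standard) / len(standard)
mean_ar = sum(augmented) / len(augmented)
print(f"standard: {mean_std:.2f} mm, AR: {mean_ar:.2f} mm")
# prints "standard: 2.48 mm, AR: 1.62 mm"
```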


medical image computing and computer assisted intervention | 2001

Augmented Reality Guidance for Needle Biopsies: A Randomized, Controlled Trial in Phantoms

Michael H. Rosenthal; Andrei State; Joohi Lee; Gentaro Hirota; Jeremy D. Ackerman; Kurtis Keller; Etta D. Pisano; Michael R. Jiroutek; Keith E. Muller; Henry Fuchs

We report the results of a randomized, controlled trial to compare the accuracy of standard ultrasound-guided needle biopsy to biopsies performed using a 3D Augmented Reality (AR) guidance system. Fifty core biopsies of breast phantoms were conducted by a board-certified radiologist, with each set of five biopsies randomly assigned to one of the methods. The raw ultrasound data from each biopsy was recorded. Another board-certified radiologist, blinded to the actual biopsy guidance mechanism, evaluated the ultrasound recordings and determined the distance of the biopsy from the ideal position. A repeated measures analysis of variance indicated that the head-mounted display method led to a statistically significantly smaller mean deviation from the desired target than did the CRT display method (2.48 mm for control versus 1.62 mm for augmented reality, p < 0.02). This result suggests that AR systems can offer improved accuracy over traditional biopsy guidance methods.


international symposium on mixed and augmented reality | 2001

Dynamic virtual convergence for video see-through head-mounted displays: maintaining maximum stereo overlap throughout a close-range work space

Andrei State; Jeremy D. Ackerman; Gentaro Hirota; Joohi Lee; Henry Fuchs

We present a technique that allows users of video see-through head-mounted displays to work at close range without the typical loss of stereo perception due to reduced nasal-side stereo overlap in most of today's commercial HMDs. Our technique dynamically selects parts of the imaging frustums acquired by wide-angle head-mounted cameras and re-projects them for the narrower field-of-view displays. In addition to dynamically maintaining maximum stereo overlap for objects at a heuristically estimated working distance, it also reduces the accommodation-vergence conflict, at the expense of a newly introduced disparity-vergence conflict. We describe the hardware (assembled from commercial components) and software implementation of our system and report on our experience while using this technique within two different AR applications. (Plus color plates.)
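The re-projection step above can be pictured with simple pinhole geometry: converging both virtual cameras on the estimated working distance amounts to an inward rotation per eye, approximated as a horizontal shift of the sub-frustum cropped from each wide-angle camera image. A minimal sketch; the interpupillary distance, working distance, and focal length are illustrative values, not the paper's parameters:

```python
import math

def convergence_angle_deg(ipd_mm: float, distance_mm: float) -> float:
    """Per-eye inward rotation needed to converge both eyes on a point
    at the given working distance (symmetric geometry)."""
    return math.degrees(math.atan((ipd_mm / 2) / distance_mm))

def crop_shift_px(ipd_mm: float, distance_mm: float, focal_px: float) -> float:
    """Horizontal shift (pixels) of the sub-frustum cropped from the
    wide-angle camera image, approximating the re-projection as a
    pinhole-camera image translation."""
    return focal_px * (ipd_mm / 2) / distance_mm

# At a 300 mm working distance with a 64 mm interpupillary distance:
print(convergence_angle_deg(64, 300))   # ~6.1 degrees per eye
print(crop_shift_px(64, 300, 800))      # ~85 px inward crop per eye
```

As the working distance shrinks, the required shift grows, which is exactly why a fixed-convergence HMD loses stereo overlap at close range.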


electronic imaging | 2000

Real-time structured light depth extraction

Kurtis Keller; Jeremy D. Ackerman

Gathering depth data using structured light is an established procedure for many different environments and uses. Many of these systems are used instead of laser line scanning because of their speed. However, to utilize depth extraction for some applications, in our case laparoscopic surgery, the depth extraction must be in real time. We have developed an apparatus that speeds up the raw image display and grabbing in structured light depth extraction from 30 frames per second to 60 and 180 frames per second. This results in an updated depth and texture map about 15 times per second versus about 3. This increased update rate allows for real-time depth extraction for use in augmented medical/surgical applications. Our miniature, fist-sized projector utilizes an internal ferro-reflective LCD display that is illuminated with cold light from a flex light pipe. The miniature projector, attachable to a laparoscope, displays inverted pairs of structured light into the body, where these images are then viewed by a high-speed camera set slightly off axis from the projector that grabs images synchronously. The images from the camera are ported to a graphics-processing card, where six frames are worked on simultaneously to extract depth and create mapped textures from these images. This information is then sent to the host computer with 3D coordinate information of the projector/camera and the associated textures. The surgeon is then able to view body images in real time from different locations without physically moving the laparoscope imager/projector, thereby reducing the trauma of moving laparoscopes in the patient.
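The inverted-pattern scheme above lends itself to a simple per-pixel decode: comparing a pixel's brightness under a stripe pattern with its brightness under the inverse pattern tells lit from unlit regardless of surface albedo or ambient light, and the recovered stripe position can then be triangulated against the projector. A minimal sketch; the function names, synthetic images, and rectified pinhole geometry are assumptions for illustration, not the paper's implementation:

```python
def decode_bit(img_pattern, img_inverted):
    """Classify each pixel as lit (1) or unlit (0) by comparing its
    brightness under a stripe pattern and under the inverse pattern;
    the pairwise comparison cancels out albedo and ambient light."""
    return [[1 if p > q else 0 for p, q in zip(row_p, row_q)]
            for row_p, row_q in zip(img_pattern, img_inverted)]

def depth_from_stripe(stripe_x_cam, stripe_x_proj, baseline_mm, focal_px):
    """Triangulate depth from the offset between where the camera sees a
    stripe and where the projector emitted it (rectified pinhole
    approximation: z = f * b / disparity)."""
    disparity = stripe_x_cam - stripe_x_proj
    return focal_px * baseline_mm / disparity

# Two tiny synthetic captures: a stripe pattern and its inverse.
lit = [[200, 30, 210, 25], [190, 40, 205, 20]]
inv = [[40, 180, 35, 190], [50, 170, 30, 185]]
print(decode_bit(lit, inv))   # [[1, 0, 1, 0], [1, 0, 1, 0]]
```

Several such bits, one per projected pattern pair, combine into a per-pixel stripe index, which is what the triangulation step consumes.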


electronic imaging | 2002

Surface reconstruction of abdominal organs using laparoscopic structured light for augmented reality

Jeremy D. Ackerman; Kurtis Keller; Henry Fuchs

Creation of accurate surface models of abdominal organs is essential for many developing technologies in medicine and surgery. One application we are working towards is augmented reality (AR) visualization for laparoscopic surgery. Our current system meets some, but not all, of the requirements. We use two custom-built laparoscopes, a custom-built miniature projector, a standard camera, and a standard video capture and processing card to implement a laparoscopic structured light range acquisition system. We will briefly show the custom hardware but will emphasize the structured light depth extraction techniques used for the unique properties of surfaces inside the body, particularly dealing with specular reflections. In early experiments, we studied the effectiveness of our algorithm in highly specular environments by creating range images of fresh animal organs. These experiments used a large projector, open abdomens, and offline image processing. We report the results of experiments using our miniature projector and online processing.


medicine meets virtual reality | 2003

Stereo imagery from the UNC augmented reality system for breast biopsy guidance.

Andrei State; Kurtis Keller; Michael H. Rosenthal; Hua Yang; Jeremy D. Ackerman; Henry Fuchs

This paper shows a number of stereoscopic images depicting the UNC augmented reality guidance system for medical visualization in operation.


medicine meets virtual reality | 2001

Real-time anatomical 3D image extraction for laparoscopic surgery.

Jeremy D. Ackerman; Kurtis Keller; Henry Fuchs

Progress in the application of augmented reality to laparoscopic surgery has been limited by the difficulty associated with generating geometric information about the current patient in real time. Structured light techniques are well known methods for generating range images using a camera and projector, but typically fail when faced with biological specimens. We describe techniques and equipment that have shown promise for acquisition of range images for use in a real-time augmented reality system for laparoscopic surgery.


Proceedings of SPIE - The International Society for Optical Engineering | 2002

Switched pattern laser projection for real-time depth extraction and visualization through endoscopes

Kurtis Keller; Jeremy D. Ackerman; Henry Fuchs

Gathering depth information through an endoscope or laparoscope during surgical or other procedures is quite difficult. Stereo laparoscopes exist, but generating three-dimensional models with them is very difficult. Accurate real-time generation of three-dimensional models through a laparoscope is a needed technology to enable a wide range of surgical applications. We have designed a miniature laparoscopic optical system consisting of a single laser whose pattern is modulated and which uses the laparoscope as the optical display path into the body. Two cameras, one sensitive to the laser light and the other for full-color imaging, share this same tube as the laser projector but use the light from the opposite direction. The images gathered by the laser-sensitive camera are used to generate a three-dimensional map, and the color image is used to acquire the corresponding texture map. High-speed image processing hardware is used to generate 3D information using a structured light technique. The user can then re-render the acquired scene in 3D. The optical system is divided into a removable upper half consisting of the cameras, laser, digital light switches, and combining optics. The lower half is the laparoscope or endoscope that can be sterilized. There can be several variations in the configuration of the laparoscope optical half, tailored to different procedures.


Archive | 1999

Methods and systems for real-time structured light depth extraction and endoscope using real-time structured light depth extraction

Kurtis Keller; Jeremy D. Ackerman; Michael H. Rosenthal; Henry Fuchs; Andrei State


Archive | 2007

System and method of providing real-time dynamic imagery of a medical procedure site using multiple modalities

Sharif Razzaque; Kurtis Keller; Andrei State; Caroline Green; Jeremy D. Ackerman

Collaboration


Dive into Jeremy D. Ackerman's collaboration.

Top co-authors:

- Kurtis Keller, University of North Carolina at Chapel Hill
- Henry Fuchs, University of North Carolina at Chapel Hill
- Andrei State, University of North Carolina at Chapel Hill
- Gentaro Hirota, University of North Carolina at Chapel Hill
- Joohi Lee, University of North Carolina at Chapel Hill
- Etta D. Pisano, Medical University of South Carolina
- Michael R. Jiroutek, University of North Carolina at Chapel Hill
- Hua Yang, University of North Carolina at Chapel Hill