Publication


Featured research published by Johann Hummel.


Physics in Medicine and Biology | 2003

Computer-enhanced stereoscopic vision in a head-mounted operating binocular

Wolfgang Birkfellner; Michael Figl; Christian Matula; Johann Hummel; Rudolf Hanel; Herwig Imhof; Felix Wanschitz; Arne Wagner; Franz Watzinger; Helmar Bergmann

Based on the Varioscope, a commercially available head-mounted operating binocular, we have developed the Varioscope AR, a see-through head-mounted display (HMD) for augmented reality visualization that seamlessly fits into the infrastructure of a surgical navigation system. We have assessed the extent to which stereoscopic visualization improves target localization in computer-aided surgery in a phantom study. In order to quantify the depth perception of a user aiming at a given target, we have designed a phantom simulating typical clinical situations in skull base surgery. Sixteen steel spheres were fixed at the base of a bony skull, and several typical craniotomies were applied. After having taken CT scans, the skull was filled with opaque jelly in order to simulate brain tissue. The positions of the spheres were registered using VISIT, a system for computer-aided surgical navigation. Then attempts were made to locate the steel spheres with a bayonet probe through the craniotomies using VISIT and the Varioscope AR as a stereoscopic display device. Localization of targets 4 mm in diameter using stereoscopic vision and additional visual cues indicating target proximity had a success rate (defined as a first-trial hit rate) of 87.5%. Using monoscopic vision and target proximity indication, the success rate was found to be 66.6%. Omission of visual hints on reaching a target yielded a success rate of 79.2% in the stereo case and 56.25% with monoscopic vision. Time requirements for localizing all 16 targets ranged from 7.5 min (stereo, with proximity cues) to 10 min (mono, without proximity cues). Navigation error is primarily governed by the accuracy of registration in the navigation system, whereas the HMD does not appear to influence localization significantly. We conclude that stereo vision is a valuable tool in augmented reality guided interventions.


International Symposium on Mixed and Augmented Reality | 2001

Current status of the Varioscope AR, a head-mounted operating microscope for computer-aided surgery

Michael Figl; Wolfgang Birkfellner; Johann Hummel; Rudolf Hanel; Peter Homolka; Franz Watzinger; Felix Wanschitz; Rolf Ewers; Helmar Bergmann

Computer-aided surgery (CAS), the intraoperative application of biomedical visualization techniques, appears to be one of the most promising fields of application for augmented reality (AR), the display of additional computer generated graphics over a real-world scene. Typically a device such as a head-mounted display (HMD) is used for AR. However, considerable technical problems connected with AR have limited the intraoperative application of HMDs up to now. One of the difficulties in using HMDs is the requirement for a common optical focal plane for both the real-world scene and the computer generated image, and acceptance of the HMD by the user in a surgical environment. In order to increase the clinical acceptance of AR, we have adapted the Varioscope (Life Optics, Vienna), a miniature, cost-effective head-mounted operating microscope, for AR. In this work, we present the basic design of the modified HMD, and the method and results of an extensive laboratory study for photogrammetric calibration of the Varioscope's computer displays to a real-world scene. In a series of sixteen calibrations with varying zoom factors and object distances, mean calibration error was found to be 1.24 ± 0.38 pixels or 0.12 ± 0.05 mm for a 640 × 480 display. Maximum error accounted for 3.33 ± 1.04 pixels or 0.33 ± 0.12 mm. The location of a position measurement probe of an optical tracking system was transformed to the display with an error of less than 1 mm in the real world in 56% of all cases. For the remaining cases, error was below 2 mm. We conclude that the accuracy achieved in our experiments is sufficient for a wide range of CAS applications.


Medical Imaging 2001: Visualization, Display, and Image-Guided Procedures | 2001

Calibration of projection parameters in the varioscope AR, a head-mounted display for augmented-reality visualization in image-guided therapy

Wolfgang Birkfellner; Michael Figl; Klaus Huber; Johann Hummel; Rudolf Hanel; Peter Homolka; Franz Watzinger; Felix Wanschitz; Rolf Ewers; Helmar Bergmann

Computer-aided surgery (CAS), the intraoperative application of biomedical visualization techniques, appears to be one of the most promising fields of application for augmented reality (AR), the display of additional computer generated graphics over a real-world scene. Typically a device such as a head-mounted display (HMD) is used for AR. However, considerable technical problems connected with AR have limited the intraoperative application of HMDs up to now. One of the difficulties in using HMDs is the requirement for a common optical focal plane for both the real-world scene and the computer generated image, and acceptance of the HMD by the user in a surgical environment. In order to increase the clinical acceptance of AR, we have adapted the Varioscope (Life Optics, Vienna), a miniature, cost-effective head-mounted operating microscope, for AR. In this work, we present the basic design of the modified HMD, and the method and results of an extensive laboratory study for photogrammetric calibration of the Varioscope's computer displays to a real-world scene. In a series of sixteen calibrations with varying zoom factors and object distances, mean calibration error was found to be 1.24 ± 0.38 pixels or 0.12 ± 0.05 mm for a 640 x 480 display. Maximum error accounted for 3.33 ± 1.04 pixels or 0.33 ± 0.12 mm. The location of a position measurement probe of an optical tracking system was transformed to the display with an error of less than 1 mm in the real world in 56% of all cases. For the remaining cases, error was below 2 mm. We conclude that the accuracy achieved in our experiments is sufficient for a wide range of CAS applications.
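
The calibration errors quoted above are reprojection errors of a camera model fitted to known grid points. A minimal sketch of how such an error is computed (synthetic pinhole camera and data, not the authors' setup; Tsai's full model additionally estimates radial distortion):

```python
import numpy as np

def project(P, X):
    """Project world points X (N x 3) through a 3x4 camera matrix P."""
    Xh = np.hstack([X, np.ones((len(X), 1))])    # homogeneous coordinates
    x = (P @ Xh.T).T
    return x[:, :2] / x[:, 2:3]                  # perspective divide

# Hypothetical intrinsics: focal length 1000 px, principal point (320, 240)
K = np.array([[1000.0, 0, 320], [0, 1000.0, 240], [0, 0, 1]])
P = K @ np.hstack([np.eye(3), np.zeros((3, 1))])  # camera at the origin

# Synthetic calibration points 500 mm in front of the camera
X = np.array([[x, y, 500.0] for x in (-50, 0, 50) for y in (-50, 0, 50)])
observed = project(P, X) + 0.5                   # simulated 0.5 px detection offset

# Mean reprojection error: distance between predicted and detected image points
err = np.linalg.norm(project(P, X) - observed, axis=1)
print(round(err.mean(), 3))                      # → 0.707
```

The same per-point residual, converted to millimetres via the display geometry, gives figures comparable to the pixel/mm pairs reported in the abstract.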


Medical Imaging 2004: Visualization, Image-Guided Procedures, and Display | 2004

Automatic calibration of an optical see-through head-mounted display for augmented reality applications in computer assisted interventions

Michael Figl; Christopher Ede; Wolfgang Birkfellner; Johann Hummel; Rudolf Seemann; Helmar Bergmann

We are developing an optical see-through head-mounted display in which preoperative planning data provided by a computer-aided surgery system is overlaid on the optical image of the patient. In order to cope with head movements of the surgeon, the device has to be calibrated for a wide zoom and focus range. For such a calibration, accurate and robust localization of a large number of calibration points is of utmost importance. Because of the negligible radial distortion of the optics in our device, we were able to use projective invariants for stable detection of the calibration fiducials on a planar grid. The pattern on the planar grid was designed using a different cross ratio for four consecutive points in the x and y directions, respectively. For automated image processing we put a CCD camera behind the eyepiece of the device. The resulting image was thresholded and segmented; after deleting the artefacts, a Sobel edge detector was applied and the image was Hough transformed to detect the x and y axes. Then the world coordinates of fiducial points on the grid could be detected. A series of six camera calibrations with two zoom settings was done. The mean values of the errors for the two calibrations were 0.08 mm and 0.3 mm, respectively.
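
The projective invariant used above is the cross ratio of four collinear points, which is preserved by any perspective projection; this is what lets each quadruple of grid fiducials be identified regardless of viewpoint. An illustrative sketch (synthetic point positions and distortion, not the authors' grid):

```python
import numpy as np

def cross_ratio(a, b, c, d):
    """Cross ratio of four collinear points given as scalar positions."""
    return ((c - a) * (d - b)) / ((c - b) * (d - a))

def homography_1d(x, h):
    """1D projective map x -> (h00*x + h01) / (h10*x + h11), i.e. a perspective view of a line."""
    return (h[0, 0] * x + h[0, 1]) / (h[1, 0] * x + h[1, 1])

pts = np.array([0.0, 1.0, 3.0, 7.0])      # four consecutive fiducials on a grid line
H = np.array([[2.0, 1.0], [0.3, 1.5]])    # arbitrary projective distortion
img = homography_1d(pts, H)               # their positions after projection

before = cross_ratio(*pts)                # 9/7 ≈ 1.2857 in the world
after = cross_ratio(*img)                 # same value in the image
assert abs(before - after) < 1e-12        # invariant under projection
```

Because each row and column uses a distinct cross ratio, matching the value measured in the image against the known design values identifies which grid quadruple is in view.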


Medical Imaging 2003: Visualization, Image-Guided Procedures, and Display | 2003

Navigation system for flexible endoscopes

Johann Hummel; Michael Figl; Wolfgang Birkfellner; Michael Häfner; Christian Kollmann; Helmar Bergmann

Endoscopic ultrasound (EUS) features flexible endoscopes equipped with a radial or linear array scan head, allowing high-resolution examination of organs adjacent to the upper gastrointestinal tract. An optical system based on fibre glass or a CCD chip allows additional orientation. However, 3-dimensional orientation and correct identification of the various anatomical structures may be difficult. It therefore seems desirable to merge real-time US images with high-resolution CT or MR images acquired prior to EUS to simplify navigation during the intervention. The additional information provided by CT or MR images might facilitate diagnosis of tumors and, ultimately, guided puncture of suspicious lesions. We built a grid with 15 plastic spheres and measured their positions relative to five fiducial markers placed on top of the grid. For this measurement we used an optical tracking system (OTS) (Polaris, NDI, Canada). Two sensors of an electromagnetic tracking system (EMTS) (Aurora, NDI, Canada) were mounted on a flexible endoscope (Pentax GG 38 UX, USA) to enable a free-hand ultrasound calibration. To determine the position of the plastic spheres in the emitter coordinate system of the EMTS, we applied a point-to-point registration (Horn) using the coordinates of the fiducial markers in both coordinate systems (OTS and EMTS). For the transformation from the EMTS to the CT space, the Horn algorithm was applied again using the fiducial markers. Visualization was enabled by the use of the AVW-4.0 library (Biomedical Imaging Resource, Mayo Clinic, Rochester/MN, USA). To evaluate the suitability of our new navigation system, we measured the fiducial registration error (FRE) of the various registrations and the target registration error (TRE) for the complete transformation from the US space to the CT space. The FRE for the ultrasound calibration amounted to 4.3 mm ± 4.2 mm, resulting from 10 calibration procedures.
For the transformation from the OTS reference system to the EMTS emitter space we found an average FRE of 0.8 mm ± 0.2 mm. The FRE for the CT registration was 1.0 mm ± 0.3 mm. The TRE was found to be 3.8 mm ± 1.3 mm when targeting the same spheres that were used for the calibration procedure. A movement of the phantom results in higher TREs because of the orientation sensitivity of the sensor. In that case, the TRE in the area where the biopsy is supposed to take place was found to be 7.9 mm ± 3.2 mm. Our system provides the interventionist with additional information about the position and orientation of the flexible instrument in use. Additionally, it improves the accuracy of biopsies. The use of the miniaturized EMTS enables for the first time the navigation of flexible instruments in this way. For the successful application of navigation systems in interventional radiology, an accuracy in the range of 5 mm is desirable; the accuracy of localizing a point in CT space is currently about 3 mm above that requirement. One possibility to overcome this difference is to mount the two sensors in such a way that the interference of their electromagnetic fields is minimized. A considerable constraint is the small characteristic volume (360 mm × 600 mm × 600 mm), which for most applications requires an additional optical system.
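
The point-to-point registrations above use Horn's closed-form absolute orientation. A minimal sketch with an equivalent SVD-based rotation estimate (synthetic fiducials and noise, not the paper's data; Horn's original solution uses unit quaternions but yields the same rigid transform):

```python
import numpy as np

def register(src, dst):
    """Closed-form rigid registration mapping src points onto dst (Horn/Kabsch)."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)                # cross-covariance of centred points
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1, 1, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflection
    R = Vt.T @ D @ U.T
    t = cd - R @ cs
    return R, t

rng = np.random.default_rng(0)
fiducials = rng.uniform(0, 100, (5, 3))          # five synthetic fiducial markers (mm)

# Ground-truth pose: 30 degree rotation about z, then a translation
th = np.radians(30)
R_true = np.array([[np.cos(th), -np.sin(th), 0],
                   [np.sin(th),  np.cos(th), 0],
                   [0, 0, 1]])
t_true = np.array([10.0, -5.0, 20.0])
measured = fiducials @ R_true.T + t_true + rng.normal(0, 0.2, (5, 3))

R, t = register(fiducials, measured)
# FRE: mean residual distance of the fiducials after registration
fre = np.linalg.norm(fiducials @ R.T + t - measured, axis=1).mean()
print(fre)   # same order as the 0.2 mm simulated localization noise
```

The FRE figures quoted in the abstract are exactly this residual, computed per registration step; the TRE is the corresponding error measured at targets not used in the fit.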


Medical Imaging 2003: Visualization, Image-Guided Procedures, and Display | 2003

Calibration of an optical see-through head-mounted display with variable zoom and focus for applications in computer-assisted interventions

Michael Figl; Wolfgang Birkfellner; Johann Hummel; Christopher Ede; Rudolf Hanel; Helmar Bergmann

During the last few years, head-mounted displays (HMDs) have become more important in computer-assisted surgery (CAS). Rapid head movements of the surgeon require changing the focal plane and the zoom value without losing the calibration. Starting from previous work in developing an optical see-through head-mounted display, we adapted our HMD to measure the focus and zoom values. This made it possible to extend the calibration to different zoom and focus values. The case of the HMD was opened to gain access to the zoom lenses, which was necessary to measure the different zoom values. Focusing in our HMD is realized by changing the angle between the two tubes; therefore, we marked two points on the tubes to measure the focal adjustment. We made a series of planar calibrations with seven different fixed zoom and focus values using Tsai's algorithm for camera calibration. Then we used the Polaris optical tracking system (Northern Digital, Ontario, Canada) to measure the transformation from the planar calibration grid to a tracker probe rigidly mounted on the HMD. The calibration parameters transformed to this tracker probe are independent of the actual position of the calibration grid and are the parameters we want to approximate. Then least-squares approximating polynomial surfaces were derived for the seven calibration parameters. The coefficients of the polynomial surfaces were used as starting values for a nonlinear optimization procedure minimizing an overall error. Minimizing the object space error (the minimal distance between the real-world point and the line through the center of projection and the image point) in the last step of the procedure described above, we obtained a mean object space error of 0.85 ± 0.5 mm. Calibration of the HMD is not lost during minor changes in zoom and focus. This is likely the first optical see-through HMD developed for CAS with variable zoom and focus, features typical of operating microscopes.
Employing an automated calibration together with more zoom and focus steps, and a more accurate measurement of the position of the zoom lenses and the focal plane, should reduce the error significantly, enabling the accuracy needed for CAS.
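
The least-squares polynomial surfaces over zoom and focus amount to a linear fit in a monomial basis, one surface per calibration parameter. A sketch under assumed data (surface degree, parameter values, and the smooth dependence are all illustrative, not the paper's):

```python
import numpy as np

def fit_quadratic_surface(zoom, focus, param):
    """Least-squares fit param ~ a + b*z + c*f + d*z^2 + e*z*f + g*f^2."""
    A = np.column_stack([np.ones_like(zoom), zoom, focus,
                         zoom**2, zoom * focus, focus**2])
    coef, *_ = np.linalg.lstsq(A, param, rcond=None)
    return coef

def eval_surface(coef, z, f):
    """Evaluate the fitted surface at an arbitrary (zoom, focus) setting."""
    return coef @ np.array([1.0, z, f, z**2, z * f, f**2])

# Seven synthetic calibration runs: one parameter (say, a focal length)
# measured at fixed (zoom, focus) settings
zoom  = np.array([1.0, 1.0, 2.0, 2.0, 3.0, 3.0, 4.0])
focus = np.array([0.3, 0.7, 0.3, 0.7, 0.3, 0.7, 0.5])
f_len = 900 + 120 * zoom + 40 * focus            # assumed smooth dependence

coef = fit_quadratic_surface(zoom, focus, f_len)
print(round(eval_surface(coef, 2.5, 0.5), 1))    # → 1220.0 (interpolated setting)
```

Evaluating the seven fitted surfaces at the current zoom/focus reading then yields a full camera calibration for any intermediate setting, which is what keeps the HMD calibrated through minor zoom and focus changes.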


Medical Imaging 2002: Visualization, Image-Guided Procedures, and Display | 2002

Laboratory assessment of a miniature electromagnetic tracker

Johann Hummel; Wolfgang Birkfellner; Michael Figl; C. Haider; Rudolf Hanel; Helmar Bergmann

With the invention of miniaturized electromagnetic digitizers comes a variety of potential clinical applications for computer-aided interventions using flexible instruments; it has become possible to track endoscopes or catheters within the body. To evaluate the reliability of a new commercial tracking system, we measured the systematic distortions induced by various materials such as closed metallic loops, wire guides, catheters and ultrasound scan heads. The system under evaluation was the electromagnetic tracking system Aurora (Mednetix/CH, NDI/Can); data were acquired using the serial port of a PC running SuSE Linux 7.1 (SuSE GmbH, Nuernberg). The objects suspected to cause distortions were brought into the digitizer volume. Besides this, we evaluated the influence of a C-arm fluoroscopy unit. To quantify the reliability of the system, the miniaturized sensor was mounted on a nonmetallic measurement rack while the transmitter was fixed at three different distances within the digitizer range. The tracker is more sensitive to distortions caused by materials close to the emitter (average value 13.6 mm ± 16.6 mm for wire loops positioned at a distance between 100 mm and 200 mm from the emitter). Distortions caused by materials near the sensor (distances smaller than 100 mm) are small (typical error: 2.2 mm ± 1.9 mm) in comparison to the errors of other electromagnetic systems published in an earlier study of our group, where we found an average error of 3.4 mm. Considerable distortions are caused by the C-arm fluoroscopy unit, which limits the reliability of the tracker (error: 18.6 mm ± 24.9 mm). The US scan head was found to cause significant distortions only at distances between the emitter and the scan head of less than 100 mm, in contrast to the average error of 3.8 mm ± 6.3 mm at distances greater than 100 mm.
Taking into account that significant distortions only occur in the presence of metallic objects close to the emitter, these results indicate the opportunities which are now available in surgical applications where flexible instruments may need to be monitored within the patient.


Medical Imaging 2002: Visualization, Image-Guided Procedures, and Display | 2002

PC-based control unit for a head-mounted operating microscope for augmented-reality visualization in surgical navigation

Michael Figl; Wolfgang Birkfellner; Franz Watzinger; Felix Wanschitz; Johann Hummel; Rudolf Hanel; Rolf Ewers; Helmar Bergmann

Two main concepts of head-mounted displays (HMDs) for augmented reality (AR) visualization exist: the optical and the video see-through type. Several research groups have pursued both approaches for utilizing HMDs for computer-aided surgery. While the hardware requirements for a video see-through HMD to achieve acceptable time delay and frame rate seem to be enormous, the clinical acceptance of such a device is doubtful from a practical point of view. Starting from previous work in displaying additional computer-generated graphics in operating microscopes, we have adapted a miniature head-mounted operating microscope for AR by integrating two very small computer displays. To calibrate the projection parameters of this so-called Varioscope AR, we have used Tsai's algorithm for camera calibration. Connection to a surgical navigation system was performed by defining an open interface to the control unit of the Varioscope AR. The control unit consists of a standard PC with a dual-head graphics adapter to render and display the desired augmentation of the scene. We connected this control unit to a computer-aided surgery (CAS) system via a TCP/IP interface. In this paper we present the control unit for the HMD and its software design. We tested two different optical tracking systems: the Flashpoint (Image Guided Technologies, Boulder, CO), which provided about 10 frames per second, and the Polaris (Northern Digital, Ontario, Canada), which provided at least 30 frames per second, both with a time delay of one frame.
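
An open TCP/IP interface of this kind boils down to the navigation system streaming tracked poses to the rendering PC over a socket. A minimal sketch of such an exchange (the message format, JSON-lines with a translation vector and quaternion, is an assumption for illustration; the paper does not publish its wire protocol):

```python
import json
import socket
import threading

def pose_server(host="127.0.0.1", port=0):
    """Serve one tracking pose per connection as a JSON line (illustrative)."""
    srv = socket.create_server((host, port))
    def handle():
        conn, _ = srv.accept()
        with conn:
            # Hypothetical pose: translation in mm, orientation as a unit quaternion
            pose = {"t": [10.0, -5.0, 20.0], "q": [1.0, 0.0, 0.0, 0.0]}
            conn.sendall((json.dumps(pose) + "\n").encode())
    threading.Thread(target=handle, daemon=True).start()
    return srv.getsockname()[1]          # OS-assigned port

# Client side: the control unit reads the pose and would update the rendering
port = pose_server()
with socket.create_connection(("127.0.0.1", port)) as c:
    msg = json.loads(c.makefile().readline())
print(msg["t"])   # → [10.0, -5.0, 20.0]
```

At the frame rates quoted in the abstract (10-30 poses per second), a plain blocking socket like this is ample; the one-frame delay comes from the tracker, not the transport.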


Medical Imaging 2002: Visualization, Image-Guided Procedures, and Display | 2002

Stereoscopic visualization in the varioscope AR: a see-through head-mounted display for surgical navigation

Wolfgang Birkfellner; Michael Figl; Christian Matula; Johann Hummel; Rudolf Hanel; Herwig Imhof; Felix Wanschitz; Arne Wagner; Franz Watzinger; Helmar Bergmann

Based on the Varioscope, a commercially available head-mounted operating binocular, we have developed a head-mounted display for augmented reality visualization that seamlessly fits into the infrastructure of a surgical navigation system. This head-mounted display, called the Varioscope AR, is equipped with two miniature computer monitors that merge computer graphics with the view of the operating field as seen by the surgeon. Since the position of the Varioscope AR is being tracked by the navigation system's optical tracker, planning data such as the location of a lesion identified on preoperative volume images can be displayed in the correct position, transparently overlaying the optical field of view. In order to assess the system's accuracy and the depth perception of a user aiming at a given target, we have designed a phantom for skull base surgery. Sixteen steel spheres were fixed at the base of a bony skull, and several typical craniotomies were applied. After having taken CT scans, the skull was filled with opaque gelatine in order to simulate brain tissue. The positions of the spheres were registered using VISIT, a system for computer-aided surgical navigation. Then attempts were made to locate the steel spheres with a bayonet probe through the craniotomies using VISIT and the Varioscope AR as a stereoscopic display device. Localization using stereoscopic vision with this novel device had a success rate (defined as a first-trial hit rate) of 81.5%. Using monoscopic vision, the success rate was found to be 50%. We conclude that the Varioscope AR is now mature for further cadaver tests and clinical studies.


Medical Imaging 2006: Visualization, Image-Guided Procedures, and Display | 2006

Endoscopic navigation system using 2D/3D registration

Johann Hummel; Michael Figl; Helmar Bergmann; Wolfgang Birkfellner

The paper describes a computer-aided navigation system using image fusion to support endoscopic interventions such as the accurate collection of biopsy specimens. In particular, an endoscope which provides the physician with a real-time ultrasound (US) and a video image is equipped with an electromagnetic tracking sensor. An image slice that corresponds to the actual image of the US scan head is derived from a preoperative computed tomography (CT) volume data set by means of oblique reformatting. Both views are displayed side by side. The position of the image acquired by the US scan head is determined by the miniaturized electromagnetic tracking system (EMTS) after applying a calibration to the endoscope's scan head. The relative orientation between the patient coordinate system and a preoperative data set (such as a CT or magnetic resonance (MR) image) is derived from a 2D/3D registration. This was achieved by calibrating an interventional CT slice by means of an optical tracking system (OTS) using the same algorithm as for the US calibration. Then the interventional CT slice is used for a 2D/3D registration into the coordinate system of the preoperative CT. The fiducial registration error (FRE) for the US calibration amounted to 3.6 mm ± 2.0 mm. For the interventional CT we found an FRE of 0.36 ± 0.12 mm. The error for the 2D/3D registration was 2.3 ± 0.5 mm. The point-to-point registration between the OTS and the EMTS was accomplished with an FRE of 0.6 mm.
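
The registration chain described above (US image → EMTS → OTS → preoperative CT) composes as a product of rigid 4×4 homogeneous transforms, so each calibration or registration result is one matrix and a US pixel is mapped to CT with a single multiplication. A sketch with synthetic transforms (the rotation angles and translations are placeholders, not the paper's values):

```python
import numpy as np

def rigid(rz_deg, t):
    """4x4 homogeneous transform: rotation about z by rz_deg, then translation t."""
    th = np.radians(rz_deg)
    T = np.eye(4)
    T[:3, :3] = [[np.cos(th), -np.sin(th), 0],
                 [np.sin(th),  np.cos(th), 0],
                 [0, 0, 1]]
    T[:3, 3] = t
    return T

# Assumed calibration/registration results (synthetic)
T_us_to_emts  = rigid(15, [5, 0, 2])       # US scan-head calibration
T_emts_to_ots = rigid(-40, [100, 20, 0])   # point-to-point OTS/EMTS registration
T_ots_to_ct   = rigid(90, [0, -30, 10])    # 2D/3D registration result

# Full chain: note the right-to-left composition order
T_us_to_ct = T_ots_to_ct @ T_emts_to_ots @ T_us_to_emts

p_us = np.array([1.0, 2.0, 3.0, 1.0])      # point in the US image (homogeneous, mm)
p_ct = T_us_to_ct @ p_us
print(p_ct[:3])                            # the same point in CT coordinates
```

Because each link in the chain carries its own FRE, the errors reported in the abstract accumulate along this product, which is why the end-to-end US-to-CT error exceeds any single registration's error.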

Collaboration


Top Co-Authors

Wolfgang Birkfellner
Medical University of Vienna

Michael Figl
Medical University of Vienna

Franz Watzinger
Medical University of Vienna

Rolf Ewers
Vienna General Hospital

Arne Wagner
Medical University of Vienna