Network


Latest external collaborations at the country level. Dive into the details by clicking on the dots.

Hotspot


Dive into the research topics where Jochen Penne is active.

Publication


Featured research published by Jochen Penne.


Computer Vision and Pattern Recognition | 2008

3-D gesture-based scene navigation in medical imaging applications using Time-of-Flight cameras

Stefan Soutschek; Jochen Penne; Joachim Hornegger; Johannes Kornhuber

For many applications, and particularly for medical intra-operative applications, the exploration of and navigation through 3-D image data provided by sensors like ToF (time-of-flight) cameras, MUSTOF (multisensor time-of-flight) endoscopes or CT (computed tomography) [8] requires a user interface which avoids physical interaction with an input device. Thus, we propose a touchless user interface based on gestures classified from the data provided by a ToF camera. Reasonable and necessary user interactions are described. For those interactions a suitable set of gestures is introduced. A user interface is then proposed which interprets the current gesture and performs the assigned functionality. For evaluating the quality of the developed user interface we considered the aspects of classification rate, real-time applicability, usability, intuitiveness and training time. The results of our evaluation show that our system, which provides a classification rate of 94.3% at a frame rate of 11 frames per second, satisfactorily addresses all these quality requirements.
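The abstract describes an interface that maps a classified gesture plus the 3-D hand position to a navigation action. A minimal sketch of that mapping is given below; the gesture names, the view-state layout and the scaling factors are illustrative assumptions, not the authors' actual design.

```python
# Hypothetical sketch: map a classified gesture and a 3-D hand position
# to a scene-navigation action. Gesture labels and gains are assumptions.

def navigate(state, gesture, hand_xyz):
    """Update a simple view state from a classified gesture and hand position."""
    x, y, z = hand_xyz
    if gesture == "fist":          # grab: translate the scene with the hand
        state["tx"] += x
        state["ty"] += y
    elif gesture == "flat_hand":   # push/pull: zoom along the viewing axis
        state["zoom"] *= 1.0 + 0.1 * z
    elif gesture == "point":       # pointing: rotate around the vertical axis
        state["rot_y"] += 5.0 * x
    return state

view = {"tx": 0.0, "ty": 0.0, "zoom": 1.0, "rot_y": 0.0}
view = navigate(view, "flat_hand", (0.0, 0.0, 1.0))
print(view["zoom"])  # 1.1
```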


Medical Image Computing and Computer-Assisted Intervention | 2009

Time-of-Flight 3-D Endoscopy

Jochen Penne; Kurt Höller; Michael Stürmer; Thomas Schrauder; Armin Schneider; Rainer Engelbrecht; Hubertus Feußner; Bernhard Schmauss; Joachim Hornegger

This paper describes the first accomplishment of the Time-of-Flight (ToF) measurement principle via endoscope optics. The applicability of the approach is verified by in-vitro experiments. Off-the-shelf ToF camera sensors enable the per-pixel, on-chip, real-time, marker-less acquisition of distance information. The transfer of the emerging ToF measurement technique to endoscope optics is the basis for a new generation of rigid or flexible ToF 3-D endoscopes. No modification of the endoscope optics itself is necessary; only the illumination unit and image sensor need to be enhanced. The major contribution of this paper is threefold: first, the accomplishment of the ToF measurement principle via endoscope optics; second, the development and validation of a complete calibration and post-processing routine; third, the accomplishment of extensive in-vitro experiments. Currently, a depth measurement precision of 0.89 mm at 20 fps with 3072 3-D points is achieved.
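The ToF measurement principle the paper transfers to endoscope optics recovers distance from the phase shift of modulated light. The standard continuous-wave relation can be sketched as follows; this is the generic formula, not anything specific to the endoscopic calibration routine described above.

```python
import math

C = 299_792_458.0  # speed of light [m/s]

def tof_distance(phase_shift_rad, f_mod_hz):
    """Distance from the measured phase shift of modulated light:
    d = c * dphi / (4 * pi * f_mod). Generic CW-ToF relation."""
    return C * phase_shift_rad / (4.0 * math.pi * f_mod_hz)

# A phase shift of pi at 20 MHz modulation corresponds to half the
# unambiguous range, c / (4 * f_mod):
d = tof_distance(math.pi, 20e6)
print(round(d, 3))  # 3.747 (metres)
```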


Medical Physics | 2008

Time-of-flight sensor for respiratory motion gating

Christian Schaller; Jochen Penne; Joachim Hornegger

In this technical note we present a system that uses time-of-flight (ToF) technology to acquire a real-time multi-dimensional respiratory signal from a 3D surface reconstruction of the patient's chest and abdomen without the use of markers. Using ToF sensors it is feasible to acquire a 3D model in real time with a single sensor. An advantage of ToF sensors is that their high lateral resolution makes it possible to define multiple regions of interest to compute an anatomy-adaptive multi-dimensional respiratory signal. We evaluated the new approach by comparing a ToF-based respiratory signal with the signal acquired by a commercially available external respiratory gating system and achieved an average correlation coefficient of 0.88.
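The core idea, a per-region respiratory signal taken from the reconstructed surface, can be sketched as below: average the surface distance inside each region of interest over time. The ROI layout and the synthetic data are assumptions for illustration only.

```python
import numpy as np

def respiratory_signal(depth_frames, rois):
    """Per-ROI respiratory signal: mean surface distance inside each
    region of interest (r0:r1, c0:c1) over time. A minimal sketch of
    the marker-less idea described above."""
    signal = np.empty((len(depth_frames), len(rois)))
    for t, frame in enumerate(depth_frames):
        for k, (r0, r1, c0, c1) in enumerate(rois):
            signal[t, k] = frame[r0:r1, c0:c1].mean()
    return signal

# Synthetic chest surface moving sinusoidally ("breathing").
t = np.arange(100)
frames = [2.0 + 0.01 * np.sin(2 * np.pi * ti / 25) * np.ones((64, 64)) for ti in t]
sig = respiratory_signal(frames, rois=[(0, 32, 0, 64), (32, 64, 0, 64)])
# Both ROIs see the same motion here, so their signals correlate perfectly.
r = np.corrcoef(sig[:, 0], sig[:, 1])[0, 1]
print(round(r, 2))  # 1.0
```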


Proceedings of SPIE | 2009

Time-of-flight sensor for patient positioning

Christian Schaller; André Adelt; Jochen Penne; Joachim Hornegger

In this paper we present a system that uses Time-of-Flight (ToF) technology to correct the position of a patient with respect to a previously acquired reference surface. A ToF sensor enables the acquisition of a 3-D surface model containing more than 25,000 points using a single sensor in real time. One advantage of this technology is that the high lateral resolution makes it possible to accurately compute the translation and rotation of the patient with respect to a reference surface. We use an Iterative Closest Point (ICP) algorithm to determine the 6 degrees of freedom (DOF) vector. Current results show that for rigid phantoms it is possible to obtain an accuracy of 2.88 mm and 0.28°, respectively. Tests with human subjects validate the robustness and stability of the proposed system: we achieve a mean registration error of 3.38 mm. Potential applications for this system can be found in radiotherapy or multimodal image acquisition with different devices.
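Each ICP iteration reduces, given a point correspondence, to the closed-form least-squares rigid transform (Kabsch/SVD). A sketch of that inner step, on a noise-free toy surface, is shown below; the full ICP loop with nearest-neighbour matching is omitted.

```python
import numpy as np

def best_rigid_transform(src, dst):
    """Least-squares R, t with dst ≈ src @ R.T + t (Kabsch/SVD).
    This is the closed-form step inside each ICP iteration."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    H = (src - mu_s).T @ (dst - mu_d)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = mu_d - R @ mu_s
    return R, t

# Recover a known 10-degree rotation about z and a translation.
rng = np.random.default_rng(0)
src = rng.normal(size=(200, 3))
theta = np.deg2rad(10)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
dst = src @ R_true.T + np.array([1.0, -2.0, 0.5])
R, t = best_rigid_transform(src, dst)
print(np.allclose(R, R_true))  # True
```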


IEEE International Conference on Automatic Face & Gesture Recognition | 2008

Robust real-time 3D time-of-flight based gesture navigation

Jochen Penne; Stefan Soutschek; Lukas Fedorowicz; Joachim Hornegger

Contactless human-machine interfaces (HMIs) are an important issue in various applications where haptic interaction with an input device is not possible or not appropriate. Newly developed Time-of-Flight cameras provide 3D information of the observed scene in real time at constant lateral resolutions of thousands of pixels. Additionally, a gray-value image of the observed scene is available. Our work comprises three major contributions: first, the robust and real-time capable segmentation of the hand by incorporating 3D and gray-value information; second, the reliable classification of the performed static gesture using robust features; third, the design of an HMI which uses the classified gesture as well as the 3D position of the hand to enable complex and convenient user interactions. The benefit of using a ToF camera is that the 3D information is simply not available from classical 2D camera systems; thus, only with ToF cameras can the three degrees of freedom available for non-haptic interactions be fully exploited. Currently, classification rates of 98.2% (user-dependent) and 94.3% (user-independent) are achieved for 6 gestures. Tests with untrained persons yielded a good to very good acceptance of the HMI.
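The first contribution, segmenting the hand by combining 3D and gray-value information, can be sketched as a joint threshold: keep pixels that are both close to the camera and sufficiently reflective. The thresholds below are illustrative assumptions, not the paper's values.

```python
import numpy as np

def segment_hand(depth, gray, depth_max=0.6, gray_min=50):
    """Minimal sketch of hand segmentation from range and gray-value data:
    keep pixels that are near the camera AND reflect enough light.
    Thresholds are illustrative assumptions."""
    return (depth < depth_max) & (gray > gray_min)

depth = np.full((4, 4), 2.0)   # background 2 m away
gray = np.full((4, 4), 30)
depth[1:3, 1:3] = 0.4          # hand at 0.4 m ...
gray[1:3, 1:3] = 120           # ... reflecting brightly
mask = segment_hand(depth, gray)
print(int(mask.sum()))  # 4 hand pixels
```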


Medical Image Computing and Computer-Assisted Intervention | 2009

Inverse C-arm Positioning for Interventional Procedures Using Real-Time Body Part Detection

Christian Schaller; Christopher Rohkohl; Jochen Penne; Michael Stürmer; Joachim Hornegger

The automation and speedup of interventional therapy and diagnostic workflows is a crucial issue. One way to improve these workflows is to accelerate the image acquisition procedures by fully automating the patient setup. This paper describes a system that performs this task without the use of markers or other prior assumptions. It returns metric coordinates of the 3-D body shape in real time for inverse positioning. This is achieved by the application of an emerging technology, the Time-of-Flight (ToF) sensor. A ToF sensor is a cost-efficient, off-the-shelf camera which provides more than 40,000 3-D points in real time. The first contribution of this paper is the incorporation of this novel imaging technology (ToF) into interventional imaging. The second contribution is the ability of a C-arm system to position itself with respect to the patient prior to the acquisition. We use the 3-D surface information of the patient to partition the body into anatomical sections. This is achieved by a fast two-stage classification process. The system computes the isocenter for each detected region. To verify our system we performed several tests on the isocenter of the head. Firstly, the reproducibility of the head isocenter computation was evaluated. We achieved an accuracy of (x: 1.73 ± 1.11 mm, y: 1.87 ± 1.31 mm, z: 2.91 ± 2.62 mm). Secondly, a C-arm head scan of a body phantom was set up. Our system automatically aligned the isocenter of the head with the C-arm isocenter. Here we achieved an accuracy of ±1 cm, which is within the accuracy of the patient table control.
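Once a body region has been detected, a per-region isocenter can be estimated from its surface points. A very reduced sketch, using the centroid of the region's 3-D points as the estimate, follows; the paper's two-stage body-part classifier that selects the points is not reproduced, and the centroid is an assumed simplification.

```python
import numpy as np

def region_isocenter(points_mm):
    """Sketch: estimate a region's isocenter as the centroid of its
    3-D surface points (in mm). Selecting the points per anatomical
    region (the two-stage classifier) is omitted."""
    return np.asarray(points_mm).mean(axis=0)

# Toy "head" point cloud centred near (0, 0, 900) mm.
rng = np.random.default_rng(1)
head = rng.normal(loc=[0.0, 0.0, 900.0], scale=30.0, size=(500, 3))
iso = region_isocenter(head)
# C-arm alignment would then drive the offset between this isocenter
# and the C-arm isocenter to zero.
print(np.linalg.norm(iso - [0, 0, 900]) < 10)  # True
```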


Computer Vision and Pattern Recognition | 2008

Standardization of intensity-values acquired by Time-of-Flight-cameras

Michael Stürmer; Jochen Penne; Joachim Hornegger

The intensity images captured by time-of-flight (ToF) cameras are biased in several ways. The values differ significantly depending on the integration time set within the camera and on the distance of the scene. Whereas the integration time leads to an almost linear scaling of the whole image, the attenuation due to distance is nonlinear, resulting in higher intensities for objects closer to the camera. Background regions that are farther away contain comparably low values, leading to bad contrast within the image. Another effect is that some specularity may be observed due to uncommon reflection conditions at some points within the scene. These three effects lead to intensity images whose values differ significantly depending on the integration time of the camera and the distance to the scene, thus making the parameterization of processing steps like edge detection, segmentation, registration and threshold computation a tedious task. Additionally, outliers with exceptionally high values lead to insufficient visualization results and problems in processing. In this work we propose scaling techniques which generate images whose intensities are independent of the integration time of the camera and the measured distance. Furthermore, a simple approach for reducing specularity effects is introduced.
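A standardization of this kind can be sketched directly from the two effects named in the abstract: divide out the (roughly linear) integration-time scaling and compensate the distance attenuation. The 1/d² attenuation model below is an assumption; the paper's exact scaling functions may differ.

```python
import numpy as np

def standardize_intensity(intensity, distance, t_int):
    """Sketch of intensity standardization: intensity is assumed to scale
    linearly with integration time and to fall off as 1/d^2 with
    distance, so we multiply by d^2 and divide by t_int. The actual
    calibration functions in the paper may differ."""
    return intensity * distance**2 / t_int

# The same surface seen at 1 m and 2 m with different integration times:
# raw values differ by a factor of 8, standardized values agree.
i_near = standardize_intensity(np.array([400.0]), 1.0, t_int=2.0)
i_far = standardize_intensity(np.array([50.0]), 2.0, t_int=1.0)
print(i_near[0] == i_far[0])  # True
```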


Proceedings of SPIE | 2011

Adaptive bilateral filter for image denoising and its application to in-vitro Time-of-Flight data

Alexander Seitel; Thiago R. Dos Santos; Sven Mersmann; Jochen Penne; Anja Groch; Kwong Yung; Ralf Tetzlaff; Hans-Peter Meinzer; Lena Maier-Hein

Image-guided therapy systems generally require registration of pre-operative planning data with the patient's anatomy. One common approach to achieve this is to acquire intra-operative surface data and match it to surfaces extracted from the planning image. Although increasingly popular for surface generation in general, the novel Time-of-Flight (ToF) technology has not yet been applied in this context. This may be attributed to the fact that ToF range images are subject to considerable noise. The contribution of this study is two-fold. Firstly, we present an adaptation of the well-known bilateral filter for denoising ToF range images based on the noise characteristics of the camera. Secondly, we assess the quality of organ surfaces generated from ToF range data with and without bilateral smoothing, using corresponding high-resolution CT data as ground truth. According to an evaluation on five porcine organs, the root mean squared (RMS) distance between the denoised ToF data points and the reference computed tomography (CT) surfaces ranged from 3.0 mm (lung) to 9.0 mm (kidney). This corresponds to an error reduction of up to 36% compared to the error of the original ToF surfaces.
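The bilateral filter at the core of this work weights neighbours both by spatial distance and by range difference, so noise is smoothed while depth edges survive. A minimal 1-D sketch follows; an adaptive variant in the spirit of the paper would set the range sigma per pixel from the camera's noise model, whereas here it is a fixed assumption.

```python
import numpy as np

def bilateral_1d(signal, sigma_s=2.0, sigma_r=1.0, radius=4):
    """1-D bilateral filter: each sample becomes a weighted mean of its
    neighbours, weighted by spatial distance AND range difference.
    sigma_r is fixed here; an adaptive version would derive it from
    the sensor's noise characteristics."""
    out = np.empty_like(signal)
    for i in range(len(signal)):
        lo, hi = max(0, i - radius), min(len(signal), i + radius + 1)
        idx = np.arange(lo, hi)
        w = (np.exp(-((idx - i) ** 2) / (2 * sigma_s**2))
             * np.exp(-((signal[idx] - signal[i]) ** 2) / (2 * sigma_r**2)))
        out[i] = np.sum(w * signal[idx]) / np.sum(w)
    return out

# Noisy step edge: noise is smoothed while the edge stays sharp.
rng = np.random.default_rng(0)
step = np.concatenate([np.zeros(50), 10 * np.ones(50)]) + rng.normal(0, 0.3, 100)
smooth = bilateral_1d(step)
print(abs(smooth[52] - 10) < 1.0)  # True: the edge is preserved
```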


Proceedings of the 6th International Symposium on Image and Signal Processing and Analysis | 2009

Clinical evaluation of Endorientation: Gravity related rectification for endoscopic images

Kurt Höller; Jochen Penne; Joachim Hornegger; Armin Schneider; Sonja Gillen; Hubertus Feussner; Jasper Jahn; Javier Gutierrez; Thomas Wittenberg

Providing a stable horizon on endoscopic images, especially in non-rigid endoscopic surgery (particularly NOTES), is still an open issue. Image rectification can be realized with a tiny MEMS tri-axial inertial sensor placed on the tip of an endoscope. By measuring the impact of gravity on each of the three orthogonal axes, the rotation angle can be estimated from these three acceleration values. The achievable repetition rate for angle determination has to be above the usual endoscopic video frame rate of 25-30 Hz. The accelerometer frame rate can be set up to 400 Hz. The error has to be less than one degree even within periods of high movement and superposed acceleration; therefore an intelligent downsampling algorithm has to be found. The image rotation is performed by digitally rotating a capture of the endoscopic analog video signal. Improvements and benefits were assessed in a clinical evaluation: for different peritoneoscopic tasks, completion time was measured and the instrument position was tracked and recorded.
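The angle estimation from gravity can be sketched as below: the roll of the image around the viewing axis follows from the gravity components on the two sensor axes lying in the image plane, and the 400 Hz accelerometer stream is reduced to video rate. The axis convention and the naive block-average downsampling are assumptions; the paper calls for a smarter scheme.

```python
import math

def roll_angle_deg(ax, ay):
    """Image roll around the endoscope's viewing axis, from the gravity
    components on the two in-plane sensor axes. The axis convention
    here is an assumption; the actual sensor mounting may differ."""
    return math.degrees(math.atan2(ax, ay))

def downsample(samples, n=8):
    """Naive downsampling of high-rate accelerometer samples (e.g. 400 Hz)
    to video rate by averaging blocks of n samples; a simple stand-in
    for the 'intelligent downsampling' the paper calls for."""
    return [sum(samples[i:i + n]) / len(samples[i:i + n])
            for i in range(0, len(samples), n)]

# Gravity entirely on the y axis -> no roll; equal split -> 45 degrees.
print(round(roll_angle_deg(0.0, 9.81), 1))   # 0.0
print(round(roll_angle_deg(9.81, 9.81), 1))  # 45.0
print(downsample(list(range(16))))           # [3.5, 11.5]
```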


Computer Vision and Pattern Recognition | 2009

Robust real-time 3D modeling of static scenes using solely a Time-of-Flight sensor

Johannes Feulner; Jochen Penne; Eva N. K. Kollorz; Joachim Hornegger

An algorithm is proposed for the 3D modeling of static scenes solely based on the range and intensity data acquired by a time-of-flight camera during an arbitrary movement. No additional scene acquisition devices, such as inertial sensors, positioning robots or intensity-based cameras, are incorporated. The current pose is estimated by maximizing the uncentered correlation coefficient between edges detected in the current and a preceding frame, at a minimum frame rate of four fps and an average accuracy of 45 mm. The paper also describes several extensions for robust registration, like multiresolution hierarchies and a projection-based Iterative Closest Point algorithm. The basic registration algorithm and its extensions were intensively evaluated against ground-truth data to validate their accuracy, robustness and real-time capability.
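The registration score being maximized, the uncentered correlation coefficient between two edge maps, can be sketched directly; edge detection itself (e.g. a gradient threshold) is omitted here.

```python
import numpy as np

def uncentered_correlation(a, b):
    """Uncentered correlation coefficient between two edge maps:
    <a, b> / (||a|| * ||b||). This is the score maximized over
    candidate poses in the registration described above."""
    a, b = a.ravel().astype(float), b.ravel().astype(float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

edges = np.zeros((8, 8))
edges[:, 3] = 1.0                     # a vertical edge
shifted = np.roll(edges, 1, axis=1)   # the same edge, shifted one pixel
print(uncentered_correlation(edges, edges))    # 1.0: perfect alignment
print(uncentered_correlation(edges, shifted))  # 0.0: no edge overlap
```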

Collaboration


Dive into Jochen Penne's collaborations.

Top Co-Authors

Joachim Hornegger (University of Erlangen-Nuremberg)
Christian Schaller (University of Erlangen-Nuremberg)
Kurt Höller (University of Erlangen-Nuremberg)
Alexander Seitel (German Cancer Research Center)
Michael Stürmer (University of Erlangen-Nuremberg)
Rainer Engelbrecht (University of Erlangen-Nuremberg)
Stefan Soutschek (University of Erlangen-Nuremberg)
Ralf Tetzlaff (German Cancer Research Center)