Publication


Featured research published by Frank Lenzen.


Time-of-Flight and Depth Imaging | 2013

Technical Foundation and Calibration Methods for Time-of-Flight Cameras

Damien Lefloch; Rahul Nair; Frank Lenzen; Henrik Schäfer; Lee V. Streeter; Michael J. Cree; Reinhard Koch; Andreas Kolb

Current Time-of-Flight approaches mainly employ a continuous-wave intensity modulation approach. Phase reconstruction is performed using multiple phase images with different phase shifts, which is equivalent to sampling the inherent correlation function at different locations. This active imaging approach introduces a very specific set of influences, on the signal-processing side as well as on the optical side, all of which affect the resulting depth quality. Applying ToF information in real applications therefore requires tackling these effects with specific calibration approaches. This survey gives an overview of the current state of the art in ToF sensor calibration.
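For orientation, the standard four-phase ("four-bucket") reconstruction can be sketched in a few lines; the sign convention of the arctangent and the modulation frequency f_mod below are illustrative assumptions, not taken from the chapter.

```python
import numpy as np

C = 299_792_458.0  # speed of light [m/s]

def cw_tof_depth(a0, a1, a2, a3, f_mod):
    """Depth from four correlation samples at phase offsets 0, 90, 180,
    270 degrees (one common CW-ToF convention; others swap signs)."""
    phase = np.arctan2(a3 - a1, a0 - a2)      # wrapped phase in (-pi, pi]
    phase = np.mod(phase, 2.0 * np.pi)        # map to [0, 2*pi)
    return C * phase / (4.0 * np.pi * f_mod)  # half the round-trip distance

# A phase of pi at 20 MHz corresponds to half the ambiguity range c/(2*f_mod):
print(cw_tof_depth(np.array([0.]), np.array([0.]),
                   np.array([1.]), np.array([0.]), 20e6))  # ~3.75 m
```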


Computational Optimization and Applications | 2013

A class of quasi-variational inequalities for adaptive image denoising and decomposition

Frank Lenzen; Florian Becker; Jan Lellmann; Stefania Petra; Christoph Schnörr

We introduce a class of adaptive non-smooth convex variational problems for image denoising in terms of a common data-fitting term and a support functional as regularizer. Adaptivity is modeled by a set-valued mapping with closed, compact and convex values that defines and steers the regularizer depending on the variational solution. This extension gives rise to a class of quasi-variational inequalities. We provide sufficient conditions for the existence of fixed points as solutions, and an algorithm based on solving a sequence of variational problems. Denoising experiments with spatial and spatio-temporal image data and an adaptive total variation regularizer illustrate our approach.
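In symbols (notation assumed here for illustration, not verbatim from the paper), the fixed-point structure and the accompanying algorithm look roughly like this:

```latex
% f: noisy data, D: discrete gradient, C(.): the set-valued mapping with
% closed, compact, convex values; sigma_C is its support functional.
\begin{align*}
  u^{*} &\in \operatorname*{arg\,min}_{u}\;
     \tfrac{1}{2}\|u - f\|_2^2 + \sigma_{\mathcal{C}(u^{*})}(Du),
  \qquad
  \sigma_{\mathcal{C}(v)}(w) := \sup_{p \in \mathcal{C}(v)} \langle p, w\rangle,\\[2pt]
  u^{k+1} &\in \operatorname*{arg\,min}_{u}\;
     \tfrac{1}{2}\|u - f\|_2^2 + \sigma_{\mathcal{C}(u^{k})}(Du)
  \qquad \text{(sequence of variational problems).}
\end{align*}
```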


Astronomy and Astrophysics | 2004

Automatic detection of arcs and arclets formed by gravitational lensing

Frank Lenzen; Sabine Schindler; Otmar Scherzer

We present an algorithm developed specifically to detect gravitationally lensed arcs in clusters of galaxies. The algorithm is suited for automated surveys as well as individual arc detections. New methods are used for image smoothing and source detection. The smoothing is performed by so-called anisotropic diffusion, which maintains the shape of the arcs and does not disperse them. The algorithm is much more efficient at detecting arcs than other source-finding algorithms, and than detection by eye.
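As a rough illustration of the smoothing step, here is a minimal Perona-Malik-type nonlinear diffusion in numpy; the paper's tensor-driven anisotropic scheme is more elaborate, and the parameters kappa, dt and the periodic boundary handling below are simplifications for the sketch.

```python
import numpy as np

def nonlinear_diffusion(img, n_iter=50, kappa=0.1, dt=0.2):
    """Perona-Malik-type diffusion: smooths flat regions while damping
    diffusion across strong gradients, so elongated features such as
    arcs are not dispersed. Periodic boundaries for brevity."""
    u = img.astype(float).copy()
    g = lambda d: np.exp(-(d / kappa) ** 2)  # edge-stopping function
    for _ in range(n_iter):
        dn = np.roll(u, -1, axis=0) - u
        ds = np.roll(u,  1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u,  1, axis=1) - u
        u += dt * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
    return u
```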


Time-of-Flight and Depth Imaging | 2013

A Survey on Time-of-Flight Stereo Fusion

Rahul Nair; Kai Ruhl; Frank Lenzen; Stephan Meister; Henrik Schäfer; Christoph S. Garbe; Martin Eisemann; Marcus A. Magnor; Daniel Kondermann

Due to the demand for depth maps of higher quality than is possible with any single depth-imaging technique today, there has been increasing interest in combining different depth sensors to produce a “super-camera” that is more than the sum of its parts. In this survey paper, we give an overview of methods for the fusion of Time-of-Flight (ToF) and passive stereo data, as well as applications of the resulting high-quality depth maps. Additionally, we provide a tutorial-based introduction to the principles behind ToF stereo fusion and the evaluation criteria used to benchmark these methods.
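At its simplest, the data-term side of such a fusion can be pictured as a per-pixel confidence-weighted average; the surveyed methods add spatial regularization and more sophisticated fidelity measures on top. A toy sketch, in which all names and weights are illustrative assumptions:

```python
import numpy as np

def fuse_depth(d_tof, c_tof, d_stereo, c_stereo, eps=1e-8):
    """Per-pixel fusion of two depth hypotheses, weighted by per-pixel
    confidences (e.g. ToF amplitude, stereo matching score)."""
    return (c_tof * d_tof + c_stereo * d_stereo) / (c_tof + c_stereo + eps)
```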


international conference on scale space and variational methods in computer vision | 2013

Adaptive Second-Order Total Variation: An Approach Aware of Slope Discontinuities

Frank Lenzen; Florian Becker; Jan Lellmann

Total variation (TV) regularization, originally introduced by Rudin, Osher and Fatemi in the context of image denoising, has become widely used in the field of inverse problems. Two major directions of modification of the original approach have since been proposed. The first concerns adaptive variants of TV regularization; the second focuses on higher-order TV models. In the present paper, we combine the ideas of both directions by proposing adaptive second-order TV models, including one anisotropic model. Experiments demonstrate that introducing adaptivity reduces the reconstruction error.
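As a rough sketch of where adaptivity enters (notation assumed here, not taken from the paper), an adaptive second-order TV model weights the Hessian penalty pointwise:

```latex
% f: the data, u: the reconstruction, \nabla^2 u: the Hessian,
% a(x): an adaptive, possibly anisotropic, spatially varying weight.
\[
  \min_{u}\; \tfrac{1}{2}\,\|u - f\|_2^2
  \;+\; \int_\Omega a(x)\,\bigl|\nabla^2 u(x)\bigr| \,dx .
\]
```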


international conference on computer vision | 2012

High accuracy TOF and stereo sensor fusion at interactive rates

Rahul Nair; Frank Lenzen; Stephan Meister; Henrik Schäfer; Christoph S. Garbe; Daniel Kondermann

We propose two new GPU-based sensor fusion approaches for time-of-flight (ToF) and stereo depth data. Data fidelity measures are defined to deal with the fundamental limitations of each technique on its own. Our algorithms combine ToF and stereo, yielding megapixel depth maps and enabling our approach to be used in a movie-production scenario. Our local model works at interactive rates but yields noisier results, whereas our variational technique is more robust at a higher computational cost. The results show an improvement over each individual method, with ToF interreflection remaining an open challenge. To encourage quantitative evaluations, a ground-truth dataset is made publicly available.
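A generic form of such a variational fusion energy (notation assumed for illustration; the paper's fidelity measures are more specific) is:

```latex
% u: fused depth map, d_T/d_S: ToF and stereo depth hypotheses,
% w_T/w_S: per-pixel data-fidelity weights, lambda: regularization weight.
\[
  E(u) \;=\; \int_\Omega w_T(x)\,\bigl(u - d_T\bigr)^2
        \;+\; w_S(x)\,\bigl(u - d_S\bigr)^2
        \;+\; \lambda\,\lvert \nabla u \rvert \; dx .
\]
```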


international symposium on visual computing | 2011

Denoising time-of-flight data with adaptive total variation

Frank Lenzen; Henrik Schäfer; Christoph S. Garbe

For denoising depth maps from time-of-flight (ToF) cameras we propose an adaptive total variation based approach of first and second order. This approach allows us to take into account the geometric properties of the depth data, such as edges and slopes. To steer adaptivity we utilize a special kind of structure tensor based on both the amplitude and phase of the recorded ToF signal. A comparison to state-of-the-art denoising methods shows the advantages of our approach.
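The classical structure tensor underlying such adaptivity is easy to sketch; the paper builds a related tensor from both the amplitude and phase channels, whereas this single-channel version (with an assumed smoothing scale sigma) only illustrates the construction.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def structure_tensor(img, sigma=1.5):
    """Smoothed outer product of the image gradient; the eigenvectors and
    eigenvalues of the per-pixel 2x2 tensor encode local edge orientation
    and strength."""
    gy, gx = np.gradient(img.astype(float))
    jxx = gaussian_filter(gx * gx, sigma)
    jxy = gaussian_filter(gx * gy, sigma)
    jyy = gaussian_filter(gy * gy, sigma)
    return jxx, jxy, jyy  # tensor [[jxx, jxy], [jxy, jyy]] at every pixel
```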


Journal of Mathematical Imaging and Vision | 2013

Optimality Bounds for a Variational Relaxation of the Image Partitioning Problem

Jan Lellmann; Frank Lenzen; Christoph Schnörr

We consider a variational convex relaxation of a class of optimal partitioning and multiclass labeling problems, which has recently proven quite successful and can be seen as a continuous analogue of Linear Programming (LP) relaxation methods for finite-dimensional problems. While for the latter several optimality bounds are known, to our knowledge no such bounds exist in the infinite-dimensional setting. We provide such a bound by analyzing a probabilistic rounding method, showing that it is possible to obtain an integral solution of the original partitioning problem from a solution of the relaxed problem with an a priori upper bound on the objective. The approach has a natural interpretation as an approximate, multiclass variant of the celebrated coarea formula.
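Schematically (notation assumed here, not quoted from the paper), the relaxation replaces hard label assignments by simplex-valued functions:

```latex
% Delta_K: probability simplex over K labels, s(x): pointwise label costs,
% J: convex total-variation-type regularizer; the original problem restricts
% u(x) to the simplex vertices e_1, ..., e_K.
\[
  \min_{u \,:\, \Omega \to \Delta_K} \int_\Omega \langle u(x), s(x) \rangle \, dx \;+\; J(u),
  \qquad
  \text{original: } u(x) \in \{e_1, \dots, e_K\}.
\]
```

The rounding step then maps a relaxed solution back to a vertex-valued labeling, and the paper's contribution is an a priori bound on the resulting increase in the objective.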


international conference on 3d vision | 2013

Depth and Intensity Based Edge Detection in Time-of-Flight Images

Henrik Schäfer; Frank Lenzen; Christoph S. Garbe

A new approach for edge detection in Time-of-Flight (ToF) depth images is presented. Accurate edge detection can facilitate many image-processing tasks, especially for depth images, yet hardly any methods for ToF data exist. The proposed algorithm yields highly accurate results by combining edge information from both the intensity and the depth image acquired by the imager. The applicability and advantage of the new approach are demonstrated on several recorded scenes, and through ToF denoising with adaptive total variation as an application. Results improve considerably compared to another state-of-the-art edge-detection algorithm adapted to ToF depth images.
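A toy version of the combination idea (the thresholds and the gradient operator are illustrative assumptions; the paper's detector is considerably more involved):

```python
import numpy as np

def combined_edges(depth, intensity, t_depth, t_int):
    """Flag a pixel as an edge if either the depth or the intensity
    gradient magnitude exceeds its threshold."""
    gd = np.hypot(*np.gradient(depth.astype(float)))
    gi = np.hypot(*np.gradient(intensity.astype(float)))
    return (gd > t_depth) | (gi > t_int)
```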


Time-of-Flight and Depth Imaging | 2013

Denoising Strategies for Time-of-Flight Data

Frank Lenzen; Kwang In Kim; Henrik Schäfer; Rahul Nair; Stephan Meister; Florian Becker; Christoph S. Garbe; Christian Theobalt

When denoising ToF data, two issues arise concerning the optimal strategy. The first is the choice of an appropriate denoising method and its adaptation to ToF data; the second is the optimal positioning of the denoising step within the processing pipeline between the acquisition of raw sensor data and the final output of the depth map. Concerning the first issue, several denoising approaches specifically for ToF data have been proposed in the literature, and one contribution of this chapter is to provide an overview. To tackle the second issue, we focus on two state-of-the-art methods, bilateral filtering and total variation (TV) denoising, and discuss several positions in the pipeline where these methods can be applied. In our experiments, we compare and evaluate each combination of method and position both qualitatively and quantitatively. It turns out that for TV denoising the optimal position is at the very end of the pipeline. For the bilateral filter, a quantitative comparison shows that applying it to the raw data, together with a subsequent median filtering, yields a low error to ground truth; qualitatively, it competes with applying the (cross-)bilateral filter to the depth data. In general, the optimal position depends on the considered method; consequently, for any newly introduced denoising technique, finding its optimal position within the pipeline remains an open issue.
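The two pipeline positions can be contrasted in a short sketch; the reconstruction step reuses the four-bucket formula from the calibration chapter above, and the median filter stands in for the chapter's bilateral-plus-median strategy (an illustrative simplification, not the evaluated implementation).

```python
import numpy as np
from scipy.ndimage import median_filter

C = 299_792_458.0  # speed of light [m/s]

def depth_from_raw(raw, f_mod):
    """Four-bucket phase reconstruction from raw correlation samples."""
    a0, a1, a2, a3 = raw
    phase = np.mod(np.arctan2(a3 - a1, a0 - a2), 2.0 * np.pi)
    return C * phase / (4.0 * np.pi * f_mod)

def pipeline_early(raw, f_mod):
    """Denoise the raw channels first, then reconstruct depth (the
    position found favourable for the bilateral/median strategy)."""
    raw = [median_filter(a, size=3) for a in raw]
    return depth_from_raw(raw, f_mod)

def pipeline_late(raw, f_mod, denoise):
    """Reconstruct first, then denoise the final depth map (the position
    found optimal for TV denoising); `denoise` is any depth-map denoiser."""
    return denoise(depth_from_raw(raw, f_mod))
```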
