Publications


Featured research published by Andreas Breitbarth.


Optical Measurement Systems for Industrial Inspection VIII | 2013

High-speed 3D shape measurement using array projection

Stefan Heist; Marcel Sieler; Andreas Breitbarth; Peter Kühmstedt; Gunther Notni

Measuring the three-dimensional (3D) surface shape of objects in real time has become an important task, e.g., in industrial quality management or the medical sciences. Stereo vision-based arrangements in combination with pattern projection offer high data acquisition speed and low computation time. However, these coded-light techniques are limited by the projection speed, which is conventionally in the range of 200 to 250 Hz. In this contribution, we present the concepts and a realized setup of a so-called 3D array projector. It is ultra-slim but nonetheless able to project fixed patterns with high brightness and depth of focus. Furthermore, frame rates up to the 100 kHz range are achievable without any need for mechanically moving parts, since the projection speed is limited mainly by the switching frequency of the LEDs used. Depending on the measurement requirements, the type and structure of the patterns can be chosen almost freely: linear or sinusoidal fringes, binary codes such as the Gray code, square, hexagonal or random patterns, and many more. First investigations of the functionality of such a 3D array projector were conducted using a prototype with a combination of Gray codes and phase-shifted sinusoidal fringes. Our contribution demonstrates the high brightness of the proposed projector, its sharpness, and the good Michelson contrast of the fringe patterns. We also examine the patterns' homogeneity and the accuracy of the phase shift between the sinusoidal patterns. Furthermore, we present first measurement results and outline future research, which will, among other things, address the use of other structured-light techniques with the help of new purpose-built 3D array projector prototypes.
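
The combination of Gray codes and phase-shifted sinusoidal fringes mentioned above follows the standard structured-light decoding scheme: a wrapped phase is computed from the phase-shifted fringe images, and the Gray code sequence resolves the fringe order. The following is only a minimal sketch of that general scheme, assuming a 4-step phase shift and thresholded Gray code bit images; function and variable names are illustrative and not taken from the paper.

```python
import numpy as np

def wrapped_phase(i0, i1, i2, i3):
    """Wrapped phase from four sinusoidal fringe images shifted by 90 degrees each."""
    # Standard 4-step phase-shift formula; result lies in (-pi, pi].
    return np.arctan2(i3 - i1, i0 - i2)

def gray_to_order(gray_bits):
    """Convert a stack of Gray-code bit images (MSB first, boolean) to the fringe order."""
    binary = gray_bits[0].copy()
    order = binary.astype(np.int32)
    for bits in gray_bits[1:]:
        binary = np.logical_xor(binary, bits)      # b_i = b_{i-1} XOR g_i
        order = (order << 1) | binary.astype(np.int32)
    return order

def absolute_phase(fringe_images, gray_bits):
    """Unwrap the phase using the fringe order decoded from the Gray code."""
    phi = wrapped_phase(*fringe_images)            # wrapped phase per pixel
    k = gray_to_order(gray_bits)                   # integer fringe order per pixel
    return phi + 2.0 * np.pi * k                   # absolute (unwrapped) phase
```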


Optical Measurement Systems for Industrial Inspection VII | 2011

Fringe Projection Based High Speed 3D Sensor for Real-Time Measurements

Christian Bräuer-Burchardt; Andreas Breitbarth; Peter Kühmstedt; Ingo Schmidt; Matthias Heinze; Gunther Notni

A sensor based on the fringe projection technique was developed that allows ultrafast measurement of the surfaces of flat objects, achieving a data acquisition rate of up to 8.9 million 3D points per second. The high measuring velocity was achieved by consistent fringe code reduction and parallel data processing. The fringe sequence length was reduced using geometric constraints of the sensor arrangement, including epipolar geometry. A further reduction of the image sequence length was obtained by omitting the Gray code sequence, exploiting the geometric constraints of the measurement objects. The sensor may be used, e.g., for the inspection of printed circuit boards.
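
The epipolar constraint that enables this code reduction is the standard stereo relation: a point in one camera can only correspond to points on a single line in the other camera, so fewer code patterns are needed to disambiguate candidates. A hedged sketch of that constraint, assuming the fundamental matrix F is known from calibration; names are illustrative, not from the paper:

```python
import numpy as np

def epipolar_line(F, x_left):
    """Epipolar line l = F @ x in the right image for a pixel x in the left image."""
    l = F @ np.append(x_left, 1.0)
    return l / np.linalg.norm(l[:2])      # normalize so the point-line distance is in pixels

def epipolar_candidates(F, x_left, points_right, tol=0.5):
    """Keep only right-image candidates within `tol` pixels of the epipolar line of x_left."""
    l = epipolar_line(F, x_left)
    pts_h = np.hstack([points_right, np.ones((len(points_right), 1))])
    dist = np.abs(pts_h @ l)
    return points_right[dist < tol]
```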


Optical Engineering | 2014

High-speed three-dimensional measurements with a fringe projection-based optical sensor

Christian Bräuer-Burchardt; Andreas Breitbarth; Peter Kühmstedt; Gunther Notni

An optical three-dimensional (3-D) sensor based on a fringe projection technique that acquires the surface geometry of small objects was developed for highly resolved and ultrafast measurements. It achieves a data acquisition rate of up to 60 high-resolution 3-D datasets per second. The high measurement velocity was achieved by consistent fringe code reduction and parallel data processing. The reduction of the length of the fringe image sequence was obtained by omitting the Gray code sequence, using the geometric restrictions of the measurement objects and the geometric constraints of the sensor arrangement. The sensor covers three different measurement fields between 20 mm × 20 mm and 40 mm × 40 mm with a spatial resolution between 10 and 20 μm, respectively. In order to obtain a robust and fast recalibration of the sensor after a change of the measurement field, a calibration procedure based on single-shot analysis of a special test object was applied, which requires little effort and time. The sensor may be used, e.g., for quality inspection of printed circuit boards or plugs in real-time industrial applications.
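
Omitting the Gray code sequence is possible when the measurement volume is shallow enough that only one fringe order is geometrically plausible per pixel. The sketch below only illustrates that general idea under the assumption that a permissible absolute-phase interval [phase_min, phase_max], narrower than one fringe period, has been derived per pixel from the calibrated measurement volume; all names are illustrative.

```python
import numpy as np

def unwrap_with_depth_bound(phi_wrapped, phase_min, phase_max):
    """Select the unique fringe order k such that phi_wrapped + 2*pi*k lies in the
    absolute-phase interval [phase_min, phase_max] allowed by the measurement volume.

    Valid only if the interval is narrower than one fringe period (2*pi)."""
    k = np.round((0.5 * (phase_min + phase_max) - phi_wrapped) / (2.0 * np.pi))
    phi_abs = phi_wrapped + 2.0 * np.pi * k
    valid = (phi_abs >= phase_min) & (phi_abs <= phase_max)
    return np.where(valid, phi_abs, np.nan)   # NaN marks pixels outside the allowed volume
```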


British Machine Vision Conference | 2010

Multi-View Planning for Simultaneous Coverage and Accuracy Optimisation

Christoph Munkelt; Andreas Breitbarth; Gunther Notni; Joachim Denzler

View planning for three-dimensional (3D) reconstruction and inspection solves the problem of finding an efficient sequence of views that allows complete and high-quality reconstruction of complex objects. To fulfil this task, view planning methods need to deal with sensor limitations and satisfy predefined goals. Our objective is to jointly evaluate accuracy requirements and coverage during planning, to optimise the reconstruction procedure, and to explicitly take configuration space constraints into account. We present a novel model-based approach which simultaneously optimises accuracy and coverage based on an existing model (e.g. a CAD model or a preview scan). For accuracy optimisation we extend the statistical E-criterion to model directional uncertainty. Coverage is maximised while taking configuration space and sensor characteristics into account. We validate our approach through experimental evaluation using the next-best-view (NBV) benchmark framework and a robot-mounted stereo fringe projection sensor.
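
The classical E-criterion scores a candidate view by the extreme eigenvalue of a point's covariance (worst-case variance); evaluating uncertainty along a specific direction instead amounts to the quadratic form d^T Σ d. The sketch below only illustrates these two quantities under that reading; the view-scoring function and all names are assumptions, not the paper's exact formulation.

```python
import numpy as np

def e_criterion(cov):
    """Classical E-criterion: the largest eigenvalue (worst-case variance) of a 3x3 covariance."""
    return float(np.max(np.linalg.eigvalsh(cov)))

def directional_uncertainty(cov, direction):
    """Variance of the point estimate along a unit direction d, i.e. d^T Sigma d."""
    d = np.asarray(direction, dtype=float)
    d /= np.linalg.norm(d)
    return float(d @ cov @ d)

def score_view(point_covariances, surface_normals):
    """Toy view score: mean uncertainty along the surface normals of the covered points."""
    return float(np.mean([directional_uncertainty(c, n)
                          for c, n in zip(point_covariances, surface_normals)]))
```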


Proceedings of SPIE | 2015

Phase unwrapping of fringe images for dynamic 3D measurements without additional pattern projection

Andreas Breitbarth; Eric Müller; Peter Kühmstedt; Gunther Notni; Joachim Denzler

Fringe projection is an established method for the contactless measurement of 3D object structure. However, the coding used in fringe projection is ambiguous. To determine object points with an absolute position in 3D space, this coding has to be unique. We propose a novel approach to phase unwrapping that does not use additional pattern projection. Based on a stereo camera setup, each view is segmented into areas without height jumps larger than one fringe period. Within these segments, phase unwrapping can be performed without error. The alignment of the phase maps between the two views is achieved by identifying a single correspondence point.
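
Within a segment whose height jumps stay below one fringe period, neighbouring pixels differ by less than half a period in true phase, so each pixel's fringe order can be propagated from an already-unwrapped neighbour. The following flood-fill sketch only illustrates this general segment-wise unwrapping idea, assuming a segment mask from the segmentation step; it is not the paper's algorithm and all names are illustrative.

```python
import numpy as np
from collections import deque

def unwrap_segment(phi, mask, seed):
    """Flood-fill phase unwrapping inside one segment.

    phi  : wrapped phase image in (-pi, pi]
    mask : boolean image, True for pixels belonging to the segment
    seed : (row, col) start pixel inside the segment
    """
    unwrapped = np.full(phi.shape, np.nan)
    unwrapped[seed] = phi[seed]
    queue = deque([seed])
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < phi.shape[0] and 0 <= nc < phi.shape[1]
                    and mask[nr, nc] and np.isnan(unwrapped[nr, nc])):
                # Choose the fringe order that keeps the jump to the neighbour below pi.
                diff = phi[nr, nc] - unwrapped[r, c]
                k = np.round(diff / (2.0 * np.pi))
                unwrapped[nr, nc] = phi[nr, nc] - 2.0 * np.pi * k
                queue.append((nr, nc))
    return unwrapped
```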


O3A: Optics for Arts, Architecture, and Archaeology III | 2011

Hand-Held 3D Sensor for Documentation of Fossil and Archaeological Excavations

Peter Kühmstedt; Christian Bräuer-Burchardt; Ingo Schmidt; Matthias Heinze; Andreas Breitbarth; Gunther Notni

A mobile, hand-held, battery-powered sensor based on the fringe projection technique was developed for the preservation of fossil traces and archaeological excavations. It consists of a projector and two cameras and covers a measuring field of about 240 mm x 175 mm x 160 mm. The core time for data acquisition is 0.34 s, and the final result of a 3D point cloud is obtained in less than five seconds. Errors due to movements of the sensor are detected and can be eliminated. The sensor allows the capture of 3D data of the observed surface together with colour information. It was successfully applied to a fossil find of dinosaur traces in rock layers from the Triassic. A 3D reconstruction of part of the excavation was realized, including the determination of the depth of the traces.


International Conference on Imaging Systems and Techniques | 2016

A novel 3D multispectral vision system based on filter wheel cameras

Chen Zhang; Maik Rosenberger; Andreas Breitbarth; Gunther Notni

The combination of multispectral imaging and 3D imaging in one system would enable a variety of applications in different fields. In this work a method is proposed for the registration of multispectral data and a 3D point cloud, in which the number of spectral bands is increased by combining different multispectral images through the 3D information. Based on this method, a 3D multispectral imaging system consisting of two filter wheel cameras and a digital projector was developed. This system can provide up to 23 spectral bands with a short acquisition duration of about 1.5 seconds and achieves a data fusion accuracy of 15 μm for a measurement field of 128 mm × 145 mm. Furthermore, there is significant potential to extend the spectral range and increase the spectral resolution by integrating additional cameras under the proposed data fusion principle.
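
Fusing bands from different filter-wheel positions via the 3D data can be pictured as projecting every reconstructed 3D point into each calibrated camera image and sampling the respective band there, so all bands end up attached to the same point. The sketch below only illustrates this general pinhole-projection idea, assuming intrinsics K and pose (R, t) per camera are known from calibration; names and the nearest-neighbour sampling are assumptions, not the paper's method.

```python
import numpy as np

def project_points(points_3d, K, R, t):
    """Project Nx3 world points into pixel coordinates of a pinhole camera."""
    cam = (R @ points_3d.T + t.reshape(3, 1)).T      # world -> camera coordinates
    uv = (K @ cam.T).T
    return uv[:, :2] / uv[:, 2:3]                     # perspective division

def sample_band(image, uv):
    """Nearest-neighbour sampling of one spectral band at projected pixel positions."""
    u = np.clip(np.round(uv[:, 0]).astype(int), 0, image.shape[1] - 1)
    v = np.clip(np.round(uv[:, 1]).astype(int), 0, image.shape[0] - 1)
    return image[v, u]

def fuse_spectral_bands(points_3d, cameras, band_images):
    """Attach all spectral bands to every 3D point.

    cameras     : list of (K, R, t) tuples, one per filter-wheel camera position
    band_images : list of single-band images aligned with `cameras`
    """
    bands = []
    for (K, R, t), img in zip(cameras, band_images):
        uv = project_points(points_3d, K, R, t)
        bands.append(sample_band(img, uv))
    return np.stack(bands, axis=1)                    # N points x B bands
```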


Optical Measurement Systems for Industrial Inspection X | 2017

Verification of real sensor motion for a high-dynamic 3D measurement inspection system

Andreas Breitbarth; Martin Correns; Manuel Zimmermann; Chen Zhang; Maik Rosenberger; Jörg Schambach; Gunther Notni

Inline three-dimensional measurements are a growing part of optical inspection. Considering increasing production capacities and economic aspects, dynamic measurements under motion are unavoidable. When a sequence of different patterns is used, as is generally done in fringe projection systems, relative movements of the measurement object with respect to the 3D sensor between the images of one pattern sequence have to be compensated. For the application of fully automated optical inspection of circuit boards on an assembly line, knowledge of the relative speed of movement between the measurement object and the 3D sensor system should be used within the motion compensation algorithms. Ideally, this relative speed is constant over the whole measurement process and consists of only one motion direction, in order to avoid sensor vibrations. The quantitative evaluation of these two assumptions and of their error impact on the 3D accuracy is the subject of the research project described in this paper. For our experiments we use a glass etalon with non-transparent circles and transmitted light. Focusing on the circle borders is one of the most reliable methods for determining subpixel positions: positions along a set of search rays are measured, and the intersection point of all rays characterizes the center of each circle. Based on these circle centers, determined with a precision of approximately 1/50 pixel, the motion vector between two images can be calculated and compared with the input motion vector. Overall, the results are used to optimize the weight distribution of the 3D sensor head and reduce non-uniform vibrations. The outcome is a dynamic 3D measurement system with a motion vector error of about 4 μm. Based on this outcome, simulations yield a 3D standard deviation of 6 μm at planar object regions; the same system yields a 3D standard deviation of 9 μm without the optimization of the weight distribution.
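
The motion-vector check described above boils down to locating the etalon's circle centers with subpixel precision in consecutive images and differencing them. The sketch below replaces the paper's ray-intersection localization with a simpler intensity-weighted centroid, purely for illustration; the centroid choice and all names are assumptions.

```python
import numpy as np

def circle_center(image, roi):
    """Subpixel center of a dark circle on a bright background inside a region of interest."""
    r0, r1, c0, c1 = roi
    patch = image[r0:r1, c0:c1].astype(float)
    weights = patch.max() - patch                      # dark circle -> high weight
    rows, cols = np.mgrid[r0:r1, c0:c1]
    total = weights.sum()
    return np.array([(weights * cols).sum() / total,   # x in pixels
                     (weights * rows).sum() / total])  # y in pixels

def motion_vector(img_a, img_b, rois):
    """Mean displacement of all circle centers between two images of the pattern sequence."""
    shifts = [circle_center(img_b, roi) - circle_center(img_a, roi) for roi in rois]
    return np.mean(shifts, axis=0)
```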


Dimensional Optical Metrology and Inspection for Practical Applications VI | 2017

Wavelength dependency of optical 3D measurements at translucent objects using fringe pattern projection

Chen Zhang; Maik Rosenberger; Andreas Breitbarth; Gunther Notni

The requirement for a non-transparent, Lambertian-like surface in optical 3D measurements with fringe pattern projection cannot be satisfied for translucent objects. The translucency causes artifacts and systematic errors in the pattern decoding, which can lead to measurement errors and a decrease in measurement stability. In this work, the influence of the light wavelength on 3D measurements was investigated using a stereoscopic system consisting of two filter wheel cameras with narrowband bandpass filters and a projector with a wide-band light source. The experimental results show a significant wavelength dependency of the systematic measurement deviation and the measurement stability.


Videometrics, Range Imaging, and Applications XII; and Automated Visual Inspection | 2013

Lighting estimation in fringe images during motion compensation for 3D measurements

Andreas Breitbarth; Peter Kühmstedt; Gunther Notni; Joachim Denzler

Fringe projection is an established method for measuring the 3D structure of macroscopic objects. To achieve both high accuracy and robustness, a certain number of images with pairwise different projection patterns is required. Over this sequence it is necessary that each 3D object point corresponds to the same image point at all times. This is no longer the case for measurements under motion. One possibility to solve this problem is to restore the static situation: first, the acquired camera images have to be realigned, and second, the degree of fringe shift has to be estimated. Furthermore, there is another variable: the change in lighting. The compensation of these variations is a difficult task and can only be realized under several assumptions, but the change has to be at least approximately determined and integrated into the 3D reconstruction process. We propose a method to estimate these lighting changes for each camera pixel with respect to its neighbors at each point in time. The algorithms were validated on simulated data, in particular with rotating measurement objects; for translational motion, lighting changes have no severe effect in our applications. Taken together, without using high-speed hardware, our method results in a motion-compensated dense 3D point cloud which is suitable for the three-dimensional measurement of moving objects or for setups with sensor systems in motion.
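
One way to picture a per-pixel lighting estimate of this kind is as a local gain factor between two realigned frames: the ratio of mean intensities in a small neighbourhood around each pixel, used to normalise the fringe images before phase computation. This is only a hedged illustration of the general idea, not the paper's algorithm; all names are invented.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def local_gain(frame_prev, frame_curr, radius=3, eps=1e-6):
    """Per-pixel lighting change between two realigned frames, estimated as the ratio
    of mean intensities in a (2*radius+1)^2 neighbourhood around each pixel."""
    size = 2 * radius + 1
    mean_prev = uniform_filter(frame_prev.astype(float), size)
    mean_curr = uniform_filter(frame_curr.astype(float), size)
    return mean_curr / (mean_prev + eps)

def compensate_lighting(frame_curr, gain, eps=1e-6):
    """Bring the current frame back to the lighting level of the previous one."""
    return frame_curr / (gain + eps)
```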

Collaboration


Dive into Andreas Breitbarth's collaborations.

Top Co-Authors


Maik Rosenberger

Technische Universität Ilmenau
