Matthias O. Franz
University of Konstanz
Publications
Featured research published by Matthias O. Franz.
Classical and Quantum Gravity | 2014
Deborah Aguilera; Holger Ahlers; Baptiste Battelier; Ahmad Bawamia; Andrea Bertoldi; R. Bondarescu; K. Bongs; Philippe Bouyer; Claus Braxmaier; L. Cacciapuoti; C. P. Chaloner; M. Chwalla; W. Ertmer; Matthias O. Franz; Naceur Gaaloul; M. Gehler; D. Gerardi; L. Gesa; Norman Gürlebeck; Jonas Hartwig; Matthias Hauth; Ortwin Hellmig; Waldemar Herr; Sven Herrmann; Astrid Heske; Andrew Hinton; P. Ireland; Philippe Jetzer; Ulrich Johann; Markus Krutzik
The theory of general relativity describes macroscopic phenomena driven by the influence of gravity, while quantum mechanics brilliantly accounts for microscopic effects. Despite their tremendous individual success, a complete unification of fundamental interactions is missing and remains one of the most challenging and important quests in modern theoretical physics. The spacetime explorer and quantum equivalence principle space test satellite mission, proposed as a medium-size mission within the Cosmic Vision program of the European Space Agency (ESA), aims to test general relativity with high precision in two experiments: by measuring the gravitational redshift of the Sun and the Moon through comparison of terrestrial clocks, and by testing the universality of free fall of matter waves in the gravitational field of Earth by comparing the trajectories of two Bose–Einstein condensates of ⁸⁵Rb and ⁸⁷Rb. The two ultracold atom clouds are monitored very precisely using techniques of atom interferometry. This allows an uncertainty in the Eötvös parameter of at least 2 × 10⁻¹⁵ to be reached. In this paper, we report on the results of the phase A mission study of the atom interferometer instrument, covering the description of the main payload elements, the atomic source concept, and the systematic error sources.
Experimental Astronomy | 2015
Thilo Schuldt; Christian Schubert; Markus Krutzik; Lluis Gesa Bote; Naceur Gaaloul; Jonas Hartwig; Holger Ahlers; Waldemar Herr; Katerine Posso-Trujillo; Jan Rudolph; Stephan Seidel; Thijs Wendrich; W. Ertmer; Sven Herrmann; André Kubelka-Lange; Alexander Milke; Benny Rievers; E. Rocco; Andrew Hinton; K. Bongs; Markus Oswald; Matthias O. Franz; Matthias Hauth; Achim Peters; Ahmad Bawamia; Andreas Wicht; Baptiste Battelier; Andrea Bertoldi; Philippe Bouyer; Arnaud Landragin
Atom interferometers have a multitude of proposed applications in space, including precise measurements of the Earth’s gravitational field, navigation and ranging, and fundamental physics such as tests of the weak equivalence principle (WEP) and gravitational wave detection. While atom interferometers are realized routinely in ground-based laboratories, current efforts aim at the development of a space-compatible design optimized with respect to dimensions, weight, power consumption, mechanical robustness and radiation hardness. In this paper, we present a design of a high-sensitivity differential dual-species ⁸⁵Rb/⁸⁷Rb atom interferometer for space, including physics package, laser system, electronics and software. The physics package comprises the atom source consisting of dispensers and a 2D magneto-optical trap (MOT), the science chamber with a 3D-MOT, a magnetic trap based on an atom chip and an optical dipole trap (ODT) used for Bose–Einstein condensate (BEC) creation and interferometry, the detection unit, the vacuum system for 10⁻¹¹ mbar ultra-high vacuum generation, and the high-suppression-factor magnetic shielding as well as the thermal control system. The laser system is based on a hybrid approach using fiber-based telecom components and high-power laser diode technology and includes all laser sources for 2D-MOT, 3D-MOT, ODT, interferometry and detection. Manipulation and switching of the laser beams is carried out on an optical bench using Zerodur bonding technology. The instrument consists of 9 units with an overall mass of 221 kg, an average power consumption of 608 W (814 W peak), and a volume of 470 liters, which would fit well on a satellite to be launched with a Soyuz rocket, as system studies have shown.
international conference on curves and surfaces | 2014
Manuel Caputo; Klaus Denker; Matthias O. Franz; Pascal Laube; Georg Umlauf
Classification of point clouds by different types of geometric primitives is an essential part of the reconstruction process of CAD geometry. We use support vector machines (SVM) to label patches in point clouds with the class labels tori, ellipsoids, spheres, cones, cylinders or planes. For the classification, features based on different geometric properties like point normals, angles, and principal curvatures are used. These geometric features are estimated in the local neighborhood of a point of the point cloud. Computing these geometric features for a random subset of the point cloud yields a feature distribution. Different features are combined to achieve the best classification results. To minimize the time-consuming training phase of SVMs, the geometric features are first evaluated using linear discriminant analysis (LDA).
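A minimal sketch of this kind of pipeline, assuming scikit-learn, is shown below; the feature layout, class set, hyperparameters and synthetic data are illustrative placeholders, not the authors' implementation.

```python
# Minimal sketch (not the authors' code): SVM classification of point-cloud
# patches from per-patch geometric feature vectors, assuming scikit-learn.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
classes = ["plane", "cylinder", "sphere", "cone", "ellipsoid", "torus"]

# One row per patch, e.g. histograms of normal angles and principal curvatures
# estimated in local neighborhoods (here random stand-ins).
X = rng.normal(size=(600, 40))
y = rng.choice(classes, size=600)

# LDA as a cheap first evaluation/reduction of the feature set before the
# more expensive SVM training, as described in the abstract.
clf = make_pipeline(
    StandardScaler(),
    LinearDiscriminantAnalysis(n_components=5),
    SVC(kernel="rbf", C=10.0, gamma="scale"),
)
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))
```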
international conference on machine vision | 2017
Pascal Laube; Matthias O. Franz; Georg Umlauf
In the reverse engineering process one has to classify parts of point clouds with the correct type of geometric primitive. Features based on different geometric properties like point relations, normals, and curvature information can be used to train classifiers like Support Vector Machines (SVM). These geometric features are estimated in the local neighborhood of a point of the point cloud. The multitude of different features makes an in-depth comparison necessary. In this work we evaluate 23 features for the classification of geometric primitives in point clouds. Their performance is evaluated with SVMs used to classify geometric primitives in simulated and real laser-scanned point clouds. We also introduce a normalization of point cloud density to improve classification generalization.
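The abstract does not specify how point-cloud density is normalized; the sketch below illustrates one common approach, voxel-grid subsampling, purely as a hedged stand-in.

```python
# Illustrative sketch only: voxel-grid subsampling as one possible way to
# normalize point-cloud density before feature extraction (the paper's exact
# normalization procedure is not given in this abstract).
import numpy as np

def voxel_subsample(points, voxel_size):
    """Keep one representative point (the centroid) per occupied voxel."""
    keys = np.floor(points / voxel_size).astype(np.int64)
    _, inverse = np.unique(keys, axis=0, return_inverse=True)
    inverse = np.ravel(inverse)
    counts = np.bincount(inverse).astype(float)
    out = np.zeros((inverse.max() + 1, 3))
    for dim in range(3):
        out[:, dim] = np.bincount(inverse, weights=points[:, dim]) / counts
    return out

pts = np.random.rand(10000, 3)              # synthetic cloud for illustration
print(voxel_subsample(pts, 0.05).shape)     # fewer, more evenly spaced points
```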
european frequency and time forum | 2014
Thilo Schuldt; Johannes Stühler; Klaus Döringshoff; J. Pahl; Evgeny V. Kovalchuk; Matthias O. Franz; Ulrich Johann; Achim Peters; Claus Braxmaier
We present the development of compact and ruggedized iodine-based frequency references on elegant breadboard (EBB) and engineering model (EM) level using modulation transfer spectroscopy near 532 nm. A frequency stability of 1·10⁻¹⁴ at an integration time of 1 s and below 5·10⁻¹⁵ at integration times between 10 s and 100 s was achieved. Space applications of such an optical frequency reference can be found in fundamental physics, geoscience, Earth observation, navigation and ranging. One example is the proposed mSTAR (mini SpaceTime Asymmetry Research) mission, dedicated to performing a Kennedy–Thorndike experiment on a satellite in a low-Earth orbit. By comparing an iodine standard to a cavity-based frequency reference and integrating over the 2-year mission lifetime, the Kennedy–Thorndike coefficient will be determined with up to two orders of magnitude higher accuracy than in the current best ground experiment.
Computer Aided Geometric Design | 2018
Pascal Laube; Matthias O. Franz; Georg Umlauf
Knot placement for curve approximation is a well-known and yet open problem in geometric modeling. Selecting knot values that yield good approximations is a challenging task, based largely on heuristics and user experience. More advanced approaches range from parametric averaging to genetic algorithms. In this paper, we propose to use Support Vector Machines (SVMs) to determine suitable knot vectors for B-spline curve approximation. The SVMs are trained to identify locations in a sequential point cloud where knot placement will improve the approximation error. After the training phase, the SVM can assign a so-called score to each point set location. This score is based on geometric and differential-geometric features of points and measures the quality of each location as a knot in the subsequent approximation. From these scores, the final knot vector can be constructed by exploring the topography of the score vector, without the need for iteration or optimization in the approximation process. Knot vectors computed with our approach outperform state-of-the-art methods and yield tighter approximations.
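As a rough illustration of the scoring idea, the sketch below trains an SVM to score locations along a sampled curve and places knots at score maxima; the features, training labels and window sizes are hypothetical and not taken from the paper.

```python
# Hedged sketch of the scoring idea: an SVM assigns each location in a
# sequential point cloud a score, and knots are placed at score maxima.
import numpy as np
from scipy.signal import argrelextrema
from sklearn.svm import SVC

def local_features(points, i, k=5):
    """Toy differential-geometric features around point i (placeholder)."""
    nb = points[max(0, i - k): i + k + 1]
    d = np.diff(nb, axis=0)
    turn = np.abs(np.diff(np.arctan2(d[:, 1], d[:, 0])))   # turning angles
    return [turn.mean(), turn.max(), np.linalg.norm(nb[-1] - nb[0])]

points = np.cumsum(np.random.randn(200, 2), axis=0)         # synthetic curve samples
X = np.array([local_features(points, i) for i in range(len(points))])
y = np.random.randint(0, 2, len(points))                    # placeholder knot labels

svm = SVC(probability=True).fit(X, y)
scores = svm.predict_proba(X)[:, 1]                         # per-location knot score
knot_idx = argrelextrema(scores, np.greater, order=10)[0]   # score maxima -> knots
print("knot locations:", knot_idx)
```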
Optics and Photonics for Information Processing XI | 2017
Matthias O. Franz; Michael Grunwald; Martin Schall; Pascal Laube; Georg Umlauf
Digital cameras are used in a large variety of scientific and industrial applications. For most applications, the acquired data should represent the real light intensity per pixel as accurately as possible. However, digital cameras are subject to physical, electronic and optical effects that lead to errors and noise in the raw image. Temperature-dependent dark current, read noise, optical vignetting or different sensitivities of individual pixels are examples of such effects. The purpose of radiometric calibration is to improve the quality of the resulting images by reducing the influence of the various types of errors on the measured data and thus improving the quality of the overall application. In this context, we present a specialized neural network architecture for radiometric calibration of digital cameras. Neural networks are used to learn a temperature- and exposure-dependent mapping from observed gray-scale values to true light intensities for each pixel. In contrast to classical flat-fielding, neural networks have the potential to model nonlinear mappings, which allows for accurately capturing the temperature dependence of the dark current and for modeling cameras with nonlinear sensitivities. Both scenarios are highly relevant in industrial applications. The experimental comparison of our network approach to classical flat-fielding shows a consistently higher reconstruction quality, also for linear cameras. In addition, the calibration is faster than previous machine learning approaches based on Gaussian processes.
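A minimal sketch of such a learned calibration mapping is given below, assuming scikit-learn's MLPRegressor; the network size, input ranges and synthetic calibration data are assumptions, not the paper's specialized architecture.

```python
# Illustrative sketch (not the paper's architecture): a small regression
# network that maps observed gray value, sensor temperature and exposure
# time to a corrected intensity. All data below are synthetic.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n = 5000
gray = rng.uniform(0, 1, n)
temp = rng.uniform(20, 60, n)           # sensor temperature in °C (assumed range)
expo = rng.uniform(1e-3, 1e-1, n)       # exposure time in s (assumed range)

# Synthetic "true" intensity with a nonlinear, temperature-dependent dark
# current, standing in for calibration frames at known intensities.
dark = 1e-2 * expo * np.exp(0.05 * (temp - 20))
target = (gray - dark) / 0.9

X = np.column_stack([gray, temp, expo])
net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000).fit(X, target)
print("train R^2:", net.score(X, target))
```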
document analysis systems | 2016
Martin Schall; Marc-Peter Schambach; Matthias O. Franz
Offline handwriting recognition systems often include a decoding step, that is, retrieving the most likely character sequence from the underlying machine learning algorithm. Decoding is sensitive to ranges of weakly predicted characters, caused e.g. by obstructions in the scanned document. We present a new algorithm for robust decoding of handwriting recognizer outputs using character n-grams. Multidimensional hierarchical subsampling artificial neural networks with Long Short-Term Memory cells have been successfully applied to offline handwriting recognition. Output activations from such networks, trained with Connectionist Temporal Classification, can be decoded with several different algorithms in order to retrieve the most likely literal string that they represent. We present a new algorithm for decoding the network output while restricting the possible strings to a large lexicon. The index used for this work is an n-gram index; tri-grams are used for the experimental comparisons. N-grams are extracted from the network output using a backtracking algorithm, and each n-gram is assigned a mean probability. The decoding result is obtained by intersecting the n-gram hit lists while calculating the total probability for each matched lexicon entry. We conclude with an experimental comparison of different decoding algorithms on a large lexicon.
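The sketch below illustrates the general idea of lexicon-constrained decoding with a character tri-gram index; it replaces the paper's backtracking n-gram extraction with a simple greedy CTC decode and uses a toy lexicon and random network output.

```python
# Hedged sketch: lexicon-constrained decoding via a character tri-gram index.
import numpy as np
from collections import defaultdict

lexicon = ["street", "strong", "string", "spring"]           # toy lexicon

# Build tri-gram -> lexicon-entry index.
index = defaultdict(set)
for word in lexicon:
    for i in range(len(word) - 2):
        index[word[i:i + 3]].add(word)

def decode(posteriors, alphabet, blank=0):
    """Greedy CTC decode plus per-trigram mean probability (simplified)."""
    best = posteriors.argmax(axis=1)
    chars, probs = [], []
    prev = blank
    for t, c in enumerate(best):
        if c != blank and c != prev:
            chars.append(alphabet[c])
            probs.append(posteriors[t, c])
        prev = c
    text = "".join(chars)
    trigrams = {text[i:i + 3]: np.mean(probs[i:i + 3]) for i in range(len(text) - 2)}
    # Intersect hit lists: sum tri-gram probabilities per matched lexicon entry.
    scores = defaultdict(float)
    for tg, p in trigrams.items():
        for word in index.get(tg, ()):
            scores[word] += p
    return max(scores, key=scores.get) if scores else text

alphabet = ["-"] + list("abcdefghijklmnopqrstuvwxyz")
posteriors = np.random.dirichlet(np.ones(27), size=40)       # fake network output
print(decode(posteriors, alphabet))
```

With real network posteriors, the returned string is the lexicon entry whose tri-gram hits accumulate the highest total probability; with the random input above, the greedy decode is simply returned unmatched.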
Proceedings of SPIE | 2015
Martin Schall; Michael Grunwald; Georg Umlauf; Matthias O. Franz
Digital cameras are subject to physical, electronic and optical effects that result in errors and noise in the image. These effects include, for example, a temperature-dependent dark current, read noise, optical vignetting or different sensitivities of individual pixels. The task of radiometric calibration is to reduce these errors in the image and thus improve the quality of the overall application. In this work we present an algorithm for radiometric calibration based on Gaussian processes, a regression method widely used in machine learning that is particularly useful in our context. Gaussian process regression is used to learn a temperature- and exposure-time-dependent mapping from observed gray-scale values to true light intensities for each pixel. Regression models based on the characteristics of single pixels suffer from excessively high runtime and are thus unsuitable for many practical applications. In contrast, a single regression model for an entire image with high spatial resolution leads to a low-quality radiometric calibration, which also limits its practical use. The proposed algorithm is predicated on a partitioning of the pixels such that each pixel partition can be represented by a single regression model without quality loss. Partitioning is done by extracting features from the characteristic of each pixel and using them for lexicographic sorting. Splitting the sorted data into partitions of equal size yields the final partitions, each of which is represented by its partition center. An individual Gaussian process regression and model selection is done for each partition. Calibration is performed by interpolating the gray-scale value of each pixel with the regression model of the respective partition. The experimental comparison of the proposed approach to classical flat-field calibration shows a consistently higher reconstruction quality for the same overall number of calibration frames.
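A minimal sketch of the partitioning idea follows, assuming scikit-learn's Gaussian process implementation; the per-pixel features, calibration samples and kernel choice are synthetic placeholders rather than the paper's setup.

```python
# Hedged sketch: lexicographically sort pixels by characteristic features,
# split into equal-size partitions, and fit one GP per partition.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
n_pixels, n_parts = 4096, 8

# Per-pixel characteristic features (e.g. dark level, gain), here random.
features = rng.normal(size=(n_pixels, 2))

# Lexicographic sort on the features, then split into equal-size partitions.
order = np.lexsort(features.T[::-1])
partitions = np.array_split(order, n_parts)

models = []
for part in partitions:
    # Calibration samples for this partition: (gray value, temperature,
    # exposure) -> true intensity; synthetic stand-ins here.
    X = rng.uniform(size=(50, 3))
    y = 0.9 * X[:, 0] - 0.05 * X[:, 1] * X[:, 2] + rng.normal(0, 0.01, 50)
    gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel()).fit(X, y)
    models.append(gp)

# Calibrate a pixel by querying the model of its partition.
pixel_to_part = np.empty(n_pixels, dtype=int)
for k, part in enumerate(partitions):
    pixel_to_part[part] = k
print(models[pixel_to_part[123]].predict([[0.5, 0.3, 0.2]]))
```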