Elmar Zander
Braunschweig University of Technology
Publications
Featured research published by Elmar Zander.
Archive | 2012
Mike Espig; Wolfgang Hackbusch; Alexander Litvinenko; Hermann G. Matthies; Elmar Zander
In this article we introduce new methods for the analysis of high-dimensional data in tensor formats, where the underlying data come from a stochastic elliptic boundary value problem. After discretisation of the deterministic operator as well as of the random fields via KLE and PCE, the resulting high-dimensional operator can be approximated by sums of elementary tensors. This tensor representation can be used effectively for computing different quantities of interest, such as the maximum norm, level sets, and the cumulative distribution function. The basic concept of data analysis in high dimensions is discussed for tensors represented in the canonical format; however, the approach can easily be used in other tensor formats. As an intermediate step we describe efficient iterative algorithms for computing the characteristic and sign functions as well as the pointwise inverse in the canonical tensor format. Since the representation rank grows during most algebraic operations as well as during the iteration steps, we use low-rank approximation and inexact recursive iteration schemes.
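As a rough illustration of the canonical format referred to above (all names and shapes here are assumptions for the sketch, not the paper's implementation), the following Python fragment stores a tensor as a sum of elementary rank-one terms and shows how a pointwise (Hadamard) product multiplies the representation rank, which is what motivates the low-rank re-approximation mentioned in the abstract.

import numpy as np

class CanonicalTensor:
    """Sum of R elementary (rank-one) tensors: T = sum_r a1[:,r] x a2[:,r] x ... x ad[:,r]."""
    def __init__(self, factors):
        self.factors = factors             # list of d matrices, each of shape (n_k, R)
        self.rank = factors[0].shape[1]

    def entry(self, idx):
        """Evaluate a single entry T[i1,...,id] without forming the full tensor."""
        prod = np.ones(self.rank)
        for A, i in zip(self.factors, idx):
            prod *= A[i, :]
        return prod.sum()

    def hadamard(self, other):
        """Pointwise product; the representation rank grows to R1 * R2."""
        new_factors = []
        for A, B in zip(self.factors, other.factors):
            # column-wise (Khatri-Rao style) products of the factor matrices
            new_factors.append(np.einsum('ir,is->irs', A, B).reshape(A.shape[0], -1))
        return CanonicalTensor(new_factors)

# usage: two random rank-2 tensors of shape (4, 5, 6)
shape, R = (4, 5, 6), 2
T1 = CanonicalTensor([np.random.rand(n, R) for n in shape])
T2 = CanonicalTensor([np.random.rand(n, R) for n in shape])
P = T1.hadamard(T2)
print(P.rank)                                           # 4: rank growth, hence low-rank truncation
print(P.entry((0, 1, 2)), T1.entry((0, 1, 2)) * T2.entry((0, 1, 2)))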
10th Working Conference on Uncertainty Quantification in Scientific Computing (WoCoUQ) | 2011
Hermann G. Matthies; Alexander Litvinenko; Oliver Pajonk; Bojana V. Rosić; Elmar Zander
Computational uncertainty quantification in a probabilistic setting is a special case of a parametric problem. Parameter-dependent state vectors lead, via association with a linear operator, to analogues of the covariance, its spectral decomposition, and the associated Karhunen-Loève expansion. From this one obtains a generalised tensor representation. The parameter in question may be a tuple of numbers, a function, a stochastic process, or a random tensor field. The tensor factorisation may be cascaded, leading to tensors of higher degree. When carried out on a discretised level, such factorisations in the form of low-rank approximations lead to very sparse representations of the high-dimensional quantities involved. Updating of uncertainty in the light of new information is an important part of uncertainty quantification. Formulated in terms of random variables instead of measures, the Bayesian update is a projection and allows the use of the tensor factorisations in this case as well.
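A minimal sketch of the discrete analogue described above, assuming a snapshot matrix whose columns are parameter-dependent state vectors: a truncated SVD yields the spectral decomposition of the sample covariance, the associated Karhunen-Loève modes, and a low-rank separated representation. Function names, the tolerance, and the test data are illustrative assumptions, not the paper's construction.

import numpy as np

def truncated_kle(snapshots, tol=1e-8):
    """snapshots: (n_dof, n_param) matrix whose columns are states u(p_j).
    Returns mean, spatial modes V (n_dof, r) and coefficients S (r, n_param)
    such that snapshots ~= mean + V @ S, i.e. u(p_j) ~= mean + sum_k V[:, k] * S[k, j]."""
    mean = snapshots.mean(axis=1, keepdims=True)
    U, s, Vt = np.linalg.svd(snapshots - mean, full_matrices=False)
    r = int(np.sum(s > tol * s[0]))               # truncation rank
    modes = U[:, :r]                              # KL modes (eigenvectors of the sample covariance)
    coeffs = s[:r, None] * Vt[:r, :]              # parameter-dependent coefficients
    return mean, modes, coeffs

# usage: 1000-dof states at 50 parameter values, intrinsically of rank about 5
A = np.random.rand(1000, 5) @ np.random.rand(5, 50)
mean, modes, coeffs = truncated_kle(A)
print(modes.shape, coeffs.shape)                  # (1000, r) and (r, 50)
print(np.linalg.norm(A - (mean + modes @ coeffs)) / np.linalg.norm(A))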
Archive | 2011
Hermann G. Matthies; Elmar Zander
Computational approaches to systems involving random fields or stochastic processes have to discretise these fields or processes. Compared to the deterministic case, this produces many additional variables in the computation, resulting in a very high-dimensional problem. Based on the conviction that the essential stochastic properties of the system lie close to some, albeit unknown, lower-dimensional manifold, one may try to approximate the response of the system by a data-sparse representation.
arXiv: Probability | 2016
Hermann G. Matthies; Elmar Zander; Bojana V. Rosić; Alexander Litvinenko; Oliver Pajonk
In a Bayesian setting, inverse problems and uncertainty quantification (UQ), the propagation of uncertainty through a computational (forward) model, are strongly connected. In the form of conditional expectation the Bayesian update becomes computationally attractive. We give a detailed account of this approach via conditional approximation, various approximations, and the construction of filters. Together with a functional or spectral approach for the forward UQ there is no need for time-consuming and slowly convergent Monte Carlo sampling. The developed sampling-free non-linear Bayesian update in the form of a filter is derived from the variational problem associated with conditional expectation. This formulation in general calls for further discretisation to make the computation possible, and we choose a polynomial approximation. After giving details on the actual computation in the framework of functional or spectral approximations, we demonstrate the workings of the algorithm on a number of examples of increasing complexity. Finally, we compare the linear and nonlinear Bayesian update in the form of a filter on some examples.
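The filter described above comes from the variational characterisation of the conditional expectation: phi(y) ~ E[q | y] is the map minimising the mean-square error, approximated here by a polynomial. The sketch below illustrates that idea with a small ensemble and ordinary least squares purely for demonstration; the paper itself works with functional/spectral (sampling-free) representations, and the forward map, noise level, and all names are assumptions.

import numpy as np

def polynomial_mmse_map(y, q, degree=2):
    """Fit phi(y) = sum_k c_k y^k minimising the mean-square error E[(q - phi(y))^2]."""
    Phi = np.vander(y, degree + 1, increasing=True)        # columns [1, y, y^2, ...]
    coeff, *_ = np.linalg.lstsq(Phi, q, rcond=None)
    return lambda z: np.vander(np.atleast_1d(z), degree + 1, increasing=True) @ coeff

# prior realisations of the parameter q and of the predicted measurement y = h(q) + noise
rng = np.random.default_rng(0)
q_prior = rng.normal(1.0, 0.5, 5000)
y_prior = q_prior**3 + rng.normal(0.0, 0.1, 5000)          # nonlinear forward map (assumed)

phi = polynomial_mmse_map(y_prior, q_prior, degree=3)
y_obs = 2.0                                                # actual measurement
q_post = q_prior + (phi(y_obs) - phi(y_prior))             # filter-form update of each realisation
print(q_post.mean(), q_post.std())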
arXiv: Probability | 2016
Hermann G. Matthies; Elmar Zander; Bojana V. Rosić; Alexander Litvinenko
When a mathematical or computational model is used to analyse some system, it is usual that some parameters, functions, or fields in the model are not known, and hence uncertain. These parametric quantities are then identified from actual observations of the response of the real system. In a probabilistic setting, Bayes's theory is the proper mathematical background for this identification process. The possibility of being able to compute a conditional expectation turns out to be crucial for this purpose. We show how this theoretical background can be used in an actual numerical procedure, and briefly discuss various numerical approximations.
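To make the role of the computable conditional expectation concrete, the following assumed sketch shows its simplest approximation, an affine map of the observation, which yields a Kalman-like update applied directly to the random variable. The ensemble representation and all names are illustrative, not the paper's procedure.

import numpy as np

def linear_update(q_prior, y_prior, y_obs):
    """Affine approximation of E[q | y]: update each realisation with a scalar gain."""
    cov_qy = np.cov(q_prior, y_prior)[0, 1]
    var_y = np.var(y_prior, ddof=1)
    K = cov_qy / var_y                           # scalar "Kalman gain"
    return q_prior + K * (y_obs - y_prior)       # update of the random variable itself

rng = np.random.default_rng(1)
q_prior = rng.normal(0.0, 1.0, 10000)
y_prior = 2.0 * q_prior + rng.normal(0.0, 0.5, 10000)   # linear observation model (assumed)
q_post = linear_update(q_prior, y_prior, y_obs=1.0)
print(q_post.mean(), q_post.std())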
Archive | 2018
Pradeep Kumar; Noemi Friedman; Elmar Zander; Rolf Radespiel
A mathematical tool developed for calibrating the model parameters of VRANS equations for modelling flows through porous media is evaluated. A total of six parameters are introduced in a volume-averaged RANS model to appropriately scale the impact of the porous medium on the overall flow. The calibration tool has been tested on a generic channel case and the results are compared with DNS simulations of the same configuration. The results show good agreement between the parameters obtained from the tool and a previously documented manual calibration.
Linear Algebra and its Applications | 2012
Hermann G. Matthies; Elmar Zander
PAMM | 2007
Elmar Zander; Hermann G. Matthies
The annual research report | 2014
Martin Eigel; Claude Jeffrey Gittelson; Christoph Schwab; Elmar Zander
Research report / Seminar für Angewandte Mathematik | 2013
Martin Eigel; Claude Jeffrey Gittelson; Christoph Schwab; Elmar Zander