
Publication


Featured research published by Johannes Lotz.


ACM Transactions on Mathematical Software | 2015

Algorithmic Differentiation of Numerical Methods: Tangent and Adjoint Solvers for Parameterized Systems of Nonlinear Equations

Uwe Naumann; Johannes Lotz; Klaus Leppkes; Markus Towara

We discuss software tool support for the algorithmic differentiation (AD), also known as automatic differentiation, of numerical simulation programs that contain calls to solvers for parameterized systems of n nonlinear equations. The local computational overhead and the additional memory requirement for the computation of directional derivatives or adjoints of the solution of the nonlinear system with respect to the parameters can quickly become prohibitive for large values of n. Both are reduced drastically by analytical (and symbolic) approaches to differentiation of the underlying numerical methods. Following the discussion of the proposed terminology, we develop the algorithmic formalism building on prior work by other colleagues and present an implementation based on the AD software dco/c++. A representative case study supports the theoretically obtained computational complexity results with practical runtime measurements.


International Conference on Conceptual Structures | 2013

Algorithmic Differentiation of a Complex C++ Code with Underlying Libraries

Max Sagebaum; Nicolas R. Gauger; Uwe Naumann; Johannes Lotz; Klaus Leppkes

Algorithmic differentiation (AD) is a mathematical concept that has evolved over the last decades into a robust and well-understood tool for the computation of derivatives. It can be applied to mathematical algorithms, codes for numerical simulation, and in general wherever derivatives are needed. In this paper we report on the algorithmic differentiation of the discontinuous Galerkin solver padge, a large and complex code written in C++ with underlying external libraries. Reports on successful applications of AD to large-scale codes are rare in the literature, and such applications are not yet state of the art. Most codes differentiated today are written in C or Fortran. The padge code was differentiated with the operator-overloading tool dco/c++ in forward as well as reverse mode. The differentiated code is validated and runs within the runtime margins expected for AD.


Archive | 2012

Hierarchical Algorithmic Differentiation: A Case Study

Johannes Lotz; Uwe Naumann; Jörn Ungermann

This case study in Algorithmic Differentiation (AD) discusses the semi-automatic generation of an adjoint simulation code in the context of an inverse atmospheric remote sensing problem. In-depth structural and performance analyses allow the runtime factor between the adjoint generated by overloading in C++ and the original forward simulation to be reduced to 3.5. The dense Jacobian matrix of the underlying problem is computed at the same cost. This is achieved by hierarchical AD, using adjoint mode locally for preaccumulation and exploiting interface contraction. For the given application, this approach yields a speed-up of more than 170 over black-box tangent-linear and adjoint mode. Furthermore, the memory consumption is reduced by a factor of 1,000 compared to applying black-box adjoint mode.


International Conference on Conceptual Structures | 2015

Second-order Tangent Solvers for Systems of Parameterized Nonlinear Equations

Niloofar Safiran; Johannes Lotz; Uwe Naumann

Forward mode algorithmic differentiation transforms implementations of multivariate vector functions as computer programs into first directional derivative (also: first-order tangent) code. Its reapplication yields higher directional derivative (higher-order tangent) code. Second derivatives play an important role in nonlinear programming. For example, second-order (Newton-type) nonlinear optimization methods promise faster convergence in the neighborhood of the minimum by taking second derivative information into account. Part of the objective function may be given implicitly as the solution of a system of n parameterized nonlinear equations. If the system parameters depend on the free variables of the objective, then second derivatives of the nonlinear system's solution with respect to those parameters are required. The local computational overhead for the computation of second-order tangents of the solution vector with respect to the parameters by algorithmic differentiation depends on the number of iterations performed by the nonlinear solver. This dependence can be eliminated by taking a second-order symbolic approach to the differentiation of the nonlinear system.
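The second-order symbolic approach amounts to differentiating the residual identity twice. A compact sketch in standard implicit-function notation (F_x = ∂F/∂x and so on, with a fixed direction ṗ; not necessarily the paper's exact derivation):

```latex
% First-order tangent: differentiate F(x(p),p) = 0 once along \dot{p}:
F_x \dot{x} + F_p \dot{p} = 0
  \quad\Rightarrow\quad
\dot{x} = -F_x^{-1} F_p \,\dot{p}.

% Second-order tangent: differentiate again (constant direction, \ddot{p} = 0):
F_{xx}(\dot{x},\dot{x}) + 2\,F_{xp}(\dot{x},\dot{p}) + F_{pp}(\dot{p},\dot{p}) + F_x \ddot{x} = 0
  \quad\Rightarrow\quad
\ddot{x} = -F_x^{-1}\bigl[F_{xx}(\dot{x},\dot{x}) + 2\,F_{xp}(\dot{x},\dot{p}) + F_{pp}(\dot{p},\dot{p})\bigr].
```

Both tangents require only linear solves with F_x evaluated at the converged solution, which is why the overhead no longer depends on the iteration count of the nonlinear solver.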


International Conference on Parallel Processing | 2013

Discrete adjoints of PETSc through dco/c++ and adjoint MPI

Johannes Lotz; Uwe Naumann; Max Sagebaum; Michel Schanen

PETSc's [1] robustness, scalability, and portability make it the foundation of various parallel implementations of numerical simulation codes. We formulate a least-squares problem using a PETSc implementation as the model function and rely on adjoint mode Algorithmic Differentiation (AD) [2] for the accumulation of the derivative information. Various AD tools exist that apply the adjoint model to a given C/C++ code, but none is able to differentiate MPI-enabled [3] code. We solved this by combining dco/c++ and the Adjoint MPI library, leading to a fully discrete adjoint implementation of PETSc. We want to underline that this work differs from accumulating derivative information through AD for PETSc algorithms (see, e.g., [4]). We compute derivative information of PETSc itself, opening up the possibility of an enclosing optimization problem (as needed, e.g., by [5]).


International Conference on Conceptual Structures | 2016

A Case Study in Adjoint Sensitivity Analysis of Parameter Calibration

Johannes Lotz; Marc Schwalbach; Uwe Naumann

Adjoint sensitivity computation for parameter estimation problems is a widely used technique in computational science and engineering for efficiently retrieving derivatives of a cost functional with respect to parameters. Those derivatives can be used, e.g., for sensitivity analysis, optimization, or robustness analysis. Deriving and implementing adjoint code is an error-prone, non-trivial task that can be avoided by using Algorithmic Differentiation (AD) software. Generating adjoint code by AD software has the downside of usually requiring a huge amount of memory as well as a non-optimal run time. In this article, we couple two approaches for achieving a robust and efficient adjoint code: symbolically derived adjoint formulations and AD. Comparisons are carried out for a real-world case study originating from the remote atmospheric sensing simulation software JURASSIC, developed at the Institute of Energy and Climate Research (Stratosphere), Forschungszentrum Jülich. We show that the coupled approach outperforms the fully algorithmic approach by AD in terms of run time and memory requirement, and argue that this can be achieved while still preserving the desirable feature of AD being automatic.
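The symbolically derived adjoint of such a problem can be sketched as follows (standard implicit-function reasoning, not necessarily the paper's notation): for a cost functional J(x(p)) constrained by the residual F(x(p), p) = 0,

```latex
% Adjoint (Lagrange multiplier) system: one linear solve with F_x^T,
% independent of the number of iterations of the nonlinear solver.
F_x^{\mathsf T} \lambda = \Bigl(\frac{\partial J}{\partial x}\Bigr)^{\!\mathsf T},
\qquad
\bar{p} = -\,F_p^{\mathsf T} \lambda .
```

AD supplies the required products with F_x and F_p automatically, while the single linear solve replaces the taped reversal of the entire nonlinear iteration; this is the coupling of symbolic adjoint formulations with AD described above.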


International Conference on Conceptual Structures | 2016

Algorithmic Differentiation of Numerical Methods

Niloofar Safiran; Johannes Lotz; Uwe Naumann

Adjoint mode algorithmic (also known as automatic) differentiation (AD) transforms implementations of multivariate vector functions as computer programs into first-order adjoint code. Its reapplication or combination with tangent mode AD yields higher-order adjoint code. Second derivatives play an important role in nonlinear programming. For example, second-order (Newton-type) nonlinear optimization methods promise faster convergence in the neighborhood of the minimum by taking second derivative information into account. The adjoint mode is of particular interest in large-scale gradient-based nonlinear optimization because its computational cost is independent of the number of free variables. Part of the objective function may be given implicitly as the solution of a system of n parameterized nonlinear equations. If the system parameters depend on the free variables of the objective, then second derivatives of the nonlinear system's solution with respect to those parameters are required. The local computational overhead as well as the additional memory requirement for the computation of second-order adjoints of the solution vector with respect to the parameters by AD depend on the number of iterations performed by the nonlinear solver. This dependence can be eliminated by taking a symbolic approach to the differentiation of the nonlinear system.


International Conference on Conceptual Structures | 2015

Higher-order Discrete Adjoint ODE Solver in C++ for Dynamic Optimization

Johannes Lotz; Uwe Naumann; Ralf Hannemann-Tamás; T. Ploch; Alexander Mitsos


Atmospheric Measurement Techniques Discussions | 2011

A 3-D tomographic trajectory retrieval for the air-borne limb-imager GLORIA

Jörn Ungermann; J. Blank; Johannes Lotz; Klaus Leppkes; T. Guggenmoser; Martin Kaufmann; Peter Preusse; Uwe Naumann; Martin Riese


2nd International BioSC Symposium "Towards an integrated bioeconomy" | 2017

Incremental multi-scale and multi-disciplinary modeling of processes in bioeconomy

Wolfgang Wiechert; T. Ploch; Uwe Naumann; Xiao Zhao; Alexander Mitsos; Stephan Noack; Ralf Hannemann-Tamás; J. Hüser; Eric von Lieres; Johannes Lotz

Collaboration


Dive into Johannes Lotz's collaborations.

Top Co-Authors

Uwe Naumann | RWTH Aachen University
T. Ploch | RWTH Aachen University
Eric von Lieres | Forschungszentrum Jülich
Xiao Zhao | Forschungszentrum Jülich
Max Sagebaum | Kaiserslautern University of Technology
Nicolas R. Gauger | Kaiserslautern University of Technology