Publication


Featured research published by Bernd Kummer.


Archive | 1992

Newton’s Method Based on Generalized Derivatives for Nonsmooth Functions: Convergence Analysis

Bernd Kummer

This paper presents necessary and sufficient conditions for the convergence of Newton's method based on generalized derivatives. These conditions require uniform injectivity of the derivatives as well as uniform high-order approximation of the original locally Lipschitz function along rays through the solution. Our approach allows us to work with approximate solutions of the Newton subproblems and to use such concepts of derivatives for nonsmooth functions, multivalued or not, as directional and B-derivatives, contingent derivatives, generalized Jacobians and others. Furthermore, we ensure solvability of the subproblems via surjectivity of the derivatives and verify a Kantorovich-type convergence theorem.
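As a rough illustration of the kind of iteration meant, the following sketch replaces the classical Jacobian by an element of a generalized (Clarke) Jacobian; the map F, the subgradient selection and the tolerances are my own illustrative choices, not taken from the paper.

```python
import numpy as np

def F(x):
    # A piecewise linear (hence locally Lipschitz, nonsmooth) map R^2 -> R^2
    return np.array([2.0 * x[0] + abs(x[1]) - 1.0,
                     abs(x[0]) + 2.0 * x[1] - 1.0])

def gen_jacobian(x):
    # One element of the generalized (Clarke) Jacobian of F at x:
    # pick a subgradient +1 or -1 for each |.| term.
    s0 = 1.0 if x[0] >= 0 else -1.0
    s1 = 1.0 if x[1] >= 0 else -1.0
    return np.array([[2.0, s1],
                     [s0, 2.0]])

def newton_generalized(x0, tol=1e-12, max_iter=50):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            break
        V = gen_jacobian(x)               # V_k taken from the generalized derivative at x_k
        x = x - np.linalg.solve(V, Fx)    # Newton step: solve V_k d = F(x_k), set x_{k+1} = x_k - d
    return x

print(newton_generalized([2.0, -2.0]))    # -> approximately [0.3333, 0.3333]
```

Here every generalized Jacobian is nonsingular (the uniform injectivity the abstract asks for), so the iteration terminates at the unique zero of F after finitely many steps.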


Journal of Optimization Theory and Applications | 1991

Lipschitzian inverse functions, directional derivatives, and applications in C^{1,1} optimization

Bernd Kummer

The paper shows that Thibault's limit sets allow an iff-characterization of local Lipschitzian invertibility in finite dimension. We consider these sets as directional derivatives and extend the calculus in a way that can be used to clarify whether critical points are strongly stable in C^{1,1} optimization problems.
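A hedged sketch of the kind of limit set used here as a directional derivative of a locally Lipschitz f; the notation is mine, not the paper's.

```latex
\[
  T f(\bar{x}; u) \;=\;
  \Bigl\{\, v \;:\; v = \lim_{k\to\infty}
    \tfrac{1}{t_k}\bigl(f(x_k + t_k u) - f(x_k)\bigr),
    \ \ x_k \to \bar{x},\ t_k \downarrow 0 \,\Bigr\}.
\]
```

Because the base points x_k vary as well, such limit sets behave like strict derivatives, which is what makes an iff-characterization of Lipschitzian invertibility possible.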


Mathematical Programming | 2008

Optimization methods and stability of inclusions in Banach spaces

Diethard Klatte; Bernd Kummer

Our paper deals with the interrelation of optimization methods and Lipschitz stability of multifunctions in arbitrary Banach spaces. Roughly speaking, we show that linear convergence of several first-order methods and Lipschitz stability mean the same. In particular, we characterize calmness and the Aubin property by uniformly (with respect to certain starting points) linear convergence of descent methods and approximate projection methods. So we obtain, e.g., solution methods (for solving equations or variational problems) which require calmness only. The relations of these methods to several known basic algorithms are discussed, and errors in the subroutines as well as deformations of the given mappings are permitted. We also recall how such deformations are related to standard algorithms like barrier, penalty or regularization methods in optimization.
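A minimal sketch, loosely in the spirit of such residual-driven projection-type steps; the concrete update rule, step size, error-bound formulation and toy set below are my own assumptions, not the paper's algorithms.

```python
import numpy as np

def approx_projection_solve(g, grad_g, x0, step=1.0, tol=1e-10, max_iter=200):
    """Find a point with g(x) <= 0 by residual-scaled gradient steps.

    Under a calmness-type error bound dist(x, S) <= L * max(g(x), 0) near the
    solution set S, such steps reduce the distance to S at a linear rate.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r = max(g(x), 0.0)                       # residual = violation of the inclusion
        if r < tol:
            break
        d = grad_g(x)
        x = x - step * (r / np.dot(d, d)) * d    # approximate projection step
    return x

# Toy instance: S = {x in R^2 : ||x||^2 - 1 <= 0} (the unit disc).
g = lambda x: float(x @ x - 1.0)
grad_g = lambda x: 2.0 * x
print(approx_projection_solve(g, grad_g, np.array([3.0, 4.0])))  # lands on/near the unit circle
```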


Journal of Mathematical Analysis and Applications | 1991

An implicit-function theorem for C^{0,1}-equations and parametric C^{1,1}-optimization

Bernd Kummer

The implicit-function theorem deals with the solutions of the equation F(x, t) = a for locally Lipschitz functions F from R^{n+m} into R^n. The existence of a locally well-defined and Lipschitzian solution function x = G(a, t) will be completely characterized in terms of certain multivalued directional derivatives of F, which determine the corresponding derivatives of G in a simple way. Our directional derivatives are nothing but L. Thibault's limit sets (Ann. Mat. Pura Appl. (4) 125, 1980, 157–192), which were introduced to extend Clarke's calculus to functions in abstract spaces. For parametric C^{1,1}-optimization problems, we study the critical point map and the associated critical values, and derive first- and second-order formulas, respectively.
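A tiny illustration of the situation the theorem addresses (the equation below is my own example, not the paper's): a locally Lipschitz equation can have a unique, globally Lipschitz solution function that is nowhere-classically-differentiable along a kink.

```python
# Illustrative C^{0,1} equation in one variable: F(x, t) = 2*x + |x| - t.
# For every (a, t), F(x, t) = a has the unique Lipschitz solution x = G(a, t),
# which is piecewise linear in (a, t) and not differentiable along a + t = 0.
F = lambda x, t: 2.0 * x + abs(x) - t
G = lambda a, t: (a + t) / 3.0 if a + t >= 0 else (a + t)

for a, t in [(1.0, 2.0), (-3.0, 1.0), (0.0, 0.0)]:
    x = G(a, t)
    print(f"a={a}, t={t}: x={x:.4f}, residual F(x,t)-a = {F(x, t) - a:+.1e}")
```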


Archive | 1985

Stability Properties of Infima and Optimal Solutions of Parametric Optimization Problems

Diethard Klatte; Bernd Kummer

In the analysis of parametric optimization problems it is of great interest to explore certain stability properties of the optimal value function and of the optimal set mapping (or some selection function of this mapping): continuity, smoothness, directional differentiability, Lipschitz continuity and the like. For a survey of this field we refer to comprehensive treatments of various aspects of such questions in the recent works of Fiacco (1983), Bank et al. (1982) and Rockafellar (1982).


Optimization | 1999

Metric regularity: characterizations, nonsmooth variations and successive approximation

Bernd Kummer

Metric regularity of (multi-)functions is characterized in terms of some uniform lower semicontinuity as well as by means of Ekeland points of related functionals. Specializations and consequences are studied for stability conditions via co-derivatives and contingent derivatives. Under metric regularity, we show that persistence and Lipschitzian behavior of parametric solutions can be constructively handled by a successive approximation scheme. This permits a simple approach to implicit functions with multivalued inverse and connects iteration methods of quite different types. In particular, successive approximation may be trivially applied for solving regular piecewise smooth equations.
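A minimal fixed-point sketch of what "successive approximation for a regular piecewise smooth equation" can look like in the simplest case; the function f and the scheme below are illustrative assumptions (the paper treats the general multivalued setting).

```python
def f(x):
    # A regular piecewise smooth (here piecewise linear) function on R.
    return x + 0.5 * abs(x)

def successive_approximation(y, x0=0.0, tol=1e-12, max_iter=100):
    """Solve f(x) = y by the fixed-point scheme x_{k+1} = x_k - (f(x_k) - y).

    Only evaluations of f are used (no derivative of the nonsmooth part);
    the scheme contracts here because each smooth piece of f differs from
    the identity by a slope of at most 1/2.
    """
    x = x0
    for _ in range(max_iter):
        step = f(x) - y
        if abs(step) < tol:
            break
        x = x - step
    return x

print(successive_approximation(1.0))   # -> 2/3, since f(2/3) = 1
```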


Computational Optimization and Applications | 1999

Generalized Kojima-Functions and Lipschitz Stability of Critical Points

Diethard Klatte; Bernd Kummer

In this paper we consider systems of equations which are defined by nonsmooth functions of a special structure. Functions of this type are adapted from Kojima's form of the Karush–Kuhn–Tucker conditions for C^2 optimization problems. We shall show that such systems often represent conditions for critical points of variational problems (nonlinear programs, complementarity problems, generalized equations, equilibrium problems and others). Our main purpose is to point out how different concepts of generalized derivatives lead to characterizations of different Lipschitz properties of the critical point or stationary solution set maps.
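A sketch of one standard form of Kojima's function for an inequality-constrained program; the exact form used in the paper may differ, and the toy problem, names and data below are my own.

```python
import numpy as np

def kojima_function(x, y, grad_f, g, grad_g):
    """One standard form of Kojima's function for min f(x) s.t. g_i(x) <= 0:
    F1 = grad f(x) + sum_i max(y_i, 0) * grad g_i(x),   F2_i = g_i(x) - min(y_i, 0).
    Zeros (x, y) correspond to KKT points with multipliers max(y_i, 0)."""
    y = np.asarray(y, dtype=float)
    yp, ym = np.maximum(y, 0.0), np.minimum(y, 0.0)
    F1 = grad_f(x) + grad_g(x).T @ yp
    F2 = g(x) - ym
    return np.concatenate([F1, F2])

# Toy problem: min (x1-2)^2 + (x2-1)^2  s.t.  x1 + x2 - 1 <= 0,  -x1 <= 0
grad_f = lambda x: np.array([2.0 * (x[0] - 2.0), 2.0 * (x[1] - 1.0)])
g      = lambda x: np.array([x[0] + x[1] - 1.0, -x[0]])
grad_g = lambda x: np.array([[1.0, 1.0], [-1.0, 0.0]])   # row i = grad g_i

# The KKT point x = (1, 0) with multipliers (2, 0) corresponds to y = (2, -1):
print(kojima_function(np.array([1.0, 0.0]), np.array([2.0, -1.0]),
                      grad_f, g, grad_g))   # -> zeros (up to rounding)
```

The resulting map is nonsmooth only through max(y, 0) and min(y, 0), which is exactly the "special structure" that makes generalized derivatives of the system tractable.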


Siam Journal on Optimization | 2002

Constrained Minima and Lipschitzian Penalties in Metric Spaces

Diethard Klatte; Bernd Kummer

It is well known that a local minimizer of a constrained optimization problem with Lipschitzian objective is a free local minimizer of an assigned penalty function if the constraints satisfy an appropriate regularity condition. We use an upper Lipschitz property (L1) as the regularity concept and present locally Lipschitz penalty functions, defined on the whole space, for arbitrary constraint maps of this type. We give conditions under which the maximum of the penalties of finitely many multifunctions is a valid penalty function for the intersection of these multifunctions. Further, the same statements will be derived under other regularity assumptions, namely for calm or pseudo-Lipschitz constraints which violate (L1), by showing that some submapping of a calm map always has property (L1) and possesses (locally) the same penalties. In this way, our penalizations induce, in a unified manner and via known properties of free local minimizers of Lipschitz functions only, primal and dual necessary conditions for these basic notions of regularity.
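A toy sketch of the basic exact-penalty effect the abstract builds on; the one-dimensional problem, the distance-based penalty and the grid search below are illustrative assumptions, not the paper's construction.

```python
import numpy as np

# Toy instance: minimize f(x) = x over M = {x : x >= 1}; the constrained
# minimizer is x* = 1 and f has Lipschitz constant 1.
f = lambda x: x
dist_M = lambda x: max(1.0 - x, 0.0)          # distance to M on the real line

def penalty(x, c):
    # Locally Lipschitz penalty function defined on the whole space
    return f(x) + c * dist_M(x)

xs = np.linspace(-2.0, 3.0, 10001)
for c in (0.5, 2.0, 10.0):
    x_best = xs[np.argmin([penalty(x, c) for x in xs])]
    print(f"c = {c:5.1f}: free minimizer of the penalty ~ {x_best:.3f}")
# For c > 1 the free minimizer of the penalty coincides with the constrained
# minimizer x* = 1; for c = 0.5 the penalty keeps decreasing to the left, so
# the grid minimum sits at the boundary point -2.
```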


Siam Journal on Optimization | 2005

Strong Lipschitz Stability of Stationary Solutions for Nonlinear Programs and Variational Inequalities

Diethard Klatte; Bernd Kummer

The stationary solution map X of a canonically perturbed nonlinear program or variational condition is studied. We primarily aim at characterizing when X is locally single-valued and Lipschitz near some stationary point x0 of an initial problem, where our focus is on characterizations which are explicitly given in terms of the original functions and assigned quadratic problems. Since such criteria are closely related to a nonsingularity property of the strict graphical derivative of X, explicit formulas for this derivative are presented, too. It turns out that even for convex polynomial problems our stability (and the Aubin property) does not depend only on the derivatives, up to some fixed order, of the problem functions at x0. This is in contrast to various other stability concepts. Further, we completely clarify the relations to Kojima's strong stability and present simplifications for linearly and certain linearly-quadratically constrained problems, convex programs, and for the map of global minimizers as well.


Optimization | 1977

Global stability of optimization problems

Bernd Kummer

This paper presents necessary and sufficient conditions for the stability of a parametric optimization problem. Here M is a subset of a metric space X, λ is an element of some set Λ "with convergence", and f is a functional defined on the Cartesian product X×Λ. These conditions concern the upper and lower semicontinuity of the function and the upper semicontinuity of the point-to-set mapping involved. The set-convergence used is weaker than the convergence induced by the Hausdorff metric. As conclusions, theorems on the relationship between the upper semicontinuity of f^0 and f̂, as well as sufficient stability conditions for some general problems (especially quasiconvex programming), are obtained. The necessity of certain assumptions is illustrated by appropriate examples.

Collaboration


Dive into Bernd Kummer's collaboration.

Top Co-Authors

Christian Grossmann (Dresden University of Technology)
Christian Hess (Paris Dauphine University)
Lionel Thibault (University of Montpellier)
Gerald Beer (California State University)