Network


Latest external collaboration at the country level. Dive into details by clicking on the dots.

Hotspot


Dive into the research topics where Vladimir F. Demyanov is active.

Publication


Featured research published by Vladimir F. Demyanov.


Optimization | 1983

On quasidifferentiable mappings

Vladimir F. Demyanov; A.M. Rubinov

In nondifferentiable optimization an important role is played by the subdifferential: a set of linear functionals which, in one sense or another, locally approximates a given function. For a convex function the subdifferential enables one to describe necessary conditions for a minimum, to compute directional derivatives, and to find steepest descent directions. That is why many attempts have been made to extend the concept of the subdifferential to nonconvex nonsmooth functions. It has been shown by the authors that for a rather large family of functions it is useful and natural to consider as an approximating tool not a single set (the subdifferential) but a pair of sets (the quasidifferential). The family of quasidifferentiable functions is a linear space closed with respect to all algebraic operations as well as the operations of taking pointwise maximum and minimum. In this paper a short survey of some properties of quasidifferentiable functions is presented. The notion of quasidifferentiable mappings is i...
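For orientation, a minimal sketch of the standard definition (the notation below is assumed, not quoted from the abstract): a function f is quasidifferentiable at a point x if its directional derivative exists and can be written as

f'(x; g) = \max_{v \in \underline{\partial} f(x)} \langle v, g \rangle + \min_{w \in \overline{\partial} f(x)} \langle w, g \rangle \quad \text{for every direction } g,

where \underline{\partial} f(x) and \overline{\partial} f(x) are convex compact sets; the pair Df(x) = [\underline{\partial} f(x), \overline{\partial} f(x)] is called the quasidifferential of f at x. For a convex function one may take \underline{\partial} f(x) to be the ordinary subdifferential and \overline{\partial} f(x) = \{0\}, so the construction generalizes the convex case.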


Journal of Global Optimization | 2002

A method of truncated codifferential with application to some problems of cluster analysis

Vladimir F. Demyanov; Adil M. Bagirov; Alexander M. Rubinov

A method of truncated codifferential descent for minimizing continuously codifferentiable functions is suggested. The convergence of the method is studied. Results of numerical experiments are presented. Applications of the suggested method to some problems of cluster analysis are discussed. In the numerical experiments the Wisconsin Diagnostic Breast Cancer database was used.
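To illustrate why clustering leads to a nonsmooth objective of the kind this method targets (a standard formulation assumed here, not quoted from the paper): given data points a^1, \dots, a^m and cluster centres x^1, \dots, x^k, the minimum-sum-of-squares clustering function is

f(x^1, \dots, x^k) = \sum_{j=1}^{m} \min_{1 \le i \le k} \| x^i - a^j \|^2.

The inner minimum makes f continuous but nondifferentiable at points where a data point is equidistant from two or more centres, which is where codifferential-based descent applies.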


Optimization Methods & Software | 1998

Exact penalization via Dini and Hadamard conditional derivatives

Vladimir F. Demyanov; G. Di Pillo; Francisco Facchinei

Exact penalty functions for nonsmooth constrained optimization problems are analyzed by using the notion of the (Dini) Hadamard directional derivative with respect to the constraint set. Weak conditions are given guaranteeing equivalence of the sets of stationary, global minimum, and local minimum points of the constrained problem and of the penalty function.
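A brief sketch of the construction being analyzed (standard notation, not taken verbatim from the paper): for the problem of minimizing f over a constraint set \Omega, one forms the penalty function

F_\lambda(x) = f(x) + \lambda \, d_\Omega(x), \qquad d_\Omega(x) = \inf_{y \in \Omega} \| x - y \|,

and calls the penalty exact if there is a threshold \lambda^* such that for every \lambda > \lambda^* the minimizers of F_\lambda coincide with those of the constrained problem. Since d_\Omega is nonsmooth, directional derivatives of Dini and Hadamard type taken with respect to the constraint set are the natural tools for the analysis.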


Optimization | 2006

Optimality conditions in terms of upper and lower exhausters

Vladimir F. Demyanov; Vera A. Roshchina

The notions of exhaustive families of upper convex and lower concave approximations (in the sense of Pshenichnyi) were introduced by Rubinov. For some classes of nonsmooth functions these tools turned out to be very productive and constructive (e.g., in the case of quasidifferentiable functions). The dual tools – the upper exhauster and the lower exhauster – can be employed to describe optimality conditions and to find directions of steepest ascent and descent. If a proper exhauster is known (for minimality conditions we need an upper exhauster, while for maximality conditions a lower exhauster is required), the above problems are reduced to the problem of finding the points of convex sets nearest to the origin. If we study, e.g., the minimization problem and only a lower exhauster is available, it has to be converted into an upper one. In the present article it is shown how to use a lower exhauster to obtain conditions for a minimum without converting it into an upper exhauster.
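For context, the representations behind these statements (a sketch under the usual definitions, not quoted from the paper): a positively homogeneous function h, for instance the directional derivative h(g) = f'(x; g), has an upper exhauster E^* and a lower exhauster E_* consisting of convex compact sets if

h(g) = \inf_{C \in E^*} \max_{v \in C} \langle v, g \rangle, \qquad h(g) = \sup_{C \in E_*} \min_{w \in C} \langle w, g \rangle.

In these terms the condition for a minimum, h(g) \ge 0 for all g, is equivalent to 0 \in C for every C \in E^*, i.e. it is naturally stated via the upper exhauster; this is exactly why a lower exhauster has either to be converted or, as in this article, used directly.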


Optimization | 2008

Exhausters and subdifferentials in non-smooth analysis

Vladimir F. Demyanov; Vera Roshchina

Non-smooth analysis emerged in the 1960s and is still gaining momentum, developing new tools and techniques and covering new areas of application. One of the notions of non-smooth analysis is that of the exhauster. The exhauster represents a dual construction in non-smooth analysis. The relationships between upper and lower exhausters and various subdifferentials of non-smooth functions are discussed in this article. It is shown that exhausters are closely related to other non-smooth tools, such as the Michel–Penot, Clarke, Gâteaux and Fréchet subdifferentials. Formulae for computing these subdifferentials by means of exhausters are obtained. The relations obtained are all in the form of equalities, i.e. a calculus for computing the mentioned subdifferentials by means of exhausters is provided.


Journal of Global Optimization | 2013

Proper and adjoint exhausters in nonsmooth analysis: optimality conditions

M. E. Abbasov; Vladimir F. Demyanov

The notions of upper and lower exhausters represent generalizations of the notions of exhaustive families of upper convex and lower concave approximations (u.c.a., l.c.a.). The notions of u.c.a.’s and l.c.a.’s were introduced by Pshenichnyi (Convex Analysis and Extremal Problems, Series in Nonlinear Analysis and its Applications, 1980), while the notions of exhaustive families of u.c.a.’s and l.c.a.’s were described by Demyanov and Rubinov in Nonsmooth Problems of Optimization Theory and Control, Leningrad University Press, Leningrad, 1982. These notions allow one to solve the problem of optimization of an arbitrary function by means of Convex Analysis, thus essentially extending the area of application of Convex Analysis. In terms of exhausters it is possible to describe extremality conditions, and it turns out that conditions for a minimum are expressed via an upper exhauster while conditions for a maximum are formulated in terms of a lower exhauster (Abbasov and Demyanov (2010), Demyanov and Roshchina (Appl Comput Math 4(2): 114–124, 2005), Demyanov and Roshchina (2007), Demyanov and Roshchina (Optimization 55(5–6): 525–540, 2006)). This is why an upper exhauster is called a proper exhauster for minimization problems, while a lower exhauster is called a proper one for maximization problems. The results obtained provide a simple geometric interpretation and allow one to construct steepest descent and ascent directions. Until recently, the problem of expressing extremality conditions in terms of adjoint exhausters remained open. Demyanov and Roshchina (Appl Comput Math 4(2): 114–124, 2005), Demyanov and Roshchina (Optimization 55(5–6): 525–540, 2006) were the first to derive such conditions. However, using the conditions obtained (unlike the conditions expressed in terms of proper exhausters), it was not possible to find directions of descent and ascent. In Abbasov (2011) new extremality conditions in terms of adjoint exhausters were discovered. In the present paper, a different proof of these conditions is given and it is shown how to find steepest descent and ascent directions in terms of adjoint exhausters. The results obtained open the way to constructing numerical methods based on the usage of adjoint exhausters, thus avoiding the necessity of converting the adjoint exhauster into a proper one.
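A minimal sketch of how a proper (upper) exhauster yields descent directions in the unconstrained case (standard construction under the definitions above, not quoted from the paper): if the minimum condition fails, i.e. 0 \notin C_0 for some C_0 \in E^*, let v_0 be the point of C_0 nearest to the origin. For g_0 = -v_0 / \|v_0\| one then has

f'(x; g_0) \le \max_{v \in C_0} \langle v, g_0 \rangle \le -\|v_0\| < 0,

so g_0 is a descent direction, and the steepest descent direction is obtained by choosing a set C_0 \in E^* whose distance from the origin is largest (assuming this distance is attained over the family). It is the absence of such a direct geometric step for adjoint exhausters that the conditions discussed here address.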


Journal of Global Optimization | 2008

Exhausters, optimality conditions and related problems

Vladimir F. Demyanov; Vera Roshchina

The notions of exhausters were introduced in (Demyanov, Exhauster of a positively homogeneous function, Optimization 45, 13–29 (1999)). These dual tools (upper and lower exhausters) can be employed to describe optimality conditions and to find directions of steepest ascent and descent for a very wide range of nonsmooth functions. Just as important, exhausters enjoy a very good calculus (in the form of equalities). In the present paper we review the constrained and unconstrained optimality conditions in terms of exhausters, introduce necessary and sufficient conditions for Lipschitz continuity and quasidifferentiability, and also present some new results on relationships between exhausters and other nonsmooth tools (such as the Clarke, Michel–Penot and Fréchet subdifferentials).


Optimization | 2011

Exact penalty functions in isoperimetric problems

Vladimir F. Demyanov; G. Sh. Tamasyan

It was demonstrated earlier, using the so-called main (or simplest) problem of the Calculus of Variations, that the Theory of Exact Penalties allows one not only to derive fundamental results of the Calculus of Variations but also to construct new direct numerical methods for solving variational problems, based on the notions of subgradient and hypogradient of the exact penalty function (which is essentially nonsmooth even if all the initial data are smooth). In this article Exact Penalties are used to solve isoperimetric problems of the Calculus of Variations. New direct numerical methods are described (e.g. the method of hypodifferential descent). Several numerical examples are discussed.
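A sketch of the setting, with an illustrative penalty (the specific form below is an assumption for illustration, not necessarily the one used in the paper): in an isoperimetric problem one minimizes

I(y) = \int_a^b f(x, y, y') \, dx

over functions y with prescribed boundary values, subject to a constraint of the form \int_a^b g(x, y, y') \, dx = c. A penalty function such as

\Phi_\lambda(y) = I(y) + \lambda \left| \int_a^b g(x, y, y') \, dx - c \right|

is nonsmooth because of the absolute value even when f and g are smooth, which is why descent methods built on subgradients and hypodifferentials, such as the hypodifferential descent mentioned above, are the appropriate tools.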


Optimization Methods & Software | 2002

Minmaxmin problems revisited

Alexey Demyanov; Vladimir F. Demyanov; V. N. Malozemov

The following problem is discussed: Find a (constrained or unconstrained) minimizer of the function


Journal of Global Optimization | 2000

Conditions for an Extremum in Metric Spaces

Vladimir F. Demyanov

Collaboration


Dive into Vladimir F. Demyanov's collaboration.

Top Co-Authors

M. E. Abbasov, Saint Petersburg State University
G. Sh. Tamasyan, Saint Petersburg State University
V. N. Malozemov, Saint Petersburg State University
Veronika V. Demyanova, Saint Petersburg State University
Vladimir V. Karelin, Saint Petersburg State University
Vera Roshchina, Federation University Australia
Alexey Demyanov, Saint Petersburg State University
Vera A. Roshchina, Saint Petersburg State University