
Publication


Featured research published by M. E. Abbasov.


Journal of Global Optimization | 2013

Proper and adjoint exhausters in nonsmooth analysis: optimality conditions

M. E. Abbasov; Vladimir F. Demyanov

The notions of upper and lower exhausters represent generalizations of the notions of exhaustive families of upper convex and lower concave approximations (u.c.a., l.c.a.). The notions of u.c.a.'s and l.c.a.'s were introduced by Pshenichnyi (Convex Analysis and Extremal Problems, Series in Nonlinear Analysis and its Applications, 1980), while the notions of exhaustive families of u.c.a.'s and l.c.a.'s were described by Demyanov and Rubinov (Nonsmooth Problems of Optimization Theory and Control, Leningrad University Press, Leningrad, 1982). These notions allow one to solve the problem of optimizing an arbitrary function by means of Convex Analysis, thus essentially extending its area of application. In terms of exhausters it is possible to describe extremality conditions: it turns out that conditions for a minimum are expressed via an upper exhauster, while conditions for a maximum are formulated in terms of a lower exhauster (Abbasov and Demyanov (2010), Demyanov and Roshchina (Appl Comput Math 4(2): 114–124, 2005), Demyanov and Roshchina (2007), Demyanov and Roshchina (Optimization 55(5–6): 525–540, 2006)). This is why an upper exhauster is called a proper exhauster for minimization problems, while a lower exhauster is called a proper one for maximization problems. The conditions obtained provide a simple geometric interpretation and allow one to construct steepest descent and ascent directions.

Until recently, the problem of expressing extremality conditions in terms of adjoint exhausters remained open. Demyanov and Roshchina (Appl Comput Math 4(2): 114–124, 2005; Optimization 55(5–6): 525–540, 2006) were the first to derive such conditions. However, the conditions obtained (unlike those expressed in terms of proper exhausters) did not make it possible to find directions of descent and ascent. In Abbasov (2011), new extremality conditions in terms of adjoint exhausters were discovered. In the present paper, a different proof of these conditions is given, and it is shown how to find steepest descent and ascent directions in terms of adjoint exhausters. The results obtained open the way to constructing numerical methods based on the use of adjoint exhausters, thus avoiding the need to convert the adjoint exhauster into a proper one.
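For orientation, the standard definitions from this literature can be sketched as follows (notation assumed here, not quoted from the paper; min and max are written assuming they are attained). If h(g) = f'(x; g) denotes the directional derivative of f at x, an upper exhauster E* and a lower exhauster E_* are families of convex compact sets satisfying

```latex
% directional derivative represented via upper and lower exhausters
h(g) \;=\; \min_{C \in E^{*}} \max_{v \in C} \langle v, g \rangle
     \;=\; \max_{C \in E_{*}} \min_{w \in C} \langle w, g \rangle ,

% necessary extremality conditions at the point x
\text{minimum: } \; 0 \in C \;\; \forall\, C \in E^{*}, \qquad
\text{maximum: } \; 0 \in C \;\; \forall\, C \in E_{*} .
```

This is the sense in which the minimum condition involves the upper exhauster (the proper one for minimization) and the maximum condition the lower exhauster.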


Proceedings of the Steklov Institute of Mathematics | 2010

Extremum conditions for a nonsmooth function in terms of exhausters and coexhausters

M. E. Abbasov; Vladimir F. Demyanov

The notions of upper and lower exhausters and coexhausters are discussed, and necessary conditions for an unconstrained extremum of a nonsmooth function are derived. The necessary conditions for a minimum are formulated in terms of an upper exhauster (coexhauster), and the necessary conditions for a maximum are formulated in terms of a lower exhauster (coexhauster). This raises the problem of transforming an upper exhauster (coexhauster) into a lower one and vice versa. The transformation is carried out by means of a conversion operation (converter). Second-order approximations obtained with the help of second-order (upper and lower) coexhausters are considered. It is shown how a second-order upper coexhauster can be converted into a lower coexhauster and vice versa. This problem is reduced to using a first-order conversion operator, but in a space of higher dimension. The result obtained allows one to construct second-order (Newton-type) methods for the optimization of nonsmooth functions.
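In the notation commonly used for coexhausters (a sketch under assumed notation, not taken from this paper), an upper coexhauster is a family of convex compact sets whose elements are pairs [a, v] of a scalar and a vector, yielding a first-order approximation of the increment that is affine, rather than positively homogeneous, in the increment of the argument:

```latex
% first-order approximation via an upper coexhauster \overline{E}(x)
f(x + \Delta) \;=\; f(x)
  \;+\; \min_{C \in \overline{E}(x)} \; \max_{[a,\, v] \in C}
        \bigl( a + \langle v, \Delta \rangle \bigr)
  \;+\; o(\|\Delta\|).
```

The lower coexhauster gives the symmetric max–min representation; working with affine rather than linear functions is what makes a continuous coexhauster mapping attainable for a suitable class of functions.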


Journal of Optimization Theory and Applications | 2013

Adjoint Coexhausters in Nonsmooth Analysis and Extremality Conditions

M. E. Abbasov; Vladimir F. Demyanov

In classical ("smooth") mathematical analysis, a differentiable function is studied by means of the derivative (the gradient in the multidimensional case). In the case of nondifferentiable functions, the tools of nonsmooth analysis are to be employed. In convex analysis and minimax theory, the corresponding classes of functions are investigated by means of the subdifferential (a convex set in the dual space), while quasidifferentiable functions are treated via the notion of a quasidifferential (a pair of sets). To study an arbitrary directionally differentiable function, the notions of upper and lower exhausters (each a family of convex sets) are used. It turns out that conditions for a minimum are described by an upper exhauster, while conditions for a maximum are stated in terms of a lower exhauster. This is why an upper exhauster is called a proper one for the minimization problem (and an adjoint exhauster for the maximization problem), while a lower exhauster is referred to as a proper one for the maximization problem (and an adjoint exhauster for the minimization problem).

The directional derivatives (and hence exhausters) provide first-order approximations of the increment of the function under study. These approximations are positively homogeneous as functions of direction. They allow one to formulate optimality conditions, to find steepest ascent and descent directions, and to construct numerical methods. However, if, for example, a maximizer of the function is to be found but one has an upper exhauster (which is not proper for the maximization problem), a lower exhauster is required. Instead, one can try to express conditions for a maximum in terms of the upper exhauster (which is an adjoint one for the maximization problem). The first to obtain such conditions was Roshchina; new optimality conditions in terms of adjoint exhausters were recently obtained by Abbasov.

The exhauster mappings are, in general, discontinuous in the Hausdorff metric, and therefore computational problems arise. To overcome these difficulties, the notions of upper and lower coexhausters are used. They provide first-order approximations of the increment of the function which are no longer positively homogeneous. These approximations also allow one to formulate optimality conditions, to find ascent and descent directions (though not the steepest ones), and to construct numerical methods possessing good convergence properties. Conditions for a minimum are described in terms of an upper coexhauster (which is therefore called a proper coexhauster for the minimization problem), while conditions for a maximum are described in terms of a lower coexhauster (which is called a proper one for the maximization problem). In the present paper, we derive optimality conditions in terms of adjoint coexhausters.


Vestnik St. Petersburg University: Mathematics | 2018

Reduction and Minimality of Coexhausters

M. E. Abbasov

V. F. Demyanov introduced exhausters for the study of nonsmooth functions. These are families of convex compact sets that enable one to represent the main part of the increment of the function under consideration, in a neighborhood of the studied point, as a MaxMin or MinMax of linear functions. Optimality conditions were described in terms of these objects, which provided a way to construct new algorithms for solving nondifferentiable optimization problems. Exhausters are not uniquely defined. Obviously, the smaller an exhauster, the lower the computational expense of working with it; thus the problem of reducing an available family arises. This problem was first considered by V. A. Roshchina, who proposed conditions for minimality and described some methods of reduction for the case when these conditions are not satisfied. However, it turned out that the exhauster mapping is not continuous in the Hausdorff metric, which leads to problems with the convergence of numerical methods. To overcome this difficulty, Demyanov proposed the notion of coexhausters. These objects enable one to represent the main part of the increment of the function under consideration, in a neighborhood of the studied point, as a MaxMin or MinMax of affine functions. One can define a class of functions with a continuous coexhauster mapping. Optimality conditions can be stated in terms of these objects too, but coexhausters are also not uniquely defined. The problem of reduction of coexhausters is considered in this paper for the first time. The definitions of minimality proposed by Roshchina are used. In contrast to the ideas proposed in Roshchina's works, the minimality conditions and the reduction technique developed in this paper have a clear and transparent geometric interpretation.


Vestnik St. Petersburg University: Mathematics | 2017

Charged ball method for solving some computational geometry problems

M. E. Abbasov

Replacing an initial stationary optimization problem with a nonstationary mechanical system that tends over time to an equilibrium position coinciding with the solution of the original problem makes it possible to construct effective numerical algorithms. First, the differential equations of motion are derived; one then passes to a difference scheme and defines the iteration algorithm. A wide class of optimization methods is constructed in this way, one of the best-known representatives being the heavy ball method. As a rule, algorithms of this type include parameters that strongly affect the convergence rate. In this paper, the charged ball method, belonging to this class, is proposed and investigated. It is a new effective optimization method that allows one to solve some computational geometry problems. The problem of the orthogonal projection of a point onto a closed convex set with a smooth boundary and the problem of finding the minimum distance between two such sets are considered in detail. Convergence theorems are proved, and estimates for the convergence rate are obtained. Numerical examples illustrating the work of the proposed algorithms are given.
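The pattern the abstract describes (continuous dynamics, then a difference scheme, then an iteration) can be sketched with the classical heavy ball iteration named above; this is a hypothetical toy illustration, not the charged ball method from the paper. Here the projection of a point onto the unit circle is computed by minimizing the squared distance over the parametrizing angle; the step parameters alpha and beta are illustrative choices.

```python
import math

def heavy_ball(grad, x0, alpha=0.1, beta=0.5, iters=500):
    """Heavy ball iteration: x_{k+1} = x_k - alpha*grad(x_k) + beta*(x_k - x_{k-1})."""
    x_prev = x = x0
    for _ in range(iters):
        x, x_prev = x - alpha * grad(x) + beta * (x - x_prev), x
    return x

p = (3.0, 4.0)  # point to project onto the unit circle
# g(t) = ||(cos t, sin t) - p||^2, so g'(t) = 2*(p_x*sin t - p_y*cos t)
grad = lambda t: 2.0 * (p[0] * math.sin(t) - p[1] * math.cos(t))
theta = heavy_ball(grad, x0=0.0)
proj = (math.cos(theta), math.sin(theta))  # approximately (0.6, 0.8)
```

The momentum term beta*(x_k - x_{k-1}) is the discrete trace of the ball's inertia; tuning alpha and beta is exactly the parameter sensitivity the abstract mentions.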


international conference stability and control processes | 2015

New optimization algorithm for finding distance between two convex sets

M. E. Abbasov

Nature has always inspired researchers in different fields, and physical analogies allow one to obtain new efficient algorithms for various optimization problems. The well-known and thoroughly studied heavy ball method is one representative of this type of algorithm [1]-[8]. Another algorithm of this type, developed for finding the minimum distance between two convex sets, is proposed in this work.
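As a point of comparison for the same problem, a sketch of the classical alternating projections scheme (not the algorithm proposed in this work) finds the minimum distance between two disjoint closed balls by repeatedly projecting onto each set in turn:

```python
import math

def project_onto_ball(x, center, radius):
    # Euclidean projection of x onto the closed ball B(center, radius)
    dx, dy = x[0] - center[0], x[1] - center[1]
    d = math.hypot(dx, dy)
    if d <= radius:
        return x
    s = radius / d
    return (center[0] + s * dx, center[1] + s * dy)

# Two disjoint disks: closest points are (1, 0) and (4, 0), distance 3.
A = ((0.0, 0.0), 1.0)
B = ((5.0, 0.0), 1.0)

a = (0.0, 1.0)  # start from an arbitrary point of the first disk
for _ in range(200):
    b = project_onto_ball(a, *B)  # nearest point of B to a
    a = project_onto_ball(b, *A)  # nearest point of A to b

distance = math.hypot(a[0] - b[0], a[1] - b[1])  # approximately 3.0
```

For closed convex sets, each projection step is nonexpansive, so the pair (a, b) converges to a pair of nearest points whenever the minimum distance is attained.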


Journal of Industrial and Management Optimization | 2014

Generalized exhausters: Existence, construction, optimality conditions

M. E. Abbasov


constructive nonsmooth analysis and related topics | 2017

Geometric conditions of reduction of coexhausters

M. E. Abbasov


Journal of Optimization Theory and Applications | 2017

Comparison Between Quasidifferentials and Exhausters

M. E. Abbasov


Journal of Optimization Theory and Applications | 2016

Second-Order Minimization Method for Nonsmooth Functions Allowing Convex Quadratic Approximations of the Augment

M. E. Abbasov

Collaboration



Top Co-Authors


Vladimir F. Demyanov

Saint Petersburg State University
