Dmitri E. Kvasov
University of Calabria
Publications
Featured research published by Dmitri E. Kvasov.
ACM Transactions on Mathematical Software | 2003
Marco Gaviano; Dmitri E. Kvasov; Daniela Lera; Yaroslav D. Sergeyev
A procedure for generating non-differentiable, continuously differentiable, and twice continuously differentiable classes of test functions for multiextremal multidimensional box-constrained global optimization is presented. Each test class consists of 100 functions. Test functions are generated by defining a convex quadratic function systematically distorted by polynomials in order to introduce local minima. To determine a class, the user defines the following parameters: (i) problem dimension, (ii) number of local minima, (iii) value of the global minimum, (iv) radius of the attraction region of the global minimizer, (v) distance from the global minimizer to the vertex of the quadratic function. Then, all other necessary parameters are generated randomly for all 100 functions of the class. Full information about each test function including locations and values of all local minima is supplied to the user. Partial derivatives are also generated where possible.
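The construction can be illustrated with a small self-contained sketch. The code below is a simplification in the spirit of the abstract, not the published generator: a convex quadratic is distorted by compactly supported polynomial "bumps", each of which can create a local minimum inside its attraction region while keeping the function continuously differentiable. All names, parameter ranges, and constants here are invented for illustration.

```python
import numpy as np

def make_test_function(dim=2, n_minima=5, seed=0):
    """Toy generator in the spirit of the paper: a convex quadratic
    distorted by C^1 polynomial 'bumps'.  This is an illustrative
    simplification, NOT the published generator."""
    rng = np.random.default_rng(seed)
    T = rng.uniform(-1, 1, dim)                     # vertex of the quadratic
    centers = rng.uniform(-1, 1, (n_minima, dim))   # candidate local minimizers
    radii   = rng.uniform(0.1, 0.3, n_minima)       # attraction-region radii
    depths  = rng.uniform(0.5, 2.0, n_minima)       # bump depths

    def f(x):
        x = np.asarray(x, dtype=float)
        val = np.sum((x - T) ** 2)                  # convex quadratic part
        for c, rho, A in zip(centers, radii, depths):
            r2 = np.sum((x - c) ** 2)
            if r2 < rho ** 2:
                # Polynomial distortion with zero value and zero gradient
                # on the sphere r = rho, so f stays C^1; a sufficiently
                # deep bump creates a local minimum at its centre.
                val -= A * (1.0 - r2 / rho ** 2) ** 2
        return val

    # full information about the distortions is returned to the user
    return f, {"vertex": T, "minima": centers, "radii": radii, "depths": depths}

f, info = make_test_function(dim=2, n_minima=10, seed=42)
print(f(info["minima"][0]), f(info["vertex"]))
```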
Journal of Global Optimization | 2014
Remigijus Paulavičius; Yaroslav D. Sergeyev; Dmitri E. Kvasov; Julius Žilinskas
DIRECT-type global optimization algorithms often spend an excessive number of function evaluations exploring suboptimal local minima on problems with many local optima, thereby delaying discovery of the global minimum. In this paper, a globally-biased simplicial partition algorithm, DISIMPL, for global optimization of expensive Lipschitz continuous functions with an unknown Lipschitz constant is proposed. A scheme for adaptively balancing local and global information during the search is introduced, implemented, experimentally investigated, and compared with the well-known DIRECT and DIRECT-l methods. Extensive numerical experiments executed on 800 multidimensional multiextremal test functions show a promising performance of the new acceleration technique with respect to its competitors.
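A minimal sketch of the simplicial machinery may help: the domain is covered by simplices, the most promising simplex is selected, and it is split through the midpoint of its longest edge. For brevity the selection below assumes a fixed Lipschitz constant L and a naive lower bound; the actual DISIMPL needs no such constant, and the paper's global/local biasing is omitted. The test function and all values are illustrative.

```python
import itertools
import numpy as np

def rastrigin(x):                                    # illustrative test function
    x = np.asarray(x, dtype=float)
    return 10 * len(x) + np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x))

def longest_edge_bisection(simplex):
    """Split a simplex (a list of vertices) through the midpoint of its
    longest edge -- the partition strategy of simplicial methods."""
    verts = [np.asarray(v, dtype=float) for v in simplex]
    i, j = max(itertools.combinations(range(len(verts)), 2),
               key=lambda e: np.linalg.norm(verts[e[0]] - verts[e[1]]))
    mid = 0.5 * (verts[i] + verts[j])
    return ([mid if k == i else v for k, v in enumerate(verts)],
            [mid if k == j else v for k, v in enumerate(verts)])

def simplicial_search(f, simplices, iters=200, L=30.0):
    """Skeleton of a simplicial partition loop using the crude Lipschitz
    lower bound  min_v f(v) - L * diam;  DISIMPL itself does NOT need L."""
    best = min(f(v) for s in simplices for v in s)
    for _ in range(iters):
        def lower_bound(s):
            diam = max(np.linalg.norm(np.subtract(u, w))
                       for u, w in itertools.combinations(s, 2))
            return min(f(v) for v in s) - L * diam
        idx = min(range(len(simplices)), key=lambda k: lower_bound(simplices[k]))
        for child in longest_edge_bisection(simplices.pop(idx)):
            simplices.append(child)
            best = min(best, min(f(v) for v in child))
    return best

# the unit square split into two triangles, rescaled to [-4, 4]^2
start = [[(0, 0), (1, 0), (1, 1)], [(0, 0), (0, 1), (1, 1)]]
print(simplicial_search(lambda x: rastrigin(np.asarray(x) * 8 - 4), start))
```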
Optimization Letters | 2009
Dmitri E. Kvasov; Yaroslav D. Sergeyev
In the paper, a global optimization problem is considered where the objective function f(x) is univariate and black-box, and its first derivative f′(x) satisfies the Lipschitz condition with an unknown Lipschitz constant K. In the literature, there exist methods solving this problem by using an a priori given estimate of K, its adaptive estimates, or adaptive estimates of local Lipschitz constants. Algorithms working with a number of Lipschitz constants for f′(x) chosen from a set of possible values are not known, despite the fact that DIRECT, a method working in this way with Lipschitz objective functions, was proposed in 1993. A new geometric method extending these ideas to objective functions with a Lipschitz first derivative is introduced and studied in this paper. Numerical experiments executed on a number of test functions show that the usage of derivatives allows one to obtain, as expected, an acceleration in comparison with the DIRECT algorithm.
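The geometric idea is easy to state in code. If f′ is Lipschitz with constant K, then two parabolas built from the trial data at an interval's endpoints are guaranteed minorants of f, and their pointwise maximum yields a lower bound over the interval. The sketch below assumes K is known and minimizes the auxiliary function numerically on a grid, purely for illustration; the method proposed in the paper works analytically and, crucially, with a set of possible values of K rather than a single known constant.

```python
import numpy as np

def quadratic_minorant_bound(f_a, df_a, f_b, df_b, a, b, K, grid=1001):
    """Lower bound for f over [a, b] when f' is Lipschitz with constant K.
    Both parabolas are guaranteed minorants (Taylor + Lipschitz f'):
        f(x) >= f(a) + f'(a)(x - a) - K(x - a)^2 / 2
        f(x) >= f(b) + f'(b)(x - b) - K(x - b)^2 / 2
    For clarity, the pointwise maximum of the two is minimised on a grid."""
    x = np.linspace(a, b, grid)
    phi_a = f_a + df_a * (x - a) - 0.5 * K * (x - a) ** 2
    phi_b = f_b + df_b * (x - b) - 0.5 * K * (x - b) ** 2
    return np.min(np.maximum(phi_a, phi_b))

# example: f(x) = sin(x), whose derivative cos(x) is Lipschitz with K = 1
a, b = 0.0, 3.0
print(quadratic_minorant_bound(np.sin(a), np.cos(a), np.sin(b), np.cos(b),
                               a, b, K=1.0))        # <= min of sin on [0, 3]
```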
Numerische Mathematik | 2003
Dmitri E. Kvasov; Clara Pizzuti; Yaroslav D. Sergeyev
In this paper, Lipschitz global optimization (GO) problems are considered where the multidimensional multiextremal objective function is defined over a hyperinterval. An efficient one-dimensional GO method using local tuning on the behavior of the objective function is generalized to the multidimensional case by the diagonal approach using two partition strategies. Global convergence conditions are established for the obtained diagonal geometric methods. Results of a wide numerical comparison show a strong acceleration reached by the new methods, which work with estimates of the local Lipschitz constants over different subregions of the search domain, in comparison with the traditional approach.
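The one-dimensional local tuning rule that the paper generalizes can be sketched as follows: each interval receives its own Lipschitz estimate blending local slope information with the global estimate, so flat regions get small constants and steep regions large ones. The structure follows the authors' local tuning technique, but the parameter names and values below are illustrative.

```python
import numpy as np

def local_lipschitz_estimates(x, z, r=1.8, xi=1e-8):
    """1-D 'local tuning' sketch: per-interval Lipschitz estimates for
    trial points x[0] < ... < x[n] with values z.  r > 1 is a reliability
    parameter and xi > 0 a small technical constant (both illustrative)."""
    x, z = np.asarray(x, float), np.asarray(z, float)
    slopes = np.abs(np.diff(z)) / np.diff(x)         # |z_i - z_{i-1}| / (x_i - x_{i-1})
    big_lambda = slopes.max()                        # global Lipschitz estimate
    x_max = np.diff(x).max()                         # largest interval length
    est = []
    for i in range(len(slopes)):
        neigh = slopes[max(0, i - 1): i + 2]         # interval i and its neighbours
        lam = neigh.max()                            # local information
        gam = big_lambda * (x[i + 1] - x[i]) / x_max # weight of global information
        est.append(r * max(lam, gam, xi))
    return est

xs = [0.0, 0.3, 1.1, 2.0, 3.0]
print(local_lipschitz_estimates(xs, np.sin(xs)))
```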
Advances in Engineering Software | 2015
Dmitri E. Kvasov; Yaroslav D. Sergeyev
In many important design problems, some decisions should be made by finding the global optimum of a multiextremal objective function subject to a set of constraints. Frequently, especially in engineering applications, the functions involved in the optimization process are black-box, with unknown analytical representations, and hard to evaluate. Such computationally challenging decision-making problems often cannot be solved by traditional optimization techniques based on strong assumptions about the problem (convexity, differentiability, etc.). Nature-inspired and evolutionary metaheuristics are also not always successful in finding global solutions to these problems due to their multiextremal character. In this paper, some innovative and powerful deterministic approaches developed by the authors to construct numerical methods for solving the mentioned problems are surveyed. Their efficiency is shown on both classes of randomly generated test problems and some practical engineering tasks.
Communications in Nonlinear Science and Numerical Simulation | 2015
Yaroslav D. Sergeyev; Dmitri E. Kvasov
In many practical decision-making problems, it happens that the functions involved in the optimization process are black-box, with unknown analytical representations, and hard to evaluate. In this paper, a global optimization problem is considered where both the goal function f(x) and its gradient f′(x) are black-box functions. It is supposed that f′(x) satisfies the Lipschitz condition over the search hyperinterval with an unknown Lipschitz constant K. A new deterministic "Divide-the-Best" algorithm based on efficient diagonal partitions and smooth auxiliary functions is proposed in its basic version; its convergence conditions are studied, and numerical experiments executed on eight hundred test functions are presented.
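The Divide-the-Best scheme itself is a compact generic loop: at every iteration, the subregion with the best characteristic is selected and replaced by its children. The sketch below instantiates it with toy one-dimensional intervals, a midpoint trial, and a Lipschitz-style characteristic with an assumed constant; in the paper, the subregions are hyperintervals partitioned diagonally and the characteristic comes from the smooth auxiliary minorants built with the black-box gradient.

```python
import math

def divide_the_best(characteristic, subdivide, initial, budget=100):
    """Generic 'Divide-the-Best' loop: repeatedly pick the subregion with
    the smallest characteristic and replace it by its children."""
    regions = list(initial)
    for _ in range(budget):
        idx = min(range(len(regions)), key=lambda k: characteristic(regions[k]))
        regions.extend(subdivide(regions.pop(idx)))
    return regions

# toy instantiation: 1-D intervals, midpoint trials, assumed constant L = 4
f = lambda x: math.sin(x) + math.sin(10 * x / 3)          # classic 1-D test function
char = lambda iv: f(0.5 * (iv[0] + iv[1])) - 2.0 * (iv[1] - iv[0])
split = lambda iv: [(iv[0], 0.5 * (iv[0] + iv[1])), (0.5 * (iv[0] + iv[1]), iv[1])]
final = divide_the_best(char, split, [(2.7, 7.5)], budget=60)
print(min(f(0.5 * (a + b)) for a, b in final))            # approx. -1.9 near x = 5.15
```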
Automation and Remote Control | 2013
Dmitri E. Kvasov; Yaroslav D. Sergeyev
Many control problems involve the search for the global extremum in the space of states or parameters of the system under study, which leads to the necessity of using effective methods of global finite-dimensional optimization. For this purpose, the geometric algorithms of Lipschitz global optimization developed by the authors can be used. A brief review of these algorithms is presented, and they are compared with some global search algorithms often used in engineering practice. Numerical experiments are performed on several known examples of applied multiextremal problems.
Optimization Letters | 2006
Yaroslav D. Sergeyev; Dmitri E. Kvasov; Falah M. H. Khalaf
Lipschitz one-dimensional constrained global optimization (GO) problems where both the objective function and constraints can be multiextremal and non-differentiable are considered in this paper. Problems where the constraints are verified in an a priori given order fixed by the nature of the problem are studied. Moreover, if a constraint is not satisfied at a point, then the remaining constraints and the objective function can be undefined at this point. The constrained problem is reduced to a discontinuous unconstrained problem by the index scheme, without introducing additional parameters or variables. A new geometric method using adaptive estimates of local Lipschitz constants is introduced. The estimates are calculated by using the recently proposed local tuning technique. Numerical experiments show quite satisfactory performance of the new method in comparison with the penalty approach and with a method using a priori given Lipschitz constants.
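The index scheme reduction is straightforward to sketch. Constraints are evaluated in their fixed order; at the first violated one, evaluation stops (the remaining constraints and the objective may be undefined there), and each trial is characterized by a pair (index, value). The toy constraints below are invented for illustration, and the comparison rule named in the comment is a simplified reading of the scheme.

```python
import math

def index_scheme(f, constraints):
    """Sketch of the index-scheme reduction.  A trial at x yields a pair
    (index, value); pairs are compared roughly lexicographically, i.e.
    feasibility progress first, then the function value itself."""
    m = len(constraints)
    def reduced(x):
        for nu, g in enumerate(constraints):
            v = g(x)
            if v > 0:                     # constraint nu violated at x:
                return (nu, v)            # stop; later g's and f are never evaluated
        return (m, f(x))                  # feasible point: index m, objective value
    return reduced

phi = index_scheme(lambda x: math.sin(x),
                   [lambda x: 0.5 - x,            # g1(x) <= 0  <=>  x >= 0.5
                    lambda x: math.cos(3 * x)])   # g2(x) <= 0
print(phi(0.2), phi(1.0))                # infeasible pair vs. feasible pair
```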
Journal of Optimization Theory and Applications | 2016
Yaroslav D. Sergeyev; Marat S. Mukhametzhanov; Dmitri E. Kvasov; Daniela Lera
Geometric and information frameworks for constructing global optimization algorithms are considered, and several new ideas to speed up the search are proposed. The accelerated global optimization methods automatically realize a local behavior in the promising subregions without stopping the global optimization procedure. Moreover, all the trials executed during the local phases are also used in the course of the global ones. The resulting geometric and information global optimization methods have a similar structure, and a smart mixture of new and traditional computational steps leads to 22 different global optimization algorithms. All of them are studied and numerically compared on three test sets including 120 benchmark functions and 4 applied problems.
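One possible reading of the acceleration idea, sketched as a toy interval method (this is speculative pseudocode made runnable, not the paper's exact rules): the global procedure never stops, but right after the record improves, a few iterations are spent on the subregion holding the best trial, and those local-phase trials remain in the common list so the global phase reuses them.

```python
from collections import namedtuple
import math

Iv = namedtuple("Iv", "a b value")                   # interval with midpoint trial

def accelerated_search(f, a, b, budget=200, local_burst=5):
    """Speculative sketch of local phases embedded in a global search."""
    regions = [Iv(a, b, f(0.5 * (a + b)))]
    best, pending_local = regions[0].value, 0
    for _ in range(budget):
        if pending_local > 0:                        # local phase: exploit the record
            pending_local -= 1
            k = min(range(len(regions)), key=lambda i: regions[i].value)
        else:                                        # global phase: explore
            k = min(range(len(regions)),
                    key=lambda i: regions[i].value - 2.0 * (regions[i].b - regions[i].a))
        r = regions.pop(k)
        m = 0.5 * (r.a + r.b)
        for lo, hi in ((r.a, m), (m, r.b)):
            child = Iv(lo, hi, f(0.5 * (lo + hi)))
            regions.append(child)                    # local trials stay in the list
            if child.value < best:                   # improvement: trigger local phase
                best, pending_local = child.value, local_burst
    return best

print(accelerated_search(lambda x: math.sin(x) + math.sin(10 * x / 3), 2.7, 7.5))
```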
Mathematics and Computers in Simulation | 2017
Yaroslav D. Sergeyev; Dmitri E. Kvasov; Marat S. Mukhametzhanov
Univariate continuous global optimization problems are considered in this paper. Several widely used multidimensional metaheuristic global optimization methods (genetic algorithm, differential evolution, particle swarm optimization, artificial bee colony algorithm, and firefly algorithm) are adapted to the univariate case and compared with three Lipschitz global optimization algorithms. For this purpose, a methodology is introduced that allows one to compare stochastic methods with deterministic ones by using operational characteristics, originally proposed for working with deterministic algorithms only. As a result, a visual comparison of methods of different nature on classes of randomly generated test functions becomes possible. A detailed description of the new comparison methodology, called "operational zones", is given, and results of wide numerical experiments with five metaheuristics and three Lipschitz algorithms are reported.
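The comparison methodology itself is simple to express. For a deterministic method, the operational characteristic is the fraction of test problems solved within a given budget of function evaluations; for a stochastic method, repeating the experiment yields a band between the lower and upper envelopes of these curves, which is the operational zone. The sketch below uses made-up evaluation counts purely to show the computation.

```python
import numpy as np

def operational_characteristic(evals_to_solve, budgets):
    """Fraction of test problems solved within each evaluation budget --
    the operational characteristic used for deterministic methods."""
    e = np.asarray(evals_to_solve, dtype=float)      # np.inf = never solved
    return [(e <= k).mean() for k in budgets]

def operational_zone(runs, budgets):
    """For a stochastic method, repeat the experiment several times and
    keep the lower/upper envelopes of the resulting characteristics:
    this band is the 'operational zone'."""
    curves = np.array([operational_characteristic(r, budgets) for r in runs])
    return curves.min(axis=0), curves.max(axis=0)

budgets = [100, 500, 1000, 5000]
direct_like = [120, 300, np.inf, 800, 450]           # evaluations per problem (made up)
ga_runs = [[200, 900, 4000, np.inf, 600],            # several independent GA runs
           [150, 1200, 3500, 4800, 700],
           [300, 800, np.inf, 5200, 650]]
print(operational_characteristic(direct_like, budgets))
print(operational_zone(ga_runs, budgets))
```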