Publication


Featured research published by Yaroslav D. Sergeyev.


ACM Transactions on Mathematical Software | 2003

Algorithm 829: Software for generation of classes of test functions with known local and global minima for global optimization

Marco Gaviano; Dmitri E. Kvasov; Daniela Lera; Yaroslav D. Sergeyev

A procedure for generating non-differentiable, continuously differentiable, and twice continuously differentiable classes of test functions for multiextremal multidimensional box-constrained global optimization is presented. Each test class consists of 100 functions. Test functions are generated by defining a convex quadratic function systematically distorted by polynomials in order to introduce local minima. To determine a class, the user defines the following parameters: (i) problem dimension, (ii) number of local minima, (iii) value of the global minimum, (iv) radius of the attraction region of the global minimizer, (v) distance from the global minimizer to the vertex of the quadratic function. Then, all other necessary parameters are generated randomly for all 100 functions of the class. Full information about each test function including locations and values of all local minima is supplied to the user. Partial derivatives are also generated where possible.
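The construction described above (a convex quadratic systematically distorted so that a deeper minimum appears at a chosen location with a chosen value) can be illustrated with a toy one-dimensional sketch. This is not the Algorithm 829 generator itself; the function name and the smooth blending weight are illustrative assumptions:

```python
def make_test_function(vertex=2.0, gmin_x=-1.0, gmin_f=-3.0, radius=0.5):
    """Toy 1D analogue of the generator's idea: start from a convex
    quadratic with a known vertex, then distort it inside a small
    attraction region so that a deeper (global) minimum appears at a
    chosen point with a chosen value."""
    def f(x):
        quad = (x - vertex) ** 2                 # base convex quadratic
        d = abs(x - gmin_x)
        if d < radius:                           # inside the attraction region
            w = (1.0 - (d / radius) ** 2) ** 2   # weight: 1 at gmin_x, 0 at the boundary
            return (1.0 - w) * quad + w * (gmin_f + (x - gmin_x) ** 2)
        return quad
    return f

f = make_test_function()
print(f(-1.0))  # prescribed global minimum value: -3.0
```

Because the distortion is confined to the attraction region, the location and value of every planted minimum stay known exactly, which is what makes such classes useful for benchmarking.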


Archive | 2013

Introduction to Global Optimization Exploiting Space-Filling Curves

Yaroslav D. Sergeyev; Roman G. Strongin; Daniela Lera

Introduction to Global Optimization Exploiting Space-Filling Curves provides an overview of classical and new results pertaining to the use of space-filling curves in global optimization. The authors examine a family of derivative-free numerical algorithms that apply space-filling curves to reduce the dimensionality of the global optimization problem, along with a number of unconventional ideas, such as adaptive strategies for estimating the Lipschitz constant and balancing global and local information to accelerate the search. Convergence conditions of the described algorithms are studied in depth, and theoretical considerations are illustrated through numerical examples. The work also contains code for implementing space-filling curves that can be used for constructing new global optimization algorithms. The basic ideas of the text can be applied to a number of problems, including problems with multiextremal and partially defined constraints, and non-redundant parallel computations can be organized. Professors, students, researchers, engineers, and other professionals in pure mathematics, nonlinear sciences studying fractals, operations research, management science, industrial and applied mathematics, computer science, engineering, economics, and the environmental sciences will find this title useful.
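The dimensionality-reduction idea can be sketched with a Hilbert curve, a close relative of the Peano curves used in the book: a single scalar parameter walks a curve that fills a 2-D grid, so a 2-D search becomes a 1-D scan. The conversion routine below is the standard bit-manipulation algorithm for Hilbert curves, not the book's own code, and the objective is an arbitrary illustrative function:

```python
def d2xy(order, d):
    """Map a distance d along a Hilbert curve of the given order to the
    (x, y) cell it visits in a 2**order x 2**order grid (standard algorithm)."""
    n = 1 << order
    x = y = 0
    t = d
    s = 1
    while s < n:
        rx = 1 & (t // 2)
        ry = 1 & (t ^ rx)
        if ry == 0:                       # rotate the quadrant if needed
            if rx == 1:
                x, y = s - 1 - x, s - 1 - y
            x, y = y, x
        x += s * rx
        y += s * ry
        t //= 4
        s *= 2
    return x, y

# 2-D minimization reduced to a 1-D scan along the curve
def f2(x, y):
    return (x - 0.3) ** 2 + (y - 0.7) ** 2

order = 7
n = 1 << order
best_d = min(range(n * n),
             key=lambda d: f2(*(c / (n - 1) for c in d2xy(order, d))))
print(d2xy(order, best_d))  # grid cell nearest to the minimizer (0.3, 0.7)
```

The brute-force scan is only for illustration; the point of the book's algorithms is that a 1-D Lipschitz method applied along the curve needs far fewer evaluations than such a scan.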


SIAM Journal on Optimization | 1995

An Information Global Optimization Algorithm with Local Tuning

Yaroslav D. Sergeyev

We propose an algorithm using only the values of the objective function for solving unconstrained global optimization problems. The algorithm belongs to the class of information methods introduced by Strongin [Numerical Methods in Multiextremal Problems, Nauka, Moscow, 1978] and differs from the other algorithms of this class by the presence of local tuning, which tracks the changes of the Lipschitz constant of the objective function over different sectors of the search region. We describe two versions of the method: one for solving one-dimensional problems and one for solving multidimensional problems (using Peano-type space-filling curves for reduction of dimensionality). In both cases we establish sufficient conditions of convergence to the global minimum. We also report results of some numerical experiments.
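The local-tuning idea can be sketched with a simplified geometric (Piyavskii-type) method rather than the paper's information criterion: each subinterval gets its own Lipschitz estimate that mixes the slopes observed in neighboring subintervals with the global estimate, so flat sectors of the search region are not oversampled with a single worst-case constant. The parameter names and the reliability factor r are illustrative assumptions:

```python
import math

def minimize_lt(f, a, b, iters=60, r=1.8, xi=1e-8):
    """Simplified geometric method with local tuning (a sketch, not the
    paper's information algorithm): the Lipschitz estimate for each
    subinterval blends local slope information with the global estimate."""
    xs, zs = [a, b], [f(a), f(b)]
    for _ in range(iters):
        slopes = [abs(zs[j + 1] - zs[j]) / (xs[j + 1] - xs[j])
                  for j in range(len(xs) - 1)]
        H = max(max(slopes), xi)                       # global estimate
        dmax = max(xs[j + 1] - xs[j] for j in range(len(xs) - 1))

        def mu(j):                                     # local-tuning estimate
            lam = max(slopes[max(j - 1, 0):j + 2])     # slopes of j and its neighbors
            return r * max(lam, H * (xs[j + 1] - xs[j]) / dmax, xi)

        def lower_bound(j):                            # characteristic of subinterval j
            return (zs[j] + zs[j + 1]) / 2 - mu(j) * (xs[j + 1] - xs[j]) / 2

        j = min(range(len(xs) - 1), key=lower_bound)
        # Piyavskii peak point of the chosen subinterval, kept strictly inside
        x_new = 0.5 * (xs[j] + xs[j + 1]) - (zs[j + 1] - zs[j]) / (2 * mu(j))
        x_new = min(max(x_new, xs[j] + 1e-12), xs[j + 1] - 1e-12)
        xs.insert(j + 1, x_new)
        zs.insert(j + 1, f(x_new))
    i = min(range(len(zs)), key=zs.__getitem__)
    return xs[i], zs[i]

x_best, z_best = minimize_lt(math.sin, 0.0, 2 * math.pi)
print(x_best, z_best)  # near (3*pi/2, -1)
```

The balance between the local slope term and the width-weighted global term is what lets the method refine quickly near promising minima while retaining global convergence.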


Journal of Global Optimization | 2014

Globally-biased DISIMPL algorithm for expensive global optimization

Remigijus Paulavičius; Yaroslav D. Sergeyev; Dmitri E. Kvasov; Julius Žilinskas

DIRECT-type global optimization algorithms often spend an excessive number of function evaluations on problems with many local optima, exploring suboptimal local minima and thereby delaying discovery of the global minimum. In this paper, a globally-biased simplicial partition DISIMPL algorithm for global optimization of expensive Lipschitz continuous functions with an unknown Lipschitz constant is proposed. A scheme for adaptive balancing of local and global information during the search is introduced, implemented, experimentally investigated, and compared with the well-known DIRECT and DIRECTl methods. Extensive numerical experiments on 800 multidimensional multiextremal test functions show promising performance of the new acceleration technique with respect to its competitors.
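The adaptive local/global balancing can be caricatured as a two-phase selection rule: mostly pick the subregion with the best (lowest) lower bound, but periodically force the largest subregion so undersampled parts of the domain are revisited. This is only a toy of the global-bias idea; the actual DISIMPL phase-switching rule is more elaborate, and the function and parameter names here are assumptions:

```python
def select_region(regions, step, local_phase=4):
    """regions: list of (lower_bound, size) pairs. Toy globally-biased rule:
    during the local phase choose the region with the best (lowest) lower
    bound; every (local_phase + 1)-th step switch to the largest region."""
    if step % (local_phase + 1) == local_phase:
        return max(range(len(regions)), key=lambda j: regions[j][1])  # global step
    return min(range(len(regions)), key=lambda j: regions[j][0])      # local step

regions = [(-1.0, 0.1), (-0.2, 2.0), (-0.5, 0.5)]
print([select_region(regions, k) for k in range(6)])  # [0, 0, 0, 0, 1, 0]
```

The forced global steps are what prevent the search from stalling in a suboptimal basin whose local lower bound looks deceptively good.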


Mathematical Programming | 1998

Global one-dimensional optimization using smooth auxiliary functions

Yaroslav D. Sergeyev

In this paper, new global optimization algorithms are proposed for solving problems where the objective function is univariate and has Lipschitzian first derivatives. To solve this problem, smooth auxiliary functions, which are adaptively improved during the course of the search, are constructed. Three new algorithms are introduced: the first uses the exact, a priori known Lipschitz constant for the derivatives; the second, for when this constant is unknown, estimates it during the course of the search; and the last method uses neither the exact global Lipschitz constant nor an estimate of it but instead adaptively estimates local Lipschitz constants in different sectors of the search region during the course of optimization. Convergence conditions of the methods are investigated from a general viewpoint, and some numerical results are given.
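The smooth auxiliary functions rest on a standard bound: if f' is Lipschitz with constant M, then f(x) >= f(x_i) + f'(x_i)(x - x_i) - (M/2)(x - x_i)^2, a smooth quadratic minorant built at each trial point x_i. Below is a minimal numerical check of that bound for sin, whose derivative cos is 1-Lipschitz; the function name is illustrative:

```python
import math

def minorant(x, xi, fxi, dfxi, M):
    """Smooth quadratic support function lying below f when f' is M-Lipschitz."""
    return fxi + dfxi * (x - xi) - 0.5 * M * (x - xi) ** 2

# f = sin has derivative cos, which is Lipschitz with M = 1
xi = 1.0
grid = [xi + k * 0.01 for k in range(-300, 301)]
assert all(math.sin(x) >= minorant(x, xi, math.sin(xi), math.cos(xi), 1.0) - 1e-12
           for x in grid)
print("quadratic minorant stays below sin on [-2, 4]")
```

The pointwise maximum of these quadratics over all trial points yields a smooth underestimator of f, whose minimum suggests the next trial point; adaptively shrinking M in different sectors is what the third algorithm in the paper exploits.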


SIAM Journal on Optimization | 2013

Acceleration of Univariate Global Optimization Algorithms Working with Lipschitz Functions and Lipschitz First Derivatives

Daniela Lera; Yaroslav D. Sergeyev

This paper deals with two kinds of the one-dimensional global optimization problem over a closed finite interval: (i) the objective function f(x) satisfies the Lipschitz condition with a constant L; (ii) the first derivative of f(x) …


Optimization Letters | 2009

A univariate global search working with a set of Lipschitz constants for the first derivative

Dmitri E. Kvasov; Yaroslav D. Sergeyev


Journal of Global Optimization | 1997

Parallel Characteristical Algorithms for Solving Problems of Global Optimization

Vladimir A. Grishagin; Yaroslav D. Sergeyev; Roman G. Strongin


Optimization Letters | 2011

Higher order numerical differentiation on the Infinity Computer

Yaroslav D. Sergeyev


Parallel Computing | 1992

Global multidimensional optimization on parallel computer

Roman G. Strongin; Yaroslav D. Sergeyev

Collaboration


Dive into Yaroslav D. Sergeyev's collaborations.

Top Co-Authors


Clara Pizzuti


Domenico Famularo
