Marco Gaviano
University of Cagliari
Publications
Featured research published by Marco Gaviano.
ACM Transactions on Mathematical Software | 2003
Marco Gaviano; Dmitri E. Kvasov; Daniela Lera; Yaroslav D. Sergeyev
A procedure for generating non-differentiable, continuously differentiable, and twice continuously differentiable classes of test functions for multiextremal multidimensional box-constrained global optimization is presented. Each test class consists of 100 functions. Test functions are generated by defining a convex quadratic function systematically distorted by polynomials in order to introduce local minima. To determine a class, the user defines the following parameters: (i) problem dimension, (ii) number of local minima, (iii) value of the global minimum, (iv) radius of the attraction region of the global minimizer, (v) distance from the global minimizer to the vertex of the quadratic function. Then, all other necessary parameters are generated randomly for all 100 functions of the class. Full information about each test function including locations and values of all local minima is supplied to the user. Partial derivatives are also generated where possible.
Journal of Global Optimization | 1998
Marco Gaviano; Daniela Lera
Functions whose local minima and the sizes of their ‘regions of attraction’ are known a priori are often needed for testing the performance of algorithms that solve global optimization problems. In this paper we investigate a technique for constructing test functions for global optimization problems for which we fix a priori: (i) the problem dimension, (ii) the number of local minima, (iii) the local minima points, (iv) the function values of the local minima. Further, the size of the region of attraction of each local minimum may be made large or small. The technique consists of first constructing a convex quadratic function and then systematically distorting selected parts of this function so as to introduce local minima.
Journal of Global Optimization | 2010
Marco Gaviano; Daniela Lera; A. M. Steri
A large number of algorithms introduced in the literature to find the global minimum of a real function rely on iterative executions of searches for a local minimum. Multistart, tunneling, and some versions of simulated annealing are well-known procedures of this kind. A crucial point of these algorithms is deciding whether or not to perform a new local search. In this paper we look for the optimal probability value to be set at each iteration so that, in moving from a local minimum to a new one, the average number of function evaluations evals is minimal. We find that this probability has to be 0 or 1, depending on the number of function evaluations required by the local search and on the size of the level set at the current point. An implementation based on the above result is introduced. The values required to calculate evals are estimated from the history of the algorithm at running time. The algorithm has been tested both on sample problems constructed by the GKLS package and on problems often used in the literature. The outcome is compared with recent results.
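A toy multistart loop illustrating the all-or-nothing restart decision (the cost model and the estimators below are our simplifications, not the paper's rule):

```python
import math
import random

def multistart(f, dim, bounds, n_iters=200, seed=1):
    """Toy multistart with an all-or-nothing restart rule: at each
    iteration we either run a full local search from a sampled point
    (probability 1) or keep only the sample itself (probability 0),
    choosing whichever is expected to cost fewer function evaluations.
    The cost model is a deliberate simplification of the paper's rule."""
    rng = random.Random(seed)
    evals = 0
    best = math.inf

    def local_search(x0):
        # Crude coordinate descent standing in for a generic local solver
        nonlocal evals
        x = list(x0)
        fx = f(x); evals += 1
        step = 0.1
        while step > 1e-3:
            improved = False
            for i in range(dim):
                for d in (step, -step):
                    y = list(x); y[i] += d
                    fy = f(y); evals += 1
                    if fy < fx:
                        x, fx, improved = y, fy, True
            if not improved:
                step /= 2
        return x, fx

    search_costs = []     # evals spent by each past local search
    hits, samples = 1, 2  # running estimate of P(f(sample) < best)

    for _ in range(n_iters):
        x = [rng.uniform(lo, hi) for lo, hi in bounds]
        fx = f(x); evals += 1
        samples += 1
        if fx < best:
            hits += 1
        # Search iff its mean historical cost is below the expected
        # number of raw samples needed to improve on `best`.
        mean_cost = (sum(search_costs) / len(search_costs)
                     if search_costs else 0.0)
        if fx < best or mean_cost < samples / hits:
            before = evals
            x, fx = local_search(x)
            search_costs.append(evals - before)
        if fx < best:
            best = fx
    return best, evals
```

Both the per-search cost and the improvement probability are estimated from the run's own history, mirroring (in a much cruder way) the paper's idea of estimating the quantities needed to evaluate evals at running time.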
Optimization Methods & Software | 2002
Marco Gaviano; Daniela Lera
In this paper, a complexity analysis is performed for an algorithm which solves the global optimization problem (P): min f(x), x ∈ S, where f: S → R and S ⊂ R^n. The algorithm yields local minimizations of (P) starting from points chosen at random in S, and then the global minimum is selected among the local ones found. The local search algorithm is based on the steepest-descent method combined either with an exact line search or with the Armijo procedure. In the first case it is found that the number of line searches required to solve (P) within a fixed accuracy depends linearly on the problem dimension; in the second case a similar result involving the number of function evaluations is established.
Numerical Algorithms | 2012
Marco Gaviano; Daniela Lera
Journal of Optimization Theory and Applications | 1998
Marco Gaviano; Daniela Lera
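The steepest-descent/Armijo local search mentioned in the abstract above can be sketched generically (a standard textbook version; the parameter names sigma and beta are ours, not from the paper):

```python
import numpy as np

def armijo_steepest_descent(f, grad, x0, alpha0=1.0, beta=0.5, sigma=1e-4,
                            tol=1e-6, max_iters=10_000):
    """Steepest descent with Armijo backtracking: accept step alpha when
    f(x - alpha*g) <= f(x) - sigma * alpha * ||g||^2, else shrink alpha."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iters):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        fx = f(x)
        alpha = alpha0
        # Backtrack until the Armijo sufficient-decrease condition holds
        while f(x - alpha * g) > fx - sigma * alpha * np.dot(g, g):
            alpha *= beta
        x = x - alpha * g
    return x
```

Wrapped in a loop over starting points chosen at random in S, keeping the smallest value found, this yields the kind of multistart scheme the complexity analysis refers to.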
Optimization Methods & Software | 2005
Marco Gaviano; Daniela Lera
In the framework of multistart and local search algorithms that find the global minimum of a real function f(x), x ∈ S ⊆ R^n, Gaviano et al. proposed a rule for deciding, as soon as a local minimum has been found, whether or not to perform a new local minimization. That rule was designed to minimize the average local computational cost eval_1(·).
Optimization Methods & Software | 1995
Marco Gaviano; Huang Zhijian
WSEAS Transactions on Information Science and Applications | 2009
Cecilia Di Ruberto; Marco Gaviano; Andrea Morgera
Applied Mathematics - A Journal of Chinese Universities, Series B | 2012
Marco Gaviano; Daniela Lera; Elisabetta Mereu