Publication


Featured research published by Marco Gaviano.


ACM Transactions on Mathematical Software | 2003

Algorithm 829: Software for generation of classes of test functions with known local and global minima for global optimization

Marco Gaviano; Dmitri E. Kvasov; Daniela Lera; Yaroslav D. Sergeyev

A procedure for generating non-differentiable, continuously differentiable, and twice continuously differentiable classes of test functions for multiextremal multidimensional box-constrained global optimization is presented. Each test class consists of 100 functions. Test functions are generated by defining a convex quadratic function systematically distorted by polynomials in order to introduce local minima. To determine a class, the user defines the following parameters: (i) problem dimension, (ii) number of local minima, (iii) value of the global minimum, (iv) radius of the attraction region of the global minimizer, (v) distance from the global minimizer to the vertex of the quadratic function. Then, all other necessary parameters are generated randomly for all 100 functions of the class. Full information about each test function including locations and values of all local minima is supplied to the user. Partial derivatives are also generated where possible.
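The construction can be illustrated with a minimal one-dimensional sketch (hypothetical code, not the actual Algorithm 829 routines: the real generator works in n dimensions, places many local minima, and also produces the differentiable classes). A convex quadratic is left unchanged outside a ball of radius rho around a chosen point t, and replaced inside the ball by a distortion that makes t a minimizer with prescribed value f_star.

```python
# Hedged 1-D illustration of the construction described above; the real
# Algorithm 829 generator works in n dimensions, distorts the quadratic
# at many points, and guarantees the stated differentiability classes.
def make_test_function(t, f_star, rho, vertex=0.0):
    def quad(x):
        return (x - vertex) ** 2          # undistorted convex quadratic

    def f(x):
        r = abs(x - t)
        if r >= rho:
            return quad(x)                # unchanged outside the attraction region
        # Boundary point on the same side as x; the distortion interpolates
        # between f_star at t and the quadratic's value on the boundary, so
        # f is continuous and t is a local minimizer with known value f_star.
        xb = t + rho * (1.0 if x >= t else -1.0)
        return f_star + (quad(xb) - f_star) * (r / rho) ** 2

    return f
```

Here the minimizer location t, its value f_star, and the attraction radius rho play the role of the user-chosen parameters (iii)-(v) above; knowing the minima exactly is what makes such functions usable as benchmarks.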


Journal of Global Optimization | 1998

Test Functions with Variable Attraction Regions for Global Optimization Problems

Marco Gaviano; Daniela Lera

Functions whose local minima, and the sizes of their 'regions of attraction', are known a priori are often needed for testing the performance of algorithms that solve global optimization problems. In this paper we investigate a technique for constructing test functions for global optimization problems for which we fix a priori: (i) the problem dimension, (ii) the number of local minima, (iii) the local minima points, (iv) the function values of the local minima. Further, the size of the region of attraction of each local minimum may be made large or small. The technique consists of first constructing a convex quadratic function and then systematically distorting selected parts of this function so as to introduce local minima.


Journal of Global Optimization | 2010

A local search method for continuous global optimization

Marco Gaviano; Daniela Lera; A. M. Steri

A large number of algorithms introduced in the literature to find the global minimum of a real function rely on repeated executions of local minimum searches. Multistart, tunneling, and some versions of simulated annealing are well-known examples. A crucial point of these algorithms is to decide whether or not to perform a new local search. In this paper we look for the optimal probability value to be set at each iteration so that, in moving from a local minimum to a new one, the average number of function evaluations, evals, is minimal. We find that this probability has to be 0 or 1, depending on the number of function evaluations required by the local search and on the size of the level set at the current point. An implementation based on the above result is introduced. The values required to calculate evals are estimated from the history of the algorithm at run time. The algorithm has been tested both on sample problems constructed by the GKLS package and on problems often used in the literature, and the outcomes are compared with recent results.
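The 0-or-1 decision rule can be sketched in a multistart loop as follows (an illustrative reconstruction, not the paper's implementation: the estimators for the local-search cost and the level-set size are simple stand-ins estimated from the run's history).

```python
import random

def coord_descent(f, x, step=0.1, iters=25):
    """Tiny derivative-free local search used only for illustration;
    returns (minimizer, value, number of function evaluations)."""
    fx, used = f(x), 1
    for _ in range(iters):
        for cand in (x - step, x + step):
            fc = f(cand)
            used += 1
            if fc < fx:
                x, fx = cand, fc
        step *= 0.8
    return x, fx, used

def multistart(f, local_search, sample, budget=1500):
    best_x, best_f = None, float("inf")
    evals = 0
    ls_cost = 50.0        # running estimate of evaluations per local search
    hits, trials = 1, 1   # crude proxy for the size of the current level set
    while evals < budget:
        x = sample()
        fx = f(x)
        evals += 1
        trials += 1
        if fx < best_f:
            hits += 1
            # 0/1 rule: launch a local search (probability 1) only when it
            # is cheaper than the expected wait for random sampling to do
            # as well; otherwise skip it entirely (probability 0).
            if ls_cost < trials / hits:
                x_loc, f_loc, used = local_search(f, x)
                evals += used
                ls_cost = 0.5 * ls_cost + 0.5 * used  # update from history
                if f_loc < fx:
                    x, fx = x_loc, f_loc
            best_x, best_f = x, fx
    return best_x, best_f
```

The key point the sketch preserves is that the restart decision is deterministic at each iteration: the optimal probability is never strictly between 0 and 1.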


Optimization Methods & Software | 2002

A complexity analysis of local search algorithms in global optimization

Marco Gaviano; Daniela Lera

In this paper, a complexity analysis is performed for an algorithm which solves the global optimization problem (P): min f(x), x ∈ S, with f: S → R, S ⊂ R^n. The algorithm yields local minimizations of (P) starting from points chosen at random in S, and then the global minimum is selected among the local ones found. The local search algorithm is based on the steepest-descent method combined either with an exact line search or with the Armijo procedure. In the first case it is found that the number of line searches required to solve (P) within a fixed accuracy depends linearly on the problem dimension; in the second case a similar result involving the number of function evaluations is established.


Numerical Algorithms | 2012

Properties and numerical testing of a parallel global optimization algorithm

Marco Gaviano; Daniela Lera


Journal of Optimization Theory and Applications | 1998

On linear convergence of gradient-type minimization algorithms

Marco Gaviano; Daniela Lera
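A generic sketch of the local method analysed above, steepest descent with Armijo backtracking, is given below (textbook form with assumed constants sigma and beta; not the exact procedure from the paper).

```python
def armijo_steepest_descent(f, grad, x, sigma=1e-4, beta=0.5, tol=1e-8,
                            max_iter=500):
    """Steepest descent; each step backtracks alpha until the Armijo
    sufficient-decrease condition
        f(x - alpha * g) <= f(x) - sigma * alpha * ||g||^2
    holds."""
    for _ in range(max_iter):
        g = grad(x)
        gg = sum(gi * gi for gi in g)          # ||g||^2
        if gg ** 0.5 < tol:
            break                              # stationary point reached
        fx, alpha = f(x), 1.0
        step = lambda a: [xi - a * gi for xi, gi in zip(x, g)]
        while f(step(alpha)) > fx - sigma * alpha * gg:
            alpha *= beta                      # backtrack
        x = step(alpha)
    return x
```

With an exact line search the backtracking loop would be replaced by a one-dimensional minimization along -g; the complexity results quoted above count line searches in that case and function evaluations in the Armijo case.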


Optimization Methods & Software | 2005

Complexity of general continuous minimization problems: a survey

Marco Gaviano; Daniela Lera

In the framework of multistart and local search algorithms that find the global minimum of a real function f(x), x ∈ S ⊆ R^n, Gaviano et al. proposed a rule for deciding, as soon as a local minimum has been found, whether or not to perform a new local minimization. That rule was designed to minimize the average local computational cost eval_1(·).


Optimization Methods & Software | 1995

On the convergence behavior of the truncated GMRES method for nonlinear equations

Marco Gaviano; Huang Zhijian


WSEAS Transactions on Information Science and Applications | 2009

Shape matching by curve modelling and alignment

Cecilia Di Ruberto; Marco Gaviano; Andrea Morgera


Applied Mathematics - A Journal of Chinese Universities, Series B | 2012

A Parallel Algorithm for Global Optimization Problems in a Distributed Computing Environment

Marco Gaviano; Daniela Lera; Elisabetta Mereu

Collaboration


Dive into Marco Gaviano's collaborations.

Top Co-Authors


A. M. Steri

University of Cagliari
