
Publication


Featured research published by Maria Grazia Gasparo.


Optimization Methods & Software | 2000

A nonmonotone hybrid method for nonlinear systems

Maria Grazia Gasparo

A nonmonotone hybrid method for solving nonlinear systems is proposed. The idea consists in matching a Newton-like method, stabilized by a nonmonotone line search, with a cheap direct search method that monotonically reduces the merit function. The aim is to enlarge the convergence domain of the Newton-like method without losing its local convergence rate or introducing too much additional computational work. The proposed method has been extensively tested on a number of problems and compared with well-assessed methods for nonlinear systems. The results suggest that the potential of the proposed approach is sometimes considerable.
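The nonmonotone line-search ingredient admits a compact sketch. The following toy Python fragment, in the Grippo-Lampariello-Lucidi style, accepts a step whenever the new merit value does not exceed the maximum over the last M iterates; the 1-D merit function and all names are illustrative and not taken from the paper.

```python
# Toy nonmonotone Armijo line search (Grippo-Lampariello-Lucidi style):
# the sufficient-decrease test compares the new merit value with the
# maximum over the last M iterates rather than the current value, so the
# merit function may increase temporarily. Names and the 1-D merit
# function are illustrative, not taken from the paper.

def nonmonotone_armijo(merit, x, d, slope, history, c=1e-4, tau=0.5,
                       max_halvings=30):
    """Backtrack until merit(x + a*d) <= max(history) + c*a*slope,
    where slope is the directional derivative of merit at x along d."""
    ref = max(history)                    # nonmonotone reference value
    alpha = 1.0
    for _ in range(max_halvings):
        if merit(x + alpha * d) <= ref + c * alpha * slope:
            break
        alpha *= tau                      # halve the step and retry
    return alpha

# Minimize merit(x) = x^2 with Newton direction d = -x.
merit = lambda z: z * z
x, M = 5.0, 5
history = [merit(x)]
for _ in range(10):
    d = -x                                # descent direction
    slope = 2.0 * x * d                   # merit'(x) * d
    alpha = nonmonotone_armijo(merit, x, d, slope, history[-M:])
    x += alpha * d
    history.append(merit(x))
```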


Applied Numerical Mathematics | 2003

Global Newton-type methods and semismooth reformulations for NCP

Sandra Pieraccini; Maria Grazia Gasparo; Aldo Pasquali

It is known that a nonlinear complementarity problem (NCP) can be reformulated as a semismooth system of nonlinear equations by using a so-called NCP-function. Global Newton-type methods for solving NCP via semismooth reformulation need to use a merit function, which is usually required to be continuously differentiable. In this paper we present a global Newton-type method which does not require differentiability of the merit function used in the line-search procedure. The method is used to numerically compare the effectiveness of two NCP-functions widely discussed in the literature, the minimum function and the Fischer-Burmeister function. The results on several examples provide new insight into the respective numerical advantages of the two functions.
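The two NCP-functions compared in the paper have simple closed forms, sketched below; the reformulation property stated in the comments is standard, and the function names are chosen here for illustration.

```python
import math

# The two NCP-functions compared in the paper, each satisfying
#     phi(a, b) = 0  <=>  a >= 0, b >= 0, a * b = 0,
# so that the NCP  x >= 0, F(x) >= 0, x . F(x) = 0  becomes the
# semismooth system  phi(x_i, F_i(x)) = 0  for every component i.
# Function names are chosen here for illustration.

def phi_min(a, b):
    """Minimum NCP-function."""
    return min(a, b)

def phi_fb(a, b):
    """Fischer-Burmeister NCP-function."""
    return math.sqrt(a * a + b * b) - a - b
```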


Journal of Optimization Theory and Applications | 2000

Inexact methods: forcing terms and conditioning

Maria Grazia Gasparo; Benedetta Morini

In this paper, we consider inexact Newton and Newton-like methods and provide new convergence conditions relating the forcing terms to the conditioning of the iteration matrices. These results can be exploited when inexact methods with iterative linear solvers are used. In this framework, preconditioning techniques can be used to improve the performance of iterative linear solvers and to avoid the need of excessively small forcing terms. Numerical experiments validating the theoretical results are discussed.
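The forcing-term mechanism can be sketched as follows. This minimal, self-contained Python example solves a toy 2-by-2 system by inexact Newton, truncating the inner linear solve (here a plain steepest-descent iteration, standing in for the preconditioned iterative solvers the paper has in mind) once the residual satisfies the forcing-term condition. Problem, names, and parameters are illustrative only.

```python
import math

# Toy inexact Newton iteration for F(x) = 0 with a forcing term eta:
# the inner linear solve of J(x) s = -F(x) is truncated as soon as
#     ||F(x) + J(x) s|| <= eta * ||F(x)||.
# The inner solver is plain steepest descent on ||J s - b||^2, a stand-in
# for the (preconditioned) iterative solvers the paper has in mind.

def F(x):
    return [x[0] ** 2 + x[1] ** 2 - 4.0, x[0] - x[1]]  # root: (sqrt2, sqrt2)

def J(x):
    return [[2.0 * x[0], 2.0 * x[1]], [1.0, -1.0]]

def matvec(A, v):
    return [A[0][0] * v[0] + A[0][1] * v[1],
            A[1][0] * v[0] + A[1][1] * v[1]]

def norm(v):
    return math.sqrt(v[0] * v[0] + v[1] * v[1])

def inexact_solve(A, b, eta):
    """Return s with ||A s - b|| <= eta * ||b|| (toy inner solver)."""
    At = [[A[0][0], A[1][0]], [A[0][1], A[1][1]]]       # transpose
    s = [0.0, 0.0]
    while True:
        As = matvec(A, s)
        r = [As[0] - b[0], As[1] - b[1]]                # residual
        if norm(r) <= eta * norm(b):
            return s
        g = matvec(At, r)                               # descent direction
        Ag = matvec(A, g)
        t = (Ag[0] * r[0] + Ag[1] * r[1]) / (Ag[0] ** 2 + Ag[1] ** 2)
        s = [s[0] - t * g[0], s[1] - t * g[1]]          # exact line search

def inexact_newton(x, eta=0.1, tol=1e-10, max_iter=50):
    for _ in range(max_iter):
        Fx = F(x)
        if norm(Fx) <= tol:
            break
        s = inexact_solve(J(x), [-Fx[0], -Fx[1]], eta)  # forcing-term solve
        x = [x[0] + s[0], x[1] + s[1]]
    return x

root = inexact_newton([2.0, 1.0])
```

With the fixed forcing term eta = 0.1 the outer iteration converges linearly; eta shrinking with the residual would recover superlinear convergence.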


Journal of Computational and Applied Mathematics | 1996

A switching-method for nonlinear systems

Stefania Bellavia; Maria Grazia Gasparo; Maria Macconi

A new iterative method is proposed for the solution of nonlinear systems. The method does not use explicit derivative information and at each iteration automatically selects one of two distinct iterative schemes: a direct search method or a damped approximate Newton method. For this reason, the method is referred to as the Switching Method. It is shown that the method is globally convergent with a quadratic convergence rate. Numerical results show the very good practical performance of the method.


Iberoamerican Congress on Pattern Recognition | 2010

A new algorithm for training SVMs using approximate minimal enclosing balls

Emanuele Frandi; Maria Grazia Gasparo; Stefano Lodi; Ricardo Ñanculef; Claudio Sartori

It has been shown that many kernel methods can be equivalently formulated as minimal enclosing ball (MEB) problems in a certain feature space. Exploiting this reduction, efficient algorithms to scale up Support Vector Machines (SVMs) and other kernel methods have been introduced under the name of Core Vector Machines (CVMs). In this paper, we study a new algorithm to train SVMs based on an instance of the Frank-Wolfe optimization method recently proposed to approximate the solution of the MEB problem. We show that, specialized to SVM training, this algorithm can scale better than CVMs at the price of a slightly lower accuracy.
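The MEB subproblem at the core of this line of work can be illustrated with the classical Badoiu-Clarkson update, an instance of the Frank-Wolfe method; the sketch below is a toy planar example, not the authors' SVM training code.

```python
import math

# Toy (1 + eps)-approximation of the minimal enclosing ball (MEB) of a
# planar point set via the classical Badoiu-Clarkson update, an instance
# of the Frank-Wolfe method: at iteration k the center moves toward the
# current farthest point with step length 1/(k + 2).

def approx_meb(points, iters=2000):
    cx, cy = points[0]                       # start at an arbitrary point
    for k in range(iters):
        fx, fy = max(points,
                     key=lambda p: (p[0] - cx) ** 2 + (p[1] - cy) ** 2)
        step = 1.0 / (k + 2)                 # Frank-Wolfe step size
        cx += step * (fx - cx)
        cy += step * (fy - cy)
    radius = max(math.hypot(px - cx, py - cy) for px, py in points)
    return (cx, cy), radius

# Corners of the unit square: the exact MEB is centered at (0.5, 0.5)
# with radius sqrt(2)/2.
center, radius = approx_meb([(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)])
```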


SIAM Journal on Optimization | 2009

Generating Set Search Methods for Piecewise Smooth Problems

C. Bogani; Maria Grazia Gasparo; Alessandra Papini

We consider a direct search approach for solving nonsmooth minimization problems where the objective function is locally Lipschitz continuous and piecewise continuously differentiable on a finite family of polyhedra. A generating set search method is proposed, named structured because the structure of the set of nondifferentiability points near the current iterate is exploited to define the search directions at each iteration. Some numerical results are presented to validate the approach.
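For orientation, a bare-bones generating set search (plain compass search, without the paper's structured direction sets) looks like the sketch below. The vanilla scheme can stall at points of nondifferentiability, which is precisely what the structured variant is designed to avoid; the test function and names are illustrative.

```python
# Bare-bones generating set search (compass search) on a piecewise smooth
# objective. At each iteration the four coordinate directions are polled;
# if no poll point improves, the step size is halved. The paper's
# "structured" variant, which adapts the direction set to the nearby
# nondifferentiability structure, is not reproduced here.

def compass_search(f, x, h=1.0, h_min=1e-10, max_iter=10000):
    fx = f(x)
    directions = [(1, 0), (-1, 0), (0, 1), (0, -1)]   # generating set for R^2
    for _ in range(max_iter):
        if h < h_min:
            break
        for dx, dy in directions:
            trial = (x[0] + h * dx, x[1] + h * dy)
            ft = f(trial)
            if ft < fx:                               # successful poll
                x, fx = trial, ft
                break
        else:
            h *= 0.5                                  # unsuccessful: shrink
    return x, fx

# Piecewise smooth test function with minimizer (1, -2).
f = lambda p: abs(p[0] - 1.0) + abs(p[1] + 2.0)
xmin, fmin = compass_search(f, (0.0, 0.0))
```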


International Conference on Computational Science | 2006

Path following by SVD

Luca Dieci; Maria Grazia Gasparo; Alessandra Papini

In this paper, we propose a path-following method for computing a curve of equilibria of a dynamical system, based upon the smooth Singular Value Decomposition (SVD) of the Jacobian matrix. Our method is capable of detecting fold points, and continuing past folds. It is also able to detect branch points and to switch branches at such points. Algorithmic details and examples are given.
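A generic predictor-corrector continuation loop, the skeleton on which such path-following methods build, can be sketched as follows; the SVD-based fold and branch detection that constitutes the paper's contribution is not reproduced, and the unit-circle example is illustrative only.

```python
import math

# Generic predictor-corrector path following on the curve F(x, y) = 0,
# here F = x^2 + y^2 - 1 (the unit circle). Predictor: step along the
# tangent (gradient rotated 90 degrees). Corrector: Gauss-Newton
# projection back onto the curve. This sketches the basic continuation
# loop only, not the paper's smooth-SVD machinery.

def Fc(x, y):
    return x * x + y * y - 1.0

def grad(x, y):
    return (2.0 * x, 2.0 * y)

def follow(x, y, h=0.05, steps=126):
    path = [(x, y)]
    for _ in range(steps):
        gx, gy = grad(x, y)
        tx, ty = -gy, gx                             # tangent direction
        tn = math.hypot(tx, ty)
        x, y = x + h * tx / tn, y + h * ty / tn      # predictor step
        for _ in range(20):                          # corrector iterations
            gx, gy = grad(x, y)
            lam = Fc(x, y) / (gx * gx + gy * gy)
            x, y = x - lam * gx, y - lam * gy
            if abs(Fc(x, y)) < 1e-12:
                break
        path.append((x, y))
    return path

# 126 steps of arclength 0.05 trace roughly one full circle.
path = follow(1.0, 0.0)
```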


International Journal of Pattern Recognition and Artificial Intelligence | 2013

Training Support Vector Machines Using Frank–Wolfe Optimization Methods

Emanuele Frandi; Ricardo Ñanculef; Maria Grazia Gasparo; Stefano Lodi; Claudio Sartori

Training a support vector machine (SVM) requires the solution of a quadratic programming (QP) problem whose computational cost becomes prohibitive for large-scale datasets. Traditional optimization methods cannot be directly applied in these cases, mainly due to memory restrictions. By adopting a slightly different objective function and under mild conditions on the kernel used within the model, efficient algorithms to train SVMs have been devised under the name of core vector machines (CVMs). This framework exploits the equivalence of the resulting learning problem with the task of building a minimal enclosing ball (MEB) in a feature space, where the data is implicitly embedded by a kernel function. In this paper, we improve on the CVM approach by proposing two novel methods to build SVMs based on the Frank–Wolfe algorithm, recently revisited as a fast method to approximate the solution of a MEB problem. In contrast to CVMs, our algorithms do not require computing the solutions of a sequence of increasingly complex QPs and are defined using only analytic optimization steps. Experiments on a large collection of datasets show that our methods scale better than CVMs in most cases, sometimes at the price of a slightly lower accuracy. Like CVMs, the proposed methods can be easily extended to machine learning problems other than binary classification. Moreover, effective classifiers are obtained even with kernels that do not satisfy the condition required by CVMs, so our methods can be used for a wider set of problems.


Optimization Methods & Software | 2002

An Infeasible Interior-Point Method with Nonmonotonic Complementarity Gaps

Maria Grazia Gasparo; Sandra Pieraccini; Alessandro Armellini

This article describes an infeasible interior-point (IP) method for solving monotone variational inequality problems with polyhedral constraints and, as a particular case, monotone nonlinear complementarity problems. The method determines a search direction by solving, possibly in an inexact way, the Newton equation for the central path. Then, a curvilinear search is used to meet classical centering conditions and an Armijo rule. The novelty with respect to classical IP methods consists in relaxing the requirement of monotonic decrease in the complementarity gaps. Global convergence results are proved and numerical experiments are presented. The experiments confirm the effectiveness of nonmonotonicity in a number of test problems.


Numerical Functional Analysis and Optimization | 2008

Some Properties of GMRES in Hilbert Spaces

Maria Grazia Gasparo; Alessandra Papini; Aldo Pasquali

Our purpose in this work is to explore the properties of GMRES in Hilbert spaces. We extend to the infinite dimensional context some main results that are known to hold in the finite dimensional case. A key assumption for these extensions is that the involved linear operator is an algebraic operator.

Collaboration


Dive into Maria Grazia Gasparo's collaborations.

Top Co-Authors

C. Bogani

University of Florence


Emanuele Frandi

Katholieke Universiteit Leuven


Luca Dieci

Georgia Institute of Technology
