Network


Latest external collaborations at the country level. Dive into the details by clicking on the dots.

Hotspot


Dive into the research topics where Enrico Gorgone is active.

Publication


Featured research published by Enrico Gorgone.


Mathematical Programming | 2014

Bundle methods for sum-functions with easy components: applications to multicommodity network design

Antonio Frangioni; Enrico Gorgone

We propose a version of the bundle scheme for convex nondifferentiable optimization suitable for the case of a sum-function where some of the components are “easy”, that is, they are Lagrangian functions of explicitly known compact convex programs. This corresponds to a stabilized partial Dantzig–Wolfe decomposition, where suitably modified representations of the “easy” convex subproblems are inserted in the master problem as an alternative to iteratively inner-approximating them by extreme points, thus providing the algorithm with exact information about a part of the dual objective function. The resulting master problems are potentially larger and less well-structured than the standard ones, ruling out the available specialized techniques and requiring general-purpose solvers for their solution. This strongly favors piecewise-linear stabilizing terms over the more usual quadratic ones, which in turn may slow the convergence of the algorithm, so the overall performance depends on appropriate tuning of all these aspects. Yet, very good computational results are obtained in at least one relevant application: the computation of tight lower bounds for Fixed-Charge Multicommodity Min-Cost Flow problems.
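The inner approximation by extreme points that the abstract contrasts with can be pictured, in its simplest cutting-plane form, as a lower model built from linearizations. A minimal sketch of such a lower model (the names and the toy function f(x) = |x| are illustrative, not from the paper's code):

```python
# Lower (cutting-plane) model: the pointwise maximum of linearizations
# f(x_i) + g_i * (x - x_i) collected at trial points x_i, where g_i is a
# subgradient of f at x_i. Bundle methods stabilize iterations over such models.

def cutting_plane_model(cuts, x):
    """Evaluate the lower model at x; cuts is a list of (f_xi, g_i, x_i)."""
    return max(f_xi + g_i * (x - x_i) for (f_xi, g_i, x_i) in cuts)

f = abs                                      # convex, nondifferentiable at 0
subgrad = lambda x: 1.0 if x >= 0 else -1.0  # a subgradient of |x|

cuts = [(f(x0), subgrad(x0), x0) for x0 in (-2.0, 0.5, 1.5)]

# the model never exceeds f: it supports f from below everywhere
assert all(cutting_plane_model(cuts, t / 10) <= f(t / 10) + 1e-12
           for t in range(-30, 31))
```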


Optimization Methods & Software | 2008

Non-smoothness in classification problems

Annabella Astorino; Antonio Fuduli; Enrico Gorgone

We review the role played by non-smooth optimization techniques in many recent applications in the classification area. Starting from the classical concept of linear separability in binary classification, we recall the more general concepts of polyhedral, ellipsoidal and max–min separability. Finally, we focus our attention on the support vector machine (SVM) approach and on the more recent transductive SVM technique.
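Linear separability, the starting point recalled in the abstract, means two labelled point sets admit a hyperplane w·x + b with y_i (w·x_i + b) > 0 for every sample. A hedged toy sketch using the classical perceptron rule (data and parameters are illustrative):

```python
def perceptron(samples, epochs=100, lr=1.0):
    """Find (w, b) with y * (w.x + b) > 0 for all samples, if separable."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        mistakes = 0
        for (x1, x2), y in samples:
            if y * (w[0] * x1 + w[1] * x2 + b) <= 0:
                w[0] += lr * y * x1          # classical perceptron update
                w[1] += lr * y * x2
                b += lr * y
                mistakes += 1
        if mistakes == 0:                    # separating hyperplane found
            break
    return w, b

data = [((2, 2), +1), ((3, 1), +1), ((-1, -2), -1), ((-2, -1), -1)]
w, b = perceptron(data)
assert all(y * (w[0] * x1 + w[1] * x2 + b) > 0 for (x1, x2), y in data)
```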


SIAM Journal on Optimization | 2011

Piecewise-quadratic Approximations in Convex Numerical Optimization

Annabella Astorino; Antonio Frangioni; Manlio Gaudioso; Enrico Gorgone

We present a bundle method for convex nondifferentiable minimization where the model is a piecewise-quadratic convex approximation of the objective function. Unlike standard bundle approaches, the model only needs to support the objective function from below at a properly chosen (small) subset of points, as opposed to everywhere. We provide the convergence analysis for the algorithm, with a general form of master problem which combines features of trust region stabilization and proximal stabilization, taking care of all the important practical aspects such as proper handling of the proximity parameters and the bundle of information. Numerical results are also reported.
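A piecewise-quadratic model of the kind the abstract describes is the pointwise maximum of quadratic cuts f(x_i) + g_i (x − x_i) + (ε/2)(x − x_i)². The toy below chooses ε equal to the true curvature so the cuts are exact; the function, points and ε are illustrative, not the paper's setup:

```python
def pq_model(cuts, x):
    """Pointwise maximum of quadratic cuts (f_xi, g_i, x_i, eps)."""
    return max(f_xi + g_i * (x - x_i) + 0.5 * eps * (x - x_i) ** 2
               for (f_xi, g_i, x_i, eps) in cuts)

f = lambda x: x * x                          # toy objective
cuts = [(f(x0), 2.0 * x0, x0, 2.0) for x0 in (-1.0, 0.5)]

# with eps equal to the true curvature each cut reproduces f exactly,
# so the model matches f; a smaller eps would give a strict lower model
assert all(abs(pq_model(cuts, t / 10) - f(t / 10)) < 1e-12
           for t in range(-20, 21))
```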


Numerische Mathematik | 2009

Piecewise linear approximations in nonconvex nonsmooth optimization

Manlio Gaudioso; Enrico Gorgone; Maria Flavia Monaco

We present a bundle-type method for minimizing nonconvex nondifferentiable functions of several variables. The algorithm is based on the construction of both a lower and an upper polyhedral approximation of the objective function. In particular, at each iteration, a search direction is computed by solving a quadratic program aiming at maximizing the difference between the lower and the upper model. A proximal approach is used to guarantee convergence to a stationary point under the hypothesis of weak semismoothness.
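The lower polyhedral model is the usual maximum of linearizations; its mirror image, the upper model, is their pointwise minimum, which majorizes a concave function. A hedged sketch of the upper model alone (toy function and points are illustrative):

```python
def upper_model(cuts, x):
    """Pointwise minimum of linearizations (f_xi, g_i, x_i)."""
    return min(f_xi + g_i * (x - x_i) for (f_xi, g_i, x_i) in cuts)

f = lambda x: -x * x                         # concave toy function
df = lambda x: -2.0 * x
cuts = [(f(x0), df(x0), x0) for x0 in (-1.0, 0.0, 1.0)]

# for a concave function the linearizations lie above the graph,
# so their minimum is a valid upper polyhedral approximation
assert all(upper_model(cuts, t / 10) >= f(t / 10) - 1e-12
           for t in range(-20, 21))
```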


SIAM Journal on Optimization | 2013

A Nonmonotone Proximal Bundle Method with (Potentially) Continuous Step Decisions

Annabella Astorino; Antonio Frangioni; Antonio Fuduli; Enrico Gorgone

We present a convex nondifferentiable minimization algorithm of proximal bundle type that does not rely on measuring descent of the objective function to declare the so-called serious steps; rather, a merit function is defined which is decreased at each iteration, leading to a (potentially) continuous choice of the stepsize between zero (the null step) and one (the serious step). By avoiding the discrete choice the convergence analysis is simplified, and we can more easily obtain efficiency estimates for the method. Some choices for the step selection actually reproduce the dichotomic behavior of standard proximal bundle methods but shed new light on the rationale behind the process, and ultimately with different rules; furthermore, using nonlinear upper models of the function in the step selection process can lead to actual fractional steps.
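The continuous step decision can be pictured as moving the stability centre to x_c + t(x_t − x_c) for some t in [0, 1], rather than only t ∈ {0, 1} (null or serious step). The toy below picks t by grid search on a stand-in merit function; the paper's merit function is more structured, so everything here is illustrative:

```python
def continuous_step(merit, xc, xt, grid=101):
    """Pick t in [0, 1] minimizing the merit along the segment [xc, xt]."""
    ts = [i / (grid - 1) for i in range(grid)]
    return min(ts, key=lambda t: merit(xc + t * (xt - xc)))

merit = lambda x: abs(x - 1.0)               # stand-in merit function
t = continuous_step(merit, xc=0.0, xt=2.0)
assert abs(t - 0.5) < 1e-12                  # an interior (fractional) step
```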


Optimization Methods & Software | 2010

Gradient set splitting in nonconvex nonsmooth numerical optimization

Manlio Gaudioso; Enrico Gorgone

We present a numerical bundle-type method for local minimization of a real function of several variables, assumed to be locally Lipschitz. We provide a short survey of some optimization algorithms from the literature which are able to deal with both nonsmoothness and nonconvexity of the objective function. We focus on possible extensions of classical bundle-type methods, originally conceived to deal with convex nonsmooth optimization. They are all based on a convex cutting plane model which both minorizes the objective function everywhere and interpolates it at certain points. Such properties may be lost whenever nonconvexity is present, and this case may be described in terms of possible negative values of certain linearization errors. We describe some alternative ways the problem is dealt with in the literature. Here, on the basis of a classification of the limit points of gradient sequences, we define two distinct cutting plane approximations. We derive an algorithm which uses both such models. In particular, only the convex model is primarily adopted to find a tentative displacement from the current stability centre, while the concave one enters into play only when the convex model has failed to provide a sufficient decrease step. Termination of the method is proved and the results of some numerical experiments are reported.
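The sign of the linearization error α_i = f(x_c) − f(x_i) − g_i(x_c − x_i), which the abstract notes may become negative under nonconvexity, is what drives the split into two models. A hedged toy of that split (function and bundle are illustrative):

```python
def lin_error(f_xc, xc, f_xi, g_i, x_i):
    """alpha_i = f(x_c) - [f(x_i) + g_i * (x_c - x_i)]."""
    return f_xc - (f_xi + g_i * (xc - x_i))

f = lambda x: min(abs(x), 1.0)               # nonconvex toy function
bundle = [(f(0.5), 1.0, 0.5),                # entries: (f(x_i), subgradient, x_i)
          (f(-0.3), -1.0, -0.3),
          (f(3.0), 0.0, 3.0)]
xc, f_xc = 0.0, f(0.0)                       # stability centre

# nonnegative errors feed the convex (lower) model, negative ones the other
convex_part = [e for e in bundle if lin_error(f_xc, xc, *e) >= 0]
other_part = [e for e in bundle if lin_error(f_xc, xc, *e) < 0]
assert len(convex_part) == 2 and len(other_part) == 1
```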


Optimization | 2011

Data preprocessing in semi-supervised SVM classification

A. Astorino; Enrico Gorgone; Manlio Gaudioso; Diethard Pallaschke

The literature in the area of semi-supervised binary classification has demonstrated that useful information can be gathered not only from those samples whose class membership is known in advance, but also from the unlabelled ones. In fact, in semi-supervised support vector machine models, both labelled and unlabelled samples contribute to the definition of an appropriate optimization model for finding a good-quality separating hyperplane. In particular, the optimization approaches which have been devised in this context are basically of two types: a mixed integer linear programming problem, and a continuous optimization problem characterized by an objective function which is nonsmooth and nonconvex. Both such problems are hard to solve whenever the number of unlabelled points increases. In this article, we present a data preprocessing technique which has the objective of reducing the number of unlabelled points entering the computational model, without worsening the classification performance of the overall process too much. The approach is based on the concept of separating sets and can be implemented with a reasonable computational effort. The results of numerical experiments on several benchmark datasets are also reported.
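One way to picture the reduction of unlabelled points, though not the paper's separating-sets technique, is to keep only the unlabelled samples close to a provisional hyperplane, since points far from the boundary barely influence the model. All names and thresholds below are illustrative:

```python
def filter_unlabelled(w, b, unlabelled, margin=1.0):
    """Keep unlabelled points with |w.x + b| <= margin of the hyperplane."""
    return [p for p in unlabelled
            if abs(w[0] * p[0] + w[1] * p[1] + b) <= margin]

w, b = (1.0, 0.0), 0.0                       # hypothetical hyperplane x1 = 0
points = [(0.2, 5.0), (3.0, 1.0), (-0.5, 2.0), (-4.0, 0.0)]
kept = filter_unlabelled(w, b, points)
assert kept == [(0.2, 5.0), (-0.5, 2.0)]     # far-away points are dropped
```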


European Journal of Operational Research | 2013

A Library for Continuous Convex Separable Quadratic Knapsack Problems

Antonio Frangioni; Enrico Gorgone

The Continuous Convex Separable Quadratic Knapsack problem (CQKnP) is an easy but useful model that has very many different applications. Although the problem can be solved quickly, it must typically be solved very many times within approaches to (much) more difficult models; hence an efficient solution approach is required. We present and discuss a small open-source library for its solution that we have recently developed and distributed.
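A common way to solve CQKnP instances, sketched here under assumed notation (minimize Σ_i (d_i/2)x_i² − c_i x_i subject to Σ_i x_i = b and l_i ≤ x_i ≤ u_i, with d_i > 0), is bisection on the multiplier μ of the knapsack constraint. This is a generic textbook scheme, not the library's implementation:

```python
def solve_cqknp(d, c, l, u, b, tol=1e-10):
    """Bisection on the knapsack multiplier mu; requires all d_i > 0."""
    def x_of(mu):
        # per-item optimum of the Lagrangian, clipped to the box [l_i, u_i]
        return [min(max((ci - mu) / di, li), ui)
                for di, ci, li, ui in zip(d, c, l, u)]
    lo, hi = -1e6, 1e6                       # assumed bracket for mu
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if sum(x_of(mid)) > b:               # too much mass: increase mu
            lo = mid
        else:
            hi = mid
    return x_of(0.5 * (lo + hi))

x = solve_cqknp(d=[1.0, 2.0], c=[3.0, 3.0], l=[0.0, 0.0], u=[10.0, 10.0], b=3.0)
assert abs(sum(x) - 3.0) < 1e-6              # knapsack constraint met
```

Since Σ_i x_i(μ) is nonincreasing in μ, the bisection always converges once the bracket contains the optimal multiplier.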


Optimization | 2010

Separation of convex sets by Clarke subdifferential

Manlio Gaudioso; Enrico Gorgone; Diethard Pallaschke

In this article we consider a separation technique proposed by J. Grzybowski, D. Pallaschke, and R. Urbański (A pre-classification and the separation law for closed bounded convex sets, Optim. Methods Softw. 20 (2005), pp. 219–229) for separating two convex sets A and B with another convex set C. We prove that in finite dimensions C can be chosen as the Clarke subdifferential at the origin of , where p_A and p_B denote the support functions of A and B, respectively.
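The support function used in the abstract is p_A(x) = max_{a∈A} ⟨a, x⟩; for a polytope it is attained at a vertex, so it reduces to a maximum over finitely many inner products. A small illustrative sketch:

```python
def support(vertices, x):
    """p_A(x) = max over vertices v of <v, x>, for a polytope A."""
    return max(v[0] * x[0] + v[1] * x[1] for v in vertices)

square = [(1, 1), (1, -1), (-1, 1), (-1, -1)]  # vertices of [-1, 1]^2
assert support(square, (1.0, 0.0)) == 1.0      # extent in direction e1
assert support(square, (1.0, 1.0)) == 2.0      # attained at vertex (1, 1)
```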


Mathematical Programming Computation | 2017

On the computational efficiency of subgradient methods: a case study with Lagrangian bounds

Antonio Frangioni; Bernard Gendron; Enrico Gorgone

Subgradient methods (SM) have long been the preferred way to solve the large-scale Nondifferentiable Optimization problems arising from the solution of Lagrangian Duals (LD) of Integer Programs (IP). Although other methods can have a better convergence rate in practice, SM have certain advantages that may make them competitive under the right conditions. Furthermore, SM have significantly progressed in recent years, and new versions have been proposed with better theoretical and practical performances in some applications. We computationally evaluate a large class of SM in order to assess if these improvements carry over to the IP setting. For this we build a unified scheme that covers many of the SM proposed in the literature, comprising some often-overlooked features like projection and the dynamic generation of variables. We fine-tune the many algorithmic parameters of the resulting large class of SM, and we test them on two different LDs of the Fixed-Charge Multicommodity Capacitated Network Design problem, in order to assess the impact of the characteristics of the problem on the optimal algorithmic choices. Our results show that, if extensive tuning is performed, SM can be competitive with more sophisticated approaches when the tolerance required for the solution is not too tight, which is the case when solving LDs of IPs.
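The plain projected subgradient iteration evaluated in the paper has the form x_{k+1} = proj(x_k − step_k · g_k); since SM are not descent methods, the best iterate is tracked separately. The stepsize rule and toy problem below are illustrative, not the paper's tuned choices:

```python
def subgradient_method(f, subgrad, proj, x0, iters=2000):
    """Projected subgradient method with diminishing stepsize 1/k."""
    x, best_x, best_f = x0, x0, f(x0)
    for k in range(1, iters + 1):
        x = proj(x - (1.0 / k) * subgrad(x))
        if f(x) < best_f:                    # SM are not descent methods,
            best_x, best_f = x, f(x)         # so keep the best iterate seen
    return best_x

f = lambda x: abs(x - 3.0)                   # toy nondifferentiable objective
sg = lambda x: 1.0 if x > 3.0 else -1.0      # a subgradient of f
proj = lambda x: min(max(x, 0.0), 10.0)      # projection onto [0, 10]

assert f(subgradient_method(f, sg, proj, 0.0)) < 1e-2
```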

Collaboration


Dive into Enrico Gorgone's collaborations.

Top Co-Authors

Annabella Astorino

Nuclear Regulatory Commission
Diethard Pallaschke

Karlsruhe Institute of Technology
Martine Labbé

Université libre de Bruxelles