Publications


Featured research published by Giovanni Rinaldi.


SIAM Review | 1991

A branch-and-cut algorithm for the resolution of large-scale symmetric traveling salesman problems

Manfred W. Padberg; Giovanni Rinaldi

An algorithm is described for solving large-scale instances of the Symmetric Traveling Salesman Problem (STSP) to optimality. The core of the algorithm is a “polyhedral” cutting-plane procedure that exploits a subset of the system of linear inequalities defining the convex hull of the incidence vectors of the Hamiltonian cycles of a complete graph. The cuts are generated by several identification procedures that have been described in a companion paper. Whenever the cutting-plane procedure does not terminate with an optimal solution, the algorithm uses a tree-search strategy that, as opposed to branch-and-bound, keeps on producing cuts after branching. The algorithm has been implemented in FORTRAN. Two different linear programming (LP) packages have been used as the LP solver. The implementation of the algorithm and the interface with one of the LP solvers are described in sufficient detail to permit the replication of our experiments. Computational results are reported for up to 42 STSPs with sizes ranging ...
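
The scheme outlined in the abstract can be made concrete in a few dozen lines. The sketch below is not the authors' FORTRAN implementation; it is a minimal didactic branch-and-cut for the STSP that assumes SciPy (for the LP relaxations) and networkx (for the minimum-cut separation of subtour elimination constraints), branches on a fractional edge variable, and, as described above, keeps generating cuts after branching. The inner cutting-plane loop, executed again at every node of the search tree, is what distinguishes branch-and-cut from plain branch-and-bound.

```python
# A minimal, didactic branch-and-cut for the symmetric TSP (not the authors'
# FORTRAN code).  LP relaxations are solved with SciPy; violated subtour
# elimination constraints are separated exactly with a global minimum cut on
# the support graph of the fractional point; cuts keep being generated after
# branching.
import itertools
import networkx as nx
import numpy as np
from scipy.optimize import linprog

def tsp_branch_and_cut(dist):
    """dist: symmetric n x n cost matrix; returns (optimal length, tour edges)."""
    n = len(dist)
    edges = list(itertools.combinations(range(n), 2))      # one variable per edge
    cost = np.array([dist[i][j] for i, j in edges], float)
    A_eq = np.zeros((n, len(edges)))                        # degree equations
    for k, (i, j) in enumerate(edges):                      #   x(delta(v)) = 2
        A_eq[i, k] = A_eq[j, k] = 1.0
    b_eq = np.full(n, 2.0)
    cuts = []                                # pool of subtour cuts (edge index lists)
    best = (np.inf, None)
    stack = [[(0.0, 1.0)] * len(edges)]      # open nodes = per-variable bounds
    while stack:
        bounds = stack.pop()
        while True:                          # cutting-plane loop at this node
            A_ub = np.zeros((len(cuts), len(edges)))
            for r, cut in enumerate(cuts):
                A_ub[r, cut] = -1.0          # -x(delta(S)) <= -2
            res = linprog(cost, A_ub=A_ub if cuts else None,
                          b_ub=np.full(len(cuts), -2.0) if cuts else None,
                          A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
            if not res.success or res.fun >= best[0] - 1e-9:
                break                        # infeasible node, or pruned by bound
            x = res.x
            G = nx.Graph()                   # support graph of the LP solution
            G.add_nodes_from(range(n))
            for k, (i, j) in enumerate(edges):
                if x[k] > 1e-6:
                    G.add_edge(i, j, weight=x[k])
            S = None                         # exact subtour separation via min cut
            if not nx.is_connected(G):
                S = next(nx.connected_components(G))
            else:
                cut_val, (side, _) = nx.stoer_wagner(G)
                if cut_val < 2.0 - 1e-6:
                    S = set(side)
            if S is not None:
                cuts.append([k for k, (i, j) in enumerate(edges)
                             if (i in S) != (j in S)])
                continue                     # re-solve the LP with the new cut
            frac = [k for k in range(len(edges)) if 1e-6 < x[k] < 1.0 - 1e-6]
            if not frac:                     # integral, degree 2, 2-connected: a tour
                best = (res.fun,
                        [edges[k] for k in range(len(edges)) if x[k] > 0.5])
                break
            k = frac[0]                      # branch on a fractional edge variable
            for fixed in (0.0, 1.0):
                child = list(bounds)
                child[k] = (fixed, fixed)
                stack.append(child)
            break
    return best

if __name__ == "__main__":
    pts = np.random.default_rng(0).random((8, 2))
    d = np.linalg.norm(pts[:, None] - pts[None, :], axis=2)
    print(tsp_branch_and_cut(d))
```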


Operations Research Letters | 1990

Addendum: Optimization of a 532-city symmetric traveling salesman problem by branch and cut

Manfred Padberg; Giovanni Rinaldi

We report the solution to optimality of a 532-city symmetric traveling salesman problem involving the optimization over 141,246 zero-one variables. The results of an earlier study by Crowder and Padberg [1] are cross-validated. In this note we briefly outline the methodology, algorithms and software system that we developed to obtain these results.
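
The variable count is simply the number of edges of the complete graph on 532 cities, one zero-one variable per edge:

```latex
\binom{532}{2} \;=\; \frac{532 \cdot 531}{2} \;=\; 141{,}246 .
```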


Archive | 2010

50 Years of Integer Programming 1958-2008

Michael Jünger; Thomas M. Liebling; Denis Naddef; George L. Nemhauser; William R. Pulleyblank; Gerhard Reinelt; Giovanni Rinaldi; Laurence A. Wolsey

Part I: The Early Years
- Solution of a Large-Scale Traveling-Salesman Problem
- The Hungarian Method for the Assignment Problem
- Integral Boundary Points of Convex Polyhedra
- Outline of an Algorithm for Integer Solutions to Linear Programs
- An Algorithm for the Mixed Integer Problem
- An Automatic Method for Solving Discrete Programming Problems
- Integer Programming: Methods, Uses, Computation
- Matroid Partition
- Reducibility Among Combinatorial Problems
- Lagrangian Relaxation for Integer Programming
- Disjunctive Programming

Part II: From the Beginnings to the State-of-the-Art
- Polyhedral Approaches to Mixed Integer Linear Programming
- Fifty-Plus Years of Combinatorial Integer Programming
- Reformulation and Decomposition of Integer Programs

Part III: Current Topics
- Integer Programming and Algorithmic Geometry of Numbers
- Nonlinear Integer Programming
- Mixed Integer Programming Computation
- Symmetry in Integer Linear Programming
- Semidefinite Relaxations for Integer Programming
- The Group-Theoretic Approach in Mixed Integer Programming


Mathematical Programming | 1990

Facet identification for the symmetric traveling salesman polytope

Manfred W. Padberg; Giovanni Rinaldi

Several procedures for the identification of facet-inducing inequalities for the symmetric traveling salesman polytope are given. An identification procedure accepts as input the support graph of a point which does not belong to the polytope, and returns as output some of the facet-inducing inequalities violated by the point. A procedure which always accomplishes this task is called exact; otherwise it is called heuristic. We give exact procedures for the subtour elimination and the 2-matching constraints, based on the Gomory–Hu and Padberg–Rao algorithms respectively. Efficient reduction procedures for the input graph are proposed which accelerate these two algorithms substantially. Exact and heuristic shrinking conditions for the input graph are also given that yield efficient procedures for the identification of simple and general comb inequalities and of some elementary clique tree inequalities. These procedures constitute the core of a polytopal cutting-plane algorithm that we have devised and programmed to solve a substantial number of large-scale problem instances with sizes up to 2392 nodes to optimality.
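
For the subtour elimination constraints, exact separation reduces to a global minimum cut on the support graph of the fractional point: any node set S whose cut has capacity below 2 gives a violated inequality x(delta(S)) >= 2. A minimal sketch along these lines, using networkx's generic Gomory–Hu tree routine rather than the specialized implementation (with graph reduction and shrinking) described in the paper:

```python
# Sketch of exact subtour elimination separation via a Gomory-Hu tree of the
# fractional support graph (networkx assumed).
import networkx as nx

def most_violated_subtour_cut(support):
    """support: graph whose edge attribute 'capacity' is the LP value x_e.
    Returns (S, cut_value) for the global minimum cut; S is None when no
    subtour elimination constraint x(delta(S)) >= 2 is violated."""
    T = nx.gomory_hu_tree(support, capacity="capacity")
    u, v, w = min(T.edges(data="weight"), key=lambda e: e[2])  # lightest tree edge
    T.remove_edge(u, v)                                        # = global min cut
    S = nx.node_connected_component(T, u)
    return (S if w < 2.0 - 1e-6 else None, w)

if __name__ == "__main__":
    # A fractional point satisfying the degree equations but not all subtour
    # elimination constraints: two triangles with internal values 3/4, joined
    # by three edges of value 1/2 (cut capacity 1.5 < 2).
    G = nx.Graph()
    G.add_weighted_edges_from(
        [(0, 1, .75), (1, 2, .75), (2, 0, .75),
         (3, 4, .75), (4, 5, .75), (5, 3, .75),
         (0, 3, .5), (1, 4, .5), (2, 5, .5)], weight="capacity")
    print(most_violated_subtour_cut(G))   # one triangle's node set, value 1.5
```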


Mathematical Programming | 1990

An efficient algorithm for the minimum capacity cut problem

Manfred W. Padberg; Giovanni Rinaldi

Given a finite undirected graph with nonnegative edge capacities, the minimum capacity cut problem consists of partitioning the graph into two nonempty sets such that the sum of the capacities of edges connecting the two parts is minimum among all possible partitionings. The standard algorithm to calculate a minimum capacity cut, due to Gomory and Hu (1961), runs in O(n^4) time and is difficult to implement. We present an alternative algorithm with the same worst-case bound which is easier to implement and which was found empirically to be far superior to the standard algorithm. We report computational results for graphs with up to 2000 nodes.
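
To make the problem statement concrete: one looks for a nonempty proper node set S minimizing the total capacity of the edges leaving S. The snippet below simply calls networkx's Stoer–Wagner routine, a different algorithm from the one proposed in the paper, shown only to illustrate what is being computed.

```python
# The minimum capacity cut problem, made concrete with a library call.
# (Stoer-Wagner via networkx, not the algorithm of the paper.)
import networkx as nx

G = nx.Graph()
G.add_weighted_edges_from(
    [(0, 1, 3.0), (1, 2, 2.0), (2, 3, 4.0), (3, 0, 1.0), (0, 2, 2.0)])
cut_value, (S, T) = nx.stoer_wagner(G)   # partition minimizing crossing capacity
print(cut_value, S, T)                   # cut of capacity 5.0
```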


Journal of Statistical Physics | 1995

Exact Ground States of Ising Spin Glasses: New Experimental Results With a Branch and Cut Algorithm

Moritz Diehl; C. De Simone; Michael Jünger; Petra Mutzel; Gerhard Reinelt; Giovanni Rinaldi

In this paper we study two-dimensional Ising spin glasses on a grid with nearest-neighbor and periodic boundary interactions, based on a Gaussian bond distribution, and an exterior magnetic field. We show how, using a technique called branch-and-cut, the exact ground states of grids of sizes up to 100×100 can be determined in a moderate amount of computation time, and we report on extensive computational tests. With our method we produce results, based on more than 20,000 experiments, on the properties of spin glasses whose errors depend only on the assumptions on the model and not on the computational process. This feature is a clear advantage of the method over other, more popular ways to compute the ground state, like Monte Carlo simulation including simulated annealing, evolutionary, and genetic algorithms, which provide only approximate ground states with a degree of accuracy that cannot be determined a priori. Our ground-state energy estimate at zero field is −1.317.
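
A toy version of the physics setup makes the computational task explicit. The sketch below assumes the common convention H(s) = -Σ J_ij s_i s_j - h Σ s_i with spins in {−1, +1}, Gaussian couplings on a toroidal grid, and finds the ground state by brute force; the paper instead reformulates the problem as a max-cut instance and solves it with branch-and-cut, which is what makes 100×100 grids tractable.

```python
# Toy illustration of the model: a tiny 2D Ising spin glass on a periodic
# (toroidal) grid with Gaussian nearest-neighbour couplings J and field h,
# solved to optimality by brute force.  Conventions assumed here:
#     H(s) = - sum_{<ij>} J_ij * s_i * s_j  -  h * sum_i s_i,   s_i in {-1,+1}.
# Brute force only handles toy sizes; the paper's branch-and-cut scales far
# beyond this.
import itertools
import numpy as np

def ground_state(L=3, h=0.0, seed=0):
    rng = np.random.default_rng(seed)
    sites = [(i, j) for i in range(L) for j in range(L)]
    idx = {s: k for k, s in enumerate(sites)}
    bonds = []                               # one Gaussian coupling per grid edge
    for (i, j) in sites:
        for di, dj in ((0, 1), (1, 0)):      # right and down neighbours, wrapped
            bonds.append((idx[(i, j)], idx[((i + di) % L, (j + dj) % L)],
                          rng.normal()))
    best_E, best_s = np.inf, None
    for spins in itertools.product((-1, 1), repeat=L * L):
        s = np.array(spins)
        E = -sum(J * s[a] * s[b] for a, b, J in bonds) - h * s.sum()
        if E < best_E:
            best_E, best_s = E, s
    return best_E / (L * L), best_s          # energy per spin, configuration

if __name__ == "__main__":
    e_per_spin, config = ground_state(L=3, h=0.0)
    print("ground-state energy per spin:", e_per_spin)
```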


Mathematical Programming | 1993

The graphical relaxation: a new framework for the Symmetric Traveling Salesman Polytope

Denis Naddef; Giovanni Rinaldi

A present trend in the study of the Symmetric Traveling Salesman Polytope (STSP(n)) is to use, as a relaxation of the polytope, the graphical relaxation (GTSP(n)) rather than the traditional monotone relaxation, which seems to have attained its limits. In this paper, we show the very close relationship between STSP(n) and GTSP(n). In particular, we prove that every non-trivial facet of STSP(n) is the intersection of n + 1 facets of GTSP(n), n of which are defined by the degree inequalities. This fact permits us to define a standard form for the facet-defining inequalities for STSP(n), which we call tight triangular, and to devise a proof technique that can be used to show that many known facet-defining inequalities for GTSP(n) also define facets of STSP(n). In addition, we give conditions that permit facet-defining inequalities to be obtained by composition of facet-defining inequalities for STSP(n), and general lifting theorems to derive facet-defining inequalities for STSP(n + k) from inequalities defining facets of STSP(n).
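
In symbols, the degree inequalities referred to above are the constraints below (x_e is the variable attached to edge e and δ(v) the set of edges incident with city v); every Hamiltonian cycle satisfies them with equality, which is why the n corresponding facets of GTSP(n) appear in the intersection describing each non-trivial facet of STSP(n).

```latex
x(\delta(v)) \;=\; \sum_{e \in \delta(v)} x_e \;\ge\; 2
\qquad \text{for every city } v,
\qquad \text{with equality on every Hamiltonian cycle.}
```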


Archive | 2003

Combinatorial Optimization – Eureka, You Shrink!

Michael Jünger; Gerhard Reinelt; Giovanni Rinaldi

properties of linear independence. Beautiful paper. And various people had gone on studying the properties of these algebraic structures. That’s the year Jack was born, right? 1935? Is that true? In 1935? 1932? Two years? Jack, you were born in 1935! (laughter) Yeah, that wouldn’t make sense, would it? I think it was 1935, but here’s what was interesting. (Jack: At that time, I worked on the subject, it was independently.) Good! Let’s say in the early 30s.

Now, you see the thing about matroids was that they were this abstract structure, and I guess there were actually several people who somehow had this idea of optimizing over matroids with the greedy algorithm. And, Jack, of course, knew how to do that. But he was now beginning to strut his stuff with this new paradigm he had developed, which was the idea of polyhedral combinatorics. You see, in fact when Jack first described the matching polytope, he would write the degree constraints, then he would put the blossom inequalities in. There were a lot of people at that point who said “Ah! That’s really a bad idea”, and I remember people saying “You know Jack, they don’t make computers big enough to store all those inequalities.” And Jack said “Yes, but you don’t need to store them explicitly. You can generate them, and use them when you need them”. They said “I don’t get it”, so there was a bit of a problem on that point. But it was this idea that you could add this large set of inequalities to a small set of inequalities that define an integer program, and from there go on to the algorithm and to proving optimality, and the whole NP characterization idea which was in there.

Now I guess the thing that to me was particularly remarkable about this was the stunning sequence of papers that followed, including matroid partition and matroid intersection – much more complex combinatorial structures which came from this very simple notion of a matroid. Then, damned if Jack didn’t go ahead and solve the corresponding optimization problems. Now there was one problem that we always ran into with the whole matroid paradigm. People would say “Now, how do you know when a set is independent in a matroid?” and again get into these questions of defining oracles. At one point Jack said, again I wasn’t there at the time, but I can just imagine how Jack finally said it, “Well, I’ll give them something simple they can understand”, so in Mathematics of the Decision Sciences, there was a paper on what Jack called “branchings”. Optimum branchings was to be an accessible example of matroid intersection, so people would actually see this stuff working. And I do remember Jack telling me at a point later that he was a little disappointed that it took so many pages to write it down, even though it seemed so simple. But the branching thing came out as an elegant concrete example of matroid intersections.

Another point that I would like to stress here: this whole idea of complexity of integer calculation was discovered by Jack. There’s a paper called “Systems of distinct representatives and linear algebra” which contains the famous line: “Gaussian elimination is not a good algorithm”. And, I can remember when I first read that line, I thought “I don’t get it. You clearly only have to perform a polynomial number of steps to reduce a matrix, so what’s the deal?” And of course what Jack had observed was that the numbers can get big. And the fact that you have to pay attention to the size of the numbers was something that again became extremely influential as people began studying these areas and going on from there.

Partition matroids was another class of matroids Jack invented. It was beginning to get a buzz around it. In 1969, the University of Calgary hosted a massive combinatorial meeting, two weeks long, and basically everybody who was anybody in combinatorics at that time was invited. I had just finished my Master’s degree the week before that meeting was to take place, and my Master’s thesis actually was on matroids. What my supervisor Eric Milner had done was to give me stacks of papers to read on matroids. Bob, you and I are probably the only people who tried to read the Tutte paper and I think you made it further than I did. But there was all this excitement about matroids and all these matroid optimization results were coming from Jack. That was one of the reasons why they invited Jack to come and give a series of lectures at this big Calgary meeting on his research. And, in some sense, you can say this was a coming-out party. It was an endorsement of Jack’s views, it gave him a platform to present this material. It was an amazing series of lectures because I can remember that, at this point, Jack had developed the whole idea of submodular functions and polymatroids. This was matroids generalized to a much broader concept, and every so often he would throw in one of those combinatorial realizations. As someone who sat and took notes on those lectures, I remember I was exhausted at the end just trying to keep up with it.

So when I think of the sort of work that Jack did in that stage, not only did he define and promote this whole concept of good algorithms and good characterizations, he solved the matching problem, he did the same for matroid intersections, he did the same for submodular functions, he laid the basis for what became arithmetic complexity. I think at one point I said to Jack: “How come you got there when all the good stuff was there?”. But I think it’s probably fairer to say that all this good stuff was there because Jack made it there. He had this vision of what could be done, he could see how it could go, and for those of you who just met him, you may not be aware of it, maybe I am the only one who has noticed this: Jack has a slightly stubborn streak. I think that served Jack incredibly well, as he created this whole agenda describing where the world of combinatorial optimization should go. I think it’s been remarkably successful. Now, what we want to do at this point is to have a few comments by some of the people who knew Jack somewhat earlier in his career. I think, Jack, the technical term for this is a “roast”.

At this point, Ellis Johnson, George Nemhauser, Bob Bixby, Jean-François Maurras, Denis Naddef, and Kathie Cameron successively joined Bill Pulleyblank and all presented reminiscences and a number of entertaining stories, accompanied by a lot of laughter and interaction with the audience. Finally, Jack Edmonds took the stage. We cite from his speech:

I should say that I didn’t expect to give this talk tonight. These guys, I saw a session for Edmonds on the board, and then they said to me, “Hey, would you give a few words about the early days”. So here I am now. (laughter) It occurred to me that “Yeah, sure, their idea of a special session for Jack is to start him talking.” (laughter)

And talk he did, sharing his personal recollections of his early career with the audience. We recommend “A Glimpse of Heaven” [6], which contains Jack’s reminiscences on this period of his life, for those readers who were not at Aussois to enjoy it live. The session closed with a standing ovation for Jack and a presentation to Jack: a poster brought to Aussois by Jean-François Maurras, showing Jack in the 70s and signed by the participants of Aussois 2001.


Mathematical Programming | 1991

The symmetric traveling salesman polytope and its graphical relaxation: composition of valid inequalities

Denis Naddef; Giovanni Rinaldi

The graphical relaxation of the Traveling Salesman Problem is the relaxation obtained by requiring that the salesman visit each city at least once instead of exactly once. This relaxation has already led to a better understanding of the Traveling Salesman polytope in Cornuéjols, Fonlupt and Naddef (1985). We show here how one can compose facet-inducing inequalities for the graphical traveling salesman polyhedron to obtain other facet-inducing inequalities. This leads to new valid inequalities for the Symmetric Traveling Salesman polytope. This paper is the first in a series of three on the Symmetric Traveling Salesman polytope: the second studies the strong relationship between that polytope and its graphical relaxation, and the third applies the theoretical developments of the first two papers to prove some new facet-inducing results.


Handbooks in Operations Research and Management Science | 1995

Chapter 4: The traveling salesman problem

Michael Jünger; Gerhard Reinelt; Giovanni Rinaldi

Publisher Summary: The traveling salesman problem, TSP for short, has model character in many branches of mathematics, computer science, and operations research. Heuristics, linear programming, and branch and bound, which are still the main components of today's most successful approaches to hard combinatorial optimization problems, were first formulated for the TSP and used to solve practical problem instances. When the theory of NP-completeness developed, the TSP was one of the first problems to be proven NP-hard, by Karp in 1972. New algorithmic techniques have first been developed for, or at least been applied to, the TSP to show their effectiveness. Examples are branch and bound, Lagrangean relaxation, Lin-Kernighan type methods, simulated annealing, and the field of polyhedral combinatorics for hard combinatorial optimization problems. The chapter presents a self-contained introduction to the algorithmic and computational aspects of the traveling salesman problem, along with their theoretical prerequisites, as seen from the point of view of an operations researcher who wants to solve practical instances. It provides guidelines on how to attack a TSP instance depending on its size, its structural properties (e.g., metric), the available computation time, and the desired quality of the solution.
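
On the heuristic side that the chapter surveys, the classical pattern is a cheap construction heuristic followed by local improvement. The sketch below (a generic illustration, not code from the chapter) runs nearest-neighbour construction and then 2-opt improvement on random Euclidean points; Lin-Kernighan type methods generalize the 2-opt exchange step.

```python
# Generic illustration of classical TSP heuristics: nearest-neighbour
# construction followed by 2-opt improvement on random points in the unit
# square.  Not taken from the chapter.
import numpy as np

def nearest_neighbour_tour(dist):
    n = len(dist)
    tour, free = [0], set(range(1, n))
    while free:                                  # always extend to the closest city
        nxt = min(free, key=lambda j: dist[tour[-1]][j])
        tour.append(nxt)
        free.remove(nxt)
    return tour

def two_opt(tour, dist):
    """Reverse tour segments as long as a reversal shortens the tour."""
    n, improved = len(tour), True
    while improved:
        improved = False
        for i in range(n - 1):
            for j in range(i + 2, n if i > 0 else n - 1):
                a, b = tour[i], tour[i + 1]
                c, d = tour[j], tour[(j + 1) % n]
                if dist[a][c] + dist[b][d] < dist[a][b] + dist[c][d] - 1e-12:
                    tour[i + 1:j + 1] = reversed(tour[i + 1:j + 1])
                    improved = True
    return tour

def length(tour, dist):
    return sum(dist[tour[k]][tour[(k + 1) % len(tour)]] for k in range(len(tour)))

if __name__ == "__main__":
    pts = np.random.default_rng(1).random((30, 2))
    dist = np.linalg.norm(pts[:, None] - pts[None, :], axis=2)
    t = nearest_neighbour_tour(dist)
    print("nearest neighbour:", round(length(t, dist), 3))
    print("after 2-opt:", round(length(two_opt(t, dist), dist), 3))
```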

Collaboration


Dive into Giovanni Rinaldi's collaboration.

Top Co-Authors

Andrea Lodi – École Polytechnique de Montréal
Laura Palagi – Sapienza University of Rome
Laurence A. Wolsey – Université catholique de Louvain
Petra Mutzel – Technical University of Dortmund