Jean-Louis Goffin
McGill University
Publications
Featured research published by Jean-Louis Goffin.
Optimization Methods & Software | 2002
Jean-Louis Goffin; Jean-Philippe Vial
We present a survey of nondifferentiable optimization problems and methods with special focus on the analytic center cutting plane method. We propose a self-contained convergence analysis that uses the formalism of the theory of self-concordant functions, but for the main results, we give direct proofs based on the properties of the logarithmic function. We also provide an in-depth analysis of two extensions that are very relevant to practical problems: the case of multiple cuts and the case of deep cuts. We further examine extensions to problems including feasible sets partially described by an explicit barrier function, and to the case of nonlinear cuts. Finally, we review several implementation issues and discuss some applications.
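To make the central mechanism concrete, here is a minimal, illustrative Python sketch of a cutting plane loop that queries the oracle at the analytic center of the current localization polytope, recovered by damped Newton steps on the logarithmic barrier. It uses central objective cuts only and is a simplification, not the algorithm as analyzed in the paper; the function names (accpm, analytic_center), the box bounds, and the toy objective are assumptions made for illustration.

```python
# Illustrative sketch of an analytic center cutting plane loop (central cuts only).
import numpy as np

def analytic_center(A, b, x0, iters=50):
    """Approximate minimizer of -sum(log(b - A @ x)) by damped Newton steps,
    starting from a strictly feasible x0."""
    x = x0.copy()
    for _ in range(iters):
        s = b - A @ x                               # slacks, must stay positive
        grad = A.T @ (1.0 / s)
        hess = A.T @ ((1.0 / s**2)[:, None] * A)
        dx = np.linalg.solve(hess, -grad)
        t = 1.0
        while np.min(b - A @ (x + t * dx)) <= 0:    # backtrack to stay strictly feasible
            t *= 0.5
        x = x + t * dx
    return x

def accpm(f, subgrad, lo, hi, n, iters=30):
    """Query the subgradient oracle at the analytic center of the current
    localization polytope and add a central objective cut."""
    A = np.vstack([np.eye(n), -np.eye(n)])          # initial box  lo <= x <= hi
    b = np.concatenate([np.full(n, hi), np.full(n, -lo)])
    x = np.zeros(n)
    best_x, best_f = x, f(x)
    for _ in range(iters):
        y = analytic_center(A, b, x)
        fy, g = f(y), subgrad(y)
        if fy < best_f:
            best_x, best_f = y, fy
        if not np.any(g):                           # zero subgradient: y is optimal
            return y, fy
        # central cut: every minimizer x* satisfies g^T x* <= g^T y, because
        # f(y) + g^T (x* - y) <= f(x*) <= f(y); a tiny slack keeps y interior.
        A = np.vstack([A, g])
        b = np.append(b, g @ y + 1e-8)
        x = y
    return best_x, best_f

# Example: minimize the piecewise-linear f(x) = max(|x1 - 1|, |x2 + 2|).
f = lambda x: max(abs(x[0] - 1), abs(x[1] + 2))
def subgrad(x):
    if abs(x[0] - 1) >= abs(x[1] + 2):
        return np.array([np.sign(x[0] - 1), 0.0])
    return np.array([0.0, np.sign(x[1] + 2)])
print(accpm(f, subgrad, lo=-10.0, hi=10.0, n=2))
```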
Mathematics of Operations Research | 1980
Jean-Louis Goffin
The relaxation method for solving systems of inequalities is related both to subgradient optimization and to the relaxation methods used in numerical analysis. The convergence theory depends upon two condition numbers. The first one is used mostly for the study of the rate of geometric convergence. The second is used to define a range of values of the relaxation parameter which guarantees finite convergence. In the case of obtuse polyhedra, finite convergence occurs for any value of the relaxation parameter between one and two. Various relationships between the condition numbers and the concept of obtuseness are established.
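As a concrete illustration of the method being analyzed (not of the condition-number analysis itself), the following toy sketch applies the relaxation step to the most violated inequality of a small system A x <= b; the parameter lam plays the role of the relaxation parameter whose range (0, 2) is discussed above. All names and data are illustrative assumptions.

```python
# Relaxation method for a system of linear inequalities A x <= b.
import numpy as np

def relaxation_method(A, b, x0, lam=1.5, tol=1e-9, max_iter=10_000):
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(max_iter):
        violations = A @ x - b
        i = int(np.argmax(violations))       # most violated inequality
        if violations[i] <= tol:
            return x                         # feasible up to the tolerance
        a = A[i]
        # lam = 1 projects exactly onto the bounding hyperplane; lam near 2 over-projects
        x = x - lam * violations[i] / (a @ a) * a
    return x

# Feasible region: x1 >= 0, x2 >= 0, x1 + x2 <= 1, written as A x <= b.
A = np.array([[-1.0, 0.0], [0.0, -1.0], [1.0, 1.0]])
b = np.array([0.0, 0.0, 1.0])
print(relaxation_method(A, b, x0=np.array([3.0, -2.0])))
```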
Mathematical Programming | 1977
Jean-Louis Goffin
Rates of convergence of subgradient optimization are studied. If the step size is chosen to be a geometric progression with ratio ρ, the convergence, if it occurs, is geometric with rate ρ. For convergence to occur, it is necessary that the initial step size be large enough and that the ratio ρ be greater than a sustainable rate z(μ), which depends upon a condition number μ, defined for both differentiable and nondifferentiable functions. The sustainable rate z(μ) is closely related to the rate of convergence of the steepest ascent method for differentiable functions: in fact it is identical if the function is not too well conditioned.
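A hedged sketch of the scheme under study: subgradient steps whose lengths form the geometric progression s0·ρ^k. Whether convergence occurs, and at what rate, depends on s0 and ρ as described above; the toy function, the normalization of the step direction, and the parameter values below are illustrative choices, not the paper's.

```python
# Subgradient optimization with geometrically decreasing step sizes s_k = s0 * rho**k.
import numpy as np

def subgradient_descent(f, subgrad, x0, s0=2.0, rho=0.9, iters=200):
    x = np.asarray(x0, dtype=float)
    best_x, best_f = x, f(x)
    for k in range(iters):
        g = subgrad(x)
        norm = np.linalg.norm(g)
        if norm == 0:
            break                                # zero subgradient: x is optimal
        x = x - (s0 * rho**k) * g / norm         # geometric step along the normalized -g
        if f(x) < best_f:
            best_x, best_f = x, f(x)
    return best_x, best_f

# Example: minimize the nondifferentiable f(x) = |x1| + 2|x2|.
f = lambda x: abs(x[0]) + 2 * abs(x[1])
subgrad = lambda x: np.array([np.sign(x[0]), 2 * np.sign(x[1])])
print(subgradient_descent(f, subgrad, x0=[4.0, -3.0]))
```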
Siam Journal on Optimization | 1996
Jean-Louis Goffin; Zhi-Quan Luo; Yinyu Ye
We further analyze the convergence and the complexity of a dual column generation algorithm for solving general convex feasibility problems defined by a separation oracle. The oracle is called at an approximate analytic center of the set given by the intersection of the linear inequalities which are the previous answers of the oracle. We show that the algorithm converges in finite time and is in fact a fully polynomial approximation algorithm, provided that the feasible region has a nonempty interior.
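The following sketch illustrates the oracle-based feasibility setting with central cuts, reusing the analytic_center helper from the sketch after the first abstract above. The ball oracle, the box bounds, and all names are illustrative assumptions, and the loop is a simplification rather than the algorithm whose complexity is analyzed in the paper.

```python
# Cutting-plane feasibility with a separation oracle queried at approximate analytic centers.
import numpy as np

def feasibility_accpm(oracle, lo, hi, n, iters=50):
    A = np.vstack([np.eye(n), -np.eye(n)])           # initial box  lo <= x <= hi
    b = np.concatenate([np.full(n, hi), np.full(n, -lo)])
    x = np.zeros(n)
    for _ in range(iters):
        x = analytic_center(A, b, x)                 # helper from the sketch above
        a = oracle(x)                                # None means x is in the target set
        if a is None:
            return x
        # central cut a^T y <= a^T x through the query point; the tiny slack
        # keeps x strictly interior so the next recentering can warm-start.
        A, b = np.vstack([A, a]), np.append(b, a @ x + 1e-8)
    return None                                      # no feasible point found in the budget

# Separation oracle for the ball ||y - c|| <= r: outside the ball the normal
# x - c separates x from every point of the ball.
c, r = np.array([2.0, -1.0]), 0.5
ball_oracle = lambda x: None if np.linalg.norm(x - c) <= r else x - c
print(feasibility_accpm(ball_oracle, lo=-10.0, hi=10.0, n=2))
```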
Mathematical Programming | 1997
Jean-Louis Goffin; Jacek Gondzio; Robert Sarkissian; Jean-Philippe Vial
The paper deals with nonlinear multicommodity flow problems with convex costs. A decomposition method is proposed to solve them. The approach applies a potential reduction algorithm to solve the master problem approximately and a column generation technique to define a sequence of primal linear programming problems. Each subproblem consists of finding a minimum cost flow between an origin and a destination node in an uncapacitated network. It is thus formulated as a shortest path problem and solved with Dijkstra’s d-heap algorithm. An implementation is described that takes full advantage of the supersparsity of the network in the linear algebra operations. Computational results show the efficiency of this approach on well-known nondifferentiable problems and also on large-scale randomly generated problems (up to 1000 arcs and 5000 commodities).
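The pricing subproblem in this decomposition is a shortest path between an origin-destination pair in an uncapacitated network under the current arc lengths. A minimal sketch of that step follows, using Python's heapq binary heap rather than the d-heap of the paper; the toy network and arc lengths are illustrative assumptions.

```python
# Shortest-path pricing step for one origin-destination pair (Dijkstra with a binary heap).
import heapq

def dijkstra(adj, source):
    """adj: {node: [(neighbor, length), ...]} with nonnegative arc lengths."""
    dist, prev = {source: 0.0}, {}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                              # stale heap entry
        for v, w in adj.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    return dist, prev

def shortest_path(adj, source, target):
    dist, prev = dijkstra(adj, source)
    path, node = [], target
    while node != source:
        path.append(node)
        node = prev[node]
    return [source] + path[::-1], dist[target]

# Toy network: arc lengths play the role of the master problem's current prices.
adj = {"o": [("a", 1.0), ("b", 4.0)], "a": [("b", 1.5), ("d", 5.0)],
       "b": [("d", 1.0)], "d": []}
print(shortest_path(adj, "o", "d"))               # (['o', 'a', 'b', 'd'], 3.5)
```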
Journal of Optimization Theory and Applications | 1990
Jean-Louis Goffin; Jean-Philippe Vial
The problem studied is that of solving linear programs defined recursively by column generation techniques or cutting plane techniques using, respectively, the primal projective method or the dual projective method.
Mathematical Programming | 1999
Jean-Louis Goffin; Krzysztof C. Kiwiel
We study the subgradient projection method for convex optimization with Brännlund's level control for estimating the optimal value. We establish global convergence in objective values without the additional assumptions employed in the literature.
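A simplified sketch in the spirit of level-controlled subgradient steps: Polyak-type steps toward a target level f_rec − δ, with δ halved whenever the accumulated path length exceeds a budget B. This is an illustrative variant, not the exact rule analyzed in the paper; all names and parameter values are assumptions.

```python
# Simplified level-controlled subgradient method (illustrative variant).
import numpy as np

def level_subgradient(f, subgrad, x0, delta=1.0, B=5.0, iters=500):
    x = np.asarray(x0, dtype=float)
    f_rec, x_rec = f(x), x.copy()              # record (best) value found so far
    path = 0.0                                  # path length since the last record/reset
    for _ in range(iters):
        level = f_rec - delta                   # current target level
        g = subgrad(x)
        gg = g @ g
        if gg == 0:
            return x, f(x)                      # zero subgradient: x is optimal
        step = (f(x) - level) / gg              # Polyak-type step toward the level
        x = x - step * g
        path += abs(step) * np.sqrt(gg)
        fx = f(x)
        if fx <= f_rec - delta / 2:             # sufficient decrease: reset the budget
            f_rec, x_rec, path = fx, x.copy(), 0.0
        elif path > B:                          # budget exceeded: the level was too optimistic
            delta, path = delta / 2, 0.0
    return x_rec, f_rec

# Example: minimize f(x) = ||x||_1 starting far from the optimum at the origin.
f = lambda x: np.abs(x).sum()
subgrad = lambda x: np.sign(x)
print(level_subgradient(f, subgrad, x0=[5.0, -7.0, 3.0]))
```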
Siam Journal on Optimization | 2000
Jean-Louis Goffin; Jean-Philippe Vial
We analyze the multiple cut generation scheme in the analytic center cutting plane method. We propose an optimal primal and dual updating direction when the cuts are central. The direction is optimal in the sense that it maximizes the product of the new dual slacks and of the new primal variables within the trust regions defined by Dikin's primal and dual ellipsoids. The new primal and dual directions use the variance-covariance matrix of the normals to the new cuts in the metric given by Dikin's ellipsoid. We prove that the recovery of a new analytic center from the optimal restoration direction can be done in O(p log(p + 1)) damped Newton steps, where p is the number of new cuts added by the oracle, which may vary with the iteration. The results and the proofs are independent of the specific scaling matrix---primal, dual, or primal-dual---that is used in the computations. The computation of the optimal direction uses Newton's method applied to a self-concordant function of p variables. The convergence result of [Ye, Math. Programming, 78 (1997), pp. 85--104] holds here also: the algorithm stops after O*(p̄²n²/ε²) cutting planes have been generated, where p̄ is the maximum number of cuts added at any one iteration.
Mathematical Programming | 2004
Samir Elhedhli; Jean-Louis Goffin
Mathematical Programming | 1999
Jean-Louis Goffin; Jean-Philippe Vial