Publications

Featured research published by Andreas Wächter.


Mathematical Programming | 2006

On the implementation of an interior-point filter line-search algorithm for large-scale nonlinear programming

Andreas Wächter; Lorenz T. Biegler

Abstract. We present a primal-dual interior-point algorithm with a filter line-search method for nonlinear programming. Local and global convergence properties of this method were analyzed in previous work. Here we provide a comprehensive description of the algorithm, including the feasibility restoration phase for the filter method, second-order corrections, and inertia correction of the KKT matrix. Heuristics that allow faster performance are also considered. This method has been implemented in the IPOPT code, which we demonstrate in a detailed numerical study based on 954 problems from the CUTEr test set. An evaluation is made of several line-search options, and a comparison is provided with two state-of-the-art interior-point codes for nonlinear programming.
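The filter acceptance test at the heart of the line search can be sketched as follows. The margin constants and the entry management below are illustrative simplifications, not IPOPT's actual parameters or code: a trial point is acceptable if it sufficiently improves either the constraint violation or the objective relative to every point stored in the filter.

```python
# Minimal sketch of a filter acceptance test for a filter line-search
# method. Constants and margins are illustrative, not IPOPT's values.

GAMMA_THETA = 1e-5  # margin on constraint violation
GAMMA_F = 1e-5      # margin on objective value

def acceptable(theta, f, filter_entries):
    """A trial point (theta, f) is acceptable if, for every filter entry
    (theta_j, f_j), it sufficiently reduces either the constraint
    violation theta or the objective f."""
    return all(
        theta <= (1.0 - GAMMA_THETA) * theta_j or f <= f_j - GAMMA_F * theta_j
        for theta_j, f_j in filter_entries
    )

def augment(filter_entries, theta, f):
    """Add (theta, f) to the filter, dropping entries it dominates."""
    kept = [(t, v) for (t, v) in filter_entries
            if not (theta <= t and f <= v)]
    kept.append((theta, f))
    return kept
```

In the full algorithm this test replaces the sufficient-decrease test on a merit function; additional switching conditions decide when a plain Armijo condition on the objective is required instead.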


Discrete Optimization | 2008

An algorithmic framework for convex mixed integer nonlinear programs

Pierre Bonami; Lorenz T. Biegler; Andrew R. Conn; Gérard Cornuéjols; Ignacio E. Grossmann; Carl D. Laird; Jon Lee; Andrea Lodi; François Margot; Nicolas W. Sawaya; Andreas Wächter

This paper is motivated by the fact that mixed integer nonlinear programming is an important and difficult area for which there is a need for developing new methods and software for solving large-scale problems. Moreover, both fundamental building blocks, namely mixed integer linear programming and nonlinear programming, have seen considerable and steady progress in recent years. Wishing to exploit expertise in these areas as well as on previous work in mixed integer nonlinear programming, this work represents the first step in an ongoing and ambitious project within an open-source environment. COIN-OR is our chosen environment for the development of the optimization software. A class of hybrid algorithms, of which branch-and-bound and polyhedral outer approximation are the two extreme cases, are proposed and implemented. Computational results that demonstrate the effectiveness of this framework are reported. Both the library of mixed integer nonlinear problems that exhibit convex continuous relaxations, on which the experiments are carried out, and a version of the software used are publicly available.
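The branch-and-bound extreme of this framework can be sketched on a toy convex problem. The objective, bounds, and relaxation solver below are illustrative stand-ins: because the toy objective is separable, each box-constrained relaxation is solved exactly by clipping the unconstrained minimizer to the bounds (the actual framework solves general convex NLP relaxations).

```python
# Hedged sketch of NLP-based branch-and-bound for a convex MINLP.
# Toy objective: (x - 1.3)^2 + (y - 2.6)^2 with x, y integer in [0, 4].
import math

CENTER = (1.3, 2.6)  # unconstrained minimizer of the toy objective

def solve_relaxation(bounds):
    """Exact box-constrained minimizer of the separable quadratic:
    clip each coordinate of CENTER to its interval."""
    x = [min(max(c, lo), hi) for c, (lo, hi) in zip(CENTER, bounds)]
    return sum((xi - c) ** 2 for xi, c in zip(x, CENTER)), x

def branch_and_bound(bounds, tol=1e-9):
    best_val, best_x = math.inf, None
    stack = [list(bounds)]
    while stack:
        node = stack.pop()
        val, x = solve_relaxation(node)
        if val >= best_val:                      # prune by bound
            continue
        frac = [i for i, xi in enumerate(x) if abs(xi - round(xi)) > tol]
        if not frac:                             # integer feasible point
            best_val, best_x = val, [int(round(xi)) for xi in x]
            continue
        i = frac[0]                              # branch on a fractional var
        lo, hi = node[i]
        down, up = list(node), list(node)
        down[i] = (lo, math.floor(x[i]))
        up[i] = (math.ceil(x[i]), hi)
        stack.extend([down, up])
    return best_val, best_x
```

The hybrid algorithms in the paper interpolate between this pure tree search and polyhedral outer approximation, which instead accumulates linearizations of the nonlinear functions in a master MILP.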


Optimization Methods & Software | 2009

Branching and bounds tightening techniques for non-convex MINLP

Pietro Belotti; Jon Lee; Leo Liberti; François Margot; Andreas Wächter

Many industrial problems can be naturally formulated using mixed integer non-linear programming (MINLP) models and can be solved by spatial Branch&Bound (sBB) techniques. We study the impact of two important parts of sBB methods: bounds tightening (BT) and branching strategies. We extend a branching technique originally developed for MILP, reliability branching, to the MINLP case. Motivated by the demand for open-source solvers for real-world MINLP problems, we have developed an sBB software package named couenne (Convex Over- and Under-ENvelopes for Non-linear Estimation) and used it for extensive tests on several combinations of BT and branching techniques on a set of publicly available and real-world MINLP instances. We also compare the performance of couenne with a state-of-the-art MINLP solver.
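A minimal sketch of feasibility-based bounds tightening for a single linear constraint follows; couenne additionally propagates intervals through nonlinear expression trees, which this toy routine does not attempt. Each variable's bound is tightened using the minimum possible activity of the remaining terms.

```python
# Illustrative feasibility-based bounds tightening (FBBT) for a linear
# constraint sum(a_i * x_i) <= b over box-bounded variables.

def fbbt_linear(a, bounds, b):
    """Return tightened (lo, hi) bounds implied by sum(a_i * x_i) <= b."""
    def term_min(ai, lo, hi):
        # smallest value the term a_i * x_i can take on [lo, hi]
        return ai * lo if ai >= 0 else ai * hi

    total_min = sum(term_min(ai, lo, hi) for ai, (lo, hi) in zip(a, bounds))
    new_bounds = []
    for ai, (lo, hi) in zip(a, bounds):
        rest = total_min - term_min(ai, lo, hi)  # min activity of the others
        if ai > 0:
            hi = min(hi, (b - rest) / ai)
        elif ai < 0:
            lo = max(lo, (b - rest) / ai)
        new_bounds.append((lo, hi))
    return new_bounds
```

For example, with x + y <= 4 and both variables in [0, 10], FBBT shrinks both upper bounds to 4; iterating such passes over all constraints until a fixed point is the basic BT loop that sBB codes run at each node.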


Chemical Engineering Science | 2002

Advances in simultaneous strategies for dynamic process optimization

Lorenz T. Biegler; Arturo M. Cervantes; Andreas Wächter

Abstract. Following on the popularity of dynamic simulation for process systems, dynamic optimization has been identified as an important task for key process applications. In this study, we present an improved algorithm for simultaneous strategies for dynamic optimization. This approach addresses two important issues for dynamic optimization. First, an improved nonlinear programming strategy is developed based on interior point methods. This approach incorporates a novel filter-based line search method as well as a preconditioned conjugate gradient method for computing search directions for control variables. This leads to a significant gain in algorithmic performance. On a dynamic optimization case study, we show that nonlinear programs (NLPs) with over 800,000 variables can be solved in less than 67 CPU minutes. Second, we address the problem of moving finite elements through an extension of the interior point strategy. With this strategy we develop a reliable and efficient algorithm to adjust elements to track optimal control profile breakpoints and to ensure accurate state and control profiles. This is demonstrated on a dynamic optimization problem for two distillation columns. Finally, these algorithmic improvements allow us to consider a broader set of problem formulations that require dynamic optimization methods. These topics and future trends are outlined in the last section.
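The simultaneous (discretize-then-optimize) idea can be sketched on a toy minimum-energy control problem: both states and controls become NLP variables, and the discretized dynamics enter as equality constraints. The explicit Euler discretization, grid size, and use of SciPy's SLSQP solver are illustrative assumptions, far simpler than the orthogonal collocation and interior-point machinery used in the paper.

```python
# Toy simultaneous transcription: minimize the integral of u^2 subject to
# x' = u, x(0) = 0, x(1) = 1, discretized with explicit Euler.
import numpy as np
from scipy.optimize import minimize

N, h = 20, 1.0 / 20                    # time grid

def unpack(z):
    # decision vector z = [x_0 .. x_N, u_0 .. u_{N-1}]
    return z[:N + 1], z[N + 1:]

def objective(z):                      # rectangle-rule integral of u^2
    _, u = unpack(z)
    return h * np.sum(u ** 2)

def dynamics_residual(z):              # x_{k+1} = x_k + h * u_k for all k
    x, u = unpack(z)
    return x[1:] - x[:-1] - h * u

cons = [{"type": "eq", "fun": dynamics_residual},
        {"type": "eq", "fun": lambda z: unpack(z)[0][0]},         # x(0) = 0
        {"type": "eq", "fun": lambda z: unpack(z)[0][-1] - 1.0}]  # x(1) = 1

res = minimize(objective, np.zeros(2 * N + 1), constraints=cons,
               method="SLSQP")
# For this problem the minimum-energy control is constant, u(t) = 1,
# with objective value 1.
```

The point of the simultaneous formulation is that the NLP solver sees all discretized states and controls at once, which is exactly what makes the interior-point improvements in the paper pay off at large scale.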


Siam Journal on Optimization | 2005

Line Search Filter Methods for Nonlinear Programming: Motivation and Global Convergence

Andreas Wächter; Lorenz T. Biegler

Line search methods are proposed for nonlinear programming using Fletcher and Leyffer's filter method [Math. Program., 91 (2002), pp. 239--269], which replaces the traditional merit function. Their global convergence properties are analyzed. The presented framework is applied to active set sequential quadratic programming (SQP) and barrier interior point algorithms. Under mild assumptions it is shown that every limit point of the sequence of iterates generated by the algorithm is feasible, and that there exists at least one limit point that is a stationary point for the problem under consideration. A new alternative filter approach employing the Lagrangian function instead of the objective function, with identical global convergence properties, is briefly discussed.


Siam Journal on Optimization | 2002

Global Convergence of a Trust-Region SQP-Filter Algorithm for General Nonlinear Programming

Roger Fletcher; Nicholas I. M. Gould; Sven Leyffer; Philippe L. Toint; Andreas Wächter

A trust-region SQP-filter algorithm of the type introduced by Fletcher and Leyffer [Math. Program., 91 (2002), pp. 239--269] that decomposes the step into its normal and tangential components allows for an approximate solution of the quadratic subproblem and incorporates the safeguarding tests described in Fletcher, Leyffer, and Toint [On the Global Convergence of an SLP-Filter Algorithm, Technical Report 98/13, Department of Mathematics, University of Namur, Namur, Belgium, 1998; On the Global Convergence of a Filter-SQP Algorithm, Technical Report 00/15, Department of Mathematics, University of Namur, Namur, Belgium, 2000] is considered. It is proved that, under reasonable conditions and for every possible choice of the starting point, the sequence of iterates has at least one first-order critical accumulation point.


Siam Journal on Optimization | 2005

Line Search Filter Methods for Nonlinear Programming: Local Convergence

Andreas Wächter; Lorenz T. Biegler

A line search method is proposed for nonlinear programming using Fletcher and Leyffer's filter method, which replaces the traditional merit function. A simple modification of the method proposed in a companion paper [SIAM J. Optim., 16 (2005), pp. 1--31], introducing second order correction steps, is presented. It is shown that the proposed method does not suffer from the Maratos effect, so that fast local convergence to second order sufficient local solutions is achieved.


Computational Optimization and Applications | 2007

Matching-based preprocessing algorithms to the solution of saddle-point problems in large-scale nonconvex interior-point optimization

Olaf Schenk; Andreas Wächter; Michael Hagemann

Abstract. Interior-point methods are among the most efficient approaches for solving large-scale nonlinear programming problems. At the core of these methods, highly ill-conditioned symmetric saddle-point problems have to be solved. We present combinatorial methods to preprocess these matrices in order to establish more favorable numerical properties for the subsequent factorization. Our approach is based on symmetric weighted matchings and is used in a sparse direct LDL^T factorization method where the pivoting is restricted to static supernode data structures. In addition, we will dynamically expand the supernode data structure in cases where additional fill-in helps to select better numerical pivot elements. This technique can be seen as an alternative to the more traditional threshold pivoting techniques. We demonstrate the competitiveness of this approach within an interior-point method on a large set of test problems from the CUTE and COPS sets, as well as large optimal control problems based on partial differential equations. The largest nonlinear optimization problem solved has more than 12 million variables and 6 million constraints.
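A small sketch of the saddle-point structure involved, assuming illustrative random data: with a positive definite Hessian block and a full-rank Jacobian block, the KKT matrix K = [[H, Aᵀ], [A, 0]] is nonsingular but strongly indefinite, with inertia (n, m, 0). It is this indefiniteness that makes pivot selection in a sparse LDLᵀ factorization delicate and motivates the matching-based preprocessing.

```python
# Build a small symmetric saddle-point (KKT) matrix and verify its
# inertia numerically. The matrix data here are illustrative.
import numpy as np

n, m = 4, 2
rng = np.random.default_rng(0)
H = np.diag(rng.uniform(1.0, 2.0, n))          # positive definite Hessian block
A = rng.standard_normal((m, n))                # full-rank Jacobian block
K = np.block([[H, A.T], [A, np.zeros((m, m))]])

eigs = np.linalg.eigvalsh(K)
inertia = (int(np.sum(eigs > 0)), int(np.sum(eigs < 0)))
# With H positive definite and A of full row rank, inertia == (n, m):
# n positive and m negative eigenvalues, none zero.
```

Interior-point codes check exactly this inertia from the pivots of the LDLᵀ factorization; a wrong inertia signals that the Hessian must be regularized before the step can be trusted.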


Computers & Chemical Engineering | 2000

A reduced space interior point strategy for optimization of differential algebraic systems

Arturo M. Cervantes; Andreas Wächter; Reha H. Tütüncü; Lorenz T. Biegler

Abstract. A novel nonlinear programming (NLP) strategy is developed and applied to the optimization of differential algebraic equation (DAE) systems. Such problems, also referred to as dynamic optimization problems, are common in process engineering and remain challenging applications of nonlinear programming. These applications often consist of large, complex nonlinear models that result from discretizations of DAEs. Variables in the NLP include state and control variables, with far fewer control variables than states. Moreover, all of these discretized variables have associated upper and lower bounds that can be potentially active. To deal with this large, highly constrained problem, an interior point NLP strategy is developed. Here a log barrier function is used to deal with the large number of bound constraints in order to transform the problem to an equality constrained NLP. A modified Newton method is then applied directly to this problem. In addition, this method uses an efficient decomposition of the discretized DAEs, and the solution of the Newton step is performed in the reduced space of the independent variables. The resulting approach exploits many of the features of the DAE system and is performed element by element in a forward manner. Several large dynamic process optimization problems are considered to demonstrate the effectiveness of this approach; these include complex separation and reaction processes (including reactive distillation) with several hundred DAEs. NLP formulations with over 55,000 variables are considered. These problems are solved in 5–12 CPU minutes on small workstations.
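The log-barrier transformation can be illustrated on a one-variable problem; the example, the Newton damping, and the barrier schedule below are illustrative choices, not the paper's reduced-space algorithm. The bound constraint x ≥ 0 is replaced by a sequence of unconstrained problems with a shrinking barrier term.

```python
# min (x+1)^2  s.t. x >= 0  via the log barrier:
# solve min (x+1)^2 - mu*log(x) for a decreasing sequence of mu.
import math

def barrier_min(mu, x=1.0, iters=50):
    """Damped Newton's method on phi(x) = (x+1)^2 - mu*log(x)."""
    for _ in range(iters):
        grad = 2.0 * (x + 1.0) - mu / x
        hess = 2.0 + mu / x ** 2
        step = grad / hess
        while x - step <= 0.0:        # keep the iterate strictly positive
            step *= 0.5
        x -= step
    return x

mu, x = 1.0, 1.0
path = []
while mu > 1e-8:
    x = barrier_min(mu, x)            # warm-start from the previous solution
    path.append(x)
    mu *= 0.1                         # shrink the barrier parameter
# The barrier minimizers trace the central path toward the constrained
# solution x* = 0, where the bound x >= 0 is active.
```

The warm start from the previous barrier solution is the essential trick: each subproblem is easy because its starting point already lies near the central path.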


Mathematical Programming | 2000

Failure of global convergence for a class of interior point methods for nonlinear programming

Andreas Wächter; Lorenz T. Biegler

Abstract. Using a simple analytical example, we demonstrate that a class of interior point methods for general nonlinear programming, including some current methods, is not globally convergent. It is shown that those algorithms produce limit points that are neither feasible nor stationary points of some measure of the constraint violation, when applied to a well-posed problem.

Collaboration

Dive into Andreas Wächter's collaborations.

Top Co-Authors

Lorenz T. Biegler (Carnegie Mellon University)
Jon Lee (University of Michigan)
Pierre Bonami (Aix-Marseille University)
François Margot (Carnegie Mellon University)