
Publication


Featured research published by Andreas Lundell.


Journal of Global Optimization | 2009

Some transformation techniques with applications in global optimization

Andreas Lundell; Joakim Westerlund; Tapio Westerlund

In this paper some transformation techniques, based on power transformations, are discussed. The techniques can be applied to solve optimization problems including signomial functions to global optimality. Signomial terms can always be convexified and underestimated using power transformations on the individual variables in the terms. However, often not all variables need to be transformed. A method for minimizing the number of original variables involved in the transformations is, therefore, presented. In order to illustrate how the given method can be integrated into the transformation framework, some mixed integer optimization problems including signomial functions are finally solved to global optimality using the given techniques.
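As a minimal numeric sketch of why such power transformations work (this example is invented for illustration, not taken from the paper): a positive signomial term with positive exponents becomes the exponential of an affine function, hence convex, after the substitution x_i = e^{X_i}.

```python
import math
import random

# Toy signomial term s(x, y) = 2 * x^1.5 * y^0.5 (positive term, nonconvex in x, y).
def s(x, y):
    return 2.0 * x**1.5 * y**0.5

# Exponential transformation x = e^X, y = e^Y turns the term into
# S(X, Y) = 2 * exp(1.5*X + 0.5*Y), the exponential of an affine function,
# which is convex.
def S(X, Y):
    return 2.0 * math.exp(1.5 * X + 0.5 * Y)

# Numerical midpoint-convexity check: S((a+b)/2) <= (S(a) + S(b)) / 2.
random.seed(0)
for _ in range(1000):
    a = (random.uniform(-2, 2), random.uniform(-2, 2))
    b = (random.uniform(-2, 2), random.uniform(-2, 2))
    mid = ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)
    assert S(*mid) <= (S(*a) + S(*b)) / 2 + 1e-9
print("exponential transformation yields a convex term")
```

The original term itself fails this check (e.g. its second derivative in y is negative), which is exactly why the transformation is needed.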


Optimization Methods & Software | 2009

Convex underestimation strategies for signomial functions

Andreas Lundell; Tapio Westerlund

Different types of underestimation strategies are used in deterministic global optimization. In this paper, convexification and underestimation techniques applicable to problems containing signomial functions are studied. In particular, the power transformation and the exponential transformation (ET) are considered in greater detail, and some new theoretical results regarding the relation between the negative power transformation and the ET are given. The techniques are, furthermore, illustrated through examples and compared with other underestimation methods used in global optimization solvers such as αBB and BARON.


Journal of Global Optimization | 2013

A reformulation framework for global optimization

Andreas Lundell; Anders Skjäl; Tapio Westerlund

In this paper, we present a global optimization method for solving nonconvex mixed integer nonlinear programming (MINLP) problems. A convex overestimation of the feasible region is obtained by replacing the nonconvex constraint functions with convex underestimators. For signomial functions, single-variable power and exponential transformations are used to obtain the convex underestimators. For more general nonconvex functions, two versions of the so-called αBB underestimator, valid for twice-differentiable functions, are integrated in the actual reformulation framework. However, in contrast to what is done in branch-and-bound type algorithms, no direct branching is performed in the actual algorithm. Instead, a piecewise convex reformulation is used to convexify the entire problem in an extended variable space, and the reformulated problem is then solved by a convex MINLP solver. As the piecewise linear approximations are made finer, the solutions to the convexified and overestimated problems form a sequence converging to a global optimal solution. The result is an easily implementable algorithm for solving a very general class of optimization problems.
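The αBB underestimator mentioned above can be sketched in one variable. This toy example (f = sin on [0, 2π], chosen for the demo, not taken from the paper) uses the classical form L(x) = f(x) + α(l − x)(u − x) with α ≥ max(0, −min f″/2), and checks numerically that L is convex and lies below f.

```python
import math

# Classical αBB underestimator for a twice-differentiable f on [l, u]:
#   L(x) = f(x) + alpha * (l - x) * (u - x),  alpha >= max(0, -min f'' / 2).
l, u = 0.0, 2.0 * math.pi
f = math.sin                    # f''(x) = -sin(x) >= -1 on [l, u]
alpha = 0.5                     # -(-1)/2 = 0.5 is a sufficient alpha

def L(x):
    return f(x) + alpha * (l - x) * (u - x)

xs = [l + i * (u - l) / 400 for i in range(401)]
# L underestimates f on [l, u] since (l - x)(u - x) <= 0 there.
assert all(L(x) <= f(x) + 1e-12 for x in xs)
# L is convex: second-order central differences are nonnegative
# (here L''(x) = 1 - sin(x) >= 0).
h = (u - l) / 400
assert all(L(x - h) - 2 * L(x) + L(x + h) >= -1e-10 for x in xs[1:-1])
print("alphaBB underestimator is convex and below f")
```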


Archive | 2012

Global Optimization of Mixed-Integer Signomial Programming Problems

Andreas Lundell; Tapio Westerlund

This chapter describes a global optimization algorithm for mixed-integer nonlinear programming problems containing signomial functions. The method obtains a convex relaxation of the nonconvex problem through reformulations using single-variable transformations in combination with piecewise linear approximations of the inverse transformations. The solutions of the relaxed problems converge to the global optimal solution as the piecewise linear approximations are improved iteratively. To illustrate how the algorithm can be used to solve problems to global optimality, a numerical example is also included.
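The role of the piecewise linear approximations can be illustrated with a small sketch (a toy example, not the chapter's algorithm): approximating the inverse transformation X = ln x by chords between breakpoints underestimates it, since ln is concave, and the approximation error shrinks as breakpoints are added.

```python
import math

# Piecewise linear interpolation of the inverse transformation X = ln(x)
# on [1, e] with K equal-width segments; refining K tightens the approximation.
def pwl_ln(x, K):
    xs = [1.0 + i * (math.e - 1.0) / K for i in range(K + 1)]
    xs[-1] = math.e  # guard against floating-point rounding at the endpoint
    for x0, x1 in zip(xs, xs[1:]):
        if x0 <= x <= x1:
            t = (x - x0) / (x1 - x0)
            return (1 - t) * math.log(x0) + t * math.log(x1)
    raise ValueError("x outside [1, e]")

def max_error(K):
    grid = [1.0 + i * (math.e - 1.0) / 1000 for i in range(1000)] + [math.e]
    return max(math.log(x) - pwl_ln(x, K) for x in grid)

errs = [max_error(K) for K in (1, 2, 4, 8, 16)]
# Since ln is concave, chords lie below it, and the error shrinks as K grows.
assert all(e >= 0 for e in errs)
assert all(e1 > e2 for e1, e2 in zip(errs, errs[1:]))
print([round(e, 5) for e in errs])
```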


Journal of Global Optimization | 2016

The extended supporting hyperplane algorithm for convex mixed-integer nonlinear programming

Jan Kronqvist; Andreas Lundell; Tapio Westerlund

A new deterministic algorithm for solving convex mixed-integer nonlinear programming (MINLP) problems is presented in this paper: the extended supporting hyperplane (ESH) algorithm uses supporting hyperplanes to generate a tight overestimated polyhedral set of the feasible set defined by linear and nonlinear constraints. A sequence of linear or quadratic integer-relaxed subproblems is first solved to rapidly generate a tight linear relaxation of the original MINLP problem. After an initial overestimated set has been obtained, the algorithm solves a sequence of mixed-integer linear programming or mixed-integer quadratic programming subproblems and refines the overestimated set by generating more supporting hyperplanes in each iteration. Compared to the extended cutting plane algorithm, ESH generates a tighter overestimated set, and unlike outer approximation, the generation point for the supporting hyperplanes is found by a simple line search procedure. In this paper it is proven that the ESH algorithm converges to a global optimum for convex MINLP problems. The ESH algorithm is implemented as the Supporting Hyperplane Optimization Toolkit (SHOT) solver, and an extensive numerical comparison of its performance against other state-of-the-art MINLP solvers is presented.
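The line search step can be sketched as follows (the constraint g(x, y) = x² + y² − 1 ≤ 0, interior point, and trial point are invented for the demo): a bisection along the segment between an interior point and an infeasible trial point locates a boundary point, where a supporting hyperplane is generated.

```python
# Sketch of the ESH root search: find the boundary point on the segment
# between an interior point and an infeasible trial point, then generate
# a supporting hyperplane there. Toy constraint: g(x, y) = x^2 + y^2 - 1 <= 0.
def g(p):
    return p[0] ** 2 + p[1] ** 2 - 1.0

def grad_g(p):
    return (2.0 * p[0], 2.0 * p[1])

interior = (0.0, 0.0)   # strictly feasible: g(interior) < 0
trial = (2.0, 1.0)      # infeasible trial solution: g(trial) > 0

def on_segment(t):
    return tuple((1 - t) * a + t * b for a, b in zip(interior, trial))

# Bisection for g(on_segment(t)) = 0.
lo, hi = 0.0, 1.0
for _ in range(60):
    mid = (lo + hi) / 2
    if g(on_segment(mid)) > 0:
        hi = mid
    else:
        lo = mid
boundary = on_segment(lo)

# Supporting hyperplane at the boundary point: grad_g(b) . (x - b) <= 0.
def cut(p):
    gb = grad_g(boundary)
    return sum(gi * (pi - bi) for gi, pi, bi in zip(gb, p, boundary))

assert abs(g(boundary)) < 1e-9
assert cut(trial) > 0        # the cut separates the infeasible trial point
assert cut(interior) < 0     # while keeping the feasible region
print("supporting hyperplane generated at", boundary)
```

Because the cut is generated on the boundary of the nonlinear feasible set rather than at the trial point itself, it supports the set, which is what makes the ESH relaxation tighter than plain cutting planes.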


Computer-aided chemical engineering | 2009

Optimization of Transformations for Convex Relaxations of MINLP Problems Containing Signomial Functions

Andreas Lundell; Tapio Westerlund

In this paper, a method for determining an optimized set of transformations for signomial functions in a nonconvex mixed integer nonlinear programming (MINLP) problem is described. Through the proposed mixed integer linear programming (MILP) problem formulation, a set of single-variable transformations is obtained. By varying the parameters in the MILP problem, different sets of transformations are obtained. Using these transformations and some approximation techniques, a nonconvex MINLP problem can be transformed into a convex overestimated form. Which transformations are used has a direct effect on the combinatorial complexity and approximation quality of these problems, so it is of great importance to find the best possible transformations. Variants of the method have previously been presented in Lundell et al. (2007) and Lundell and Westerlund (2008). Here, the scope of the procedure is extended to also allow for minimization of the number of required transformation variables, as well as to favor transformations with better numerical properties. These improvements can have a significant impact on the computational effort needed when solving the transformed MINLP problems.


integration of ai and or techniques in constraint programming | 2013

Improved Discrete Reformulations for the Quadratic Assignment Problem

Axel Nyberg; Tapio Westerlund; Andreas Lundell

This paper presents an improved as well as a completely new version of a mixed integer linear programming (MILP) formulation for solving the quadratic assignment problem (QAP) to global optimum. Both formulations work especially well on instances where at least one of the matrices is sparse. Modification schemes, to decrease the number of unique elements per row in symmetric instances, are presented as well. The modifications will tighten the presented formulations and considerably shorten the computational times. We solved, for the first time ever to proven optimality, the instance esc32b from the quadratic assignment problem library, QAPLIB.
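For orientation, the objective that the MILP formulations linearize can be stated directly. This brute-force sketch (toy data, not the paper's formulation) enumerates all permutations of a tiny 3×3 instance; exact MILP approaches such as those in the paper are needed to reach instances like esc32b.

```python
from itertools import permutations

# The QAP assigns n facilities to n locations, minimizing
#   sum_{i,j} F[i][j] * D[p[i]][p[j]]
# over all permutations p. Brute force is only viable for tiny n.
F = [[0, 3, 1],   # flow between facilities (invented toy data)
     [3, 0, 2],
     [1, 2, 0]]
D = [[0, 1, 4],   # distances between locations (invented toy data)
     [1, 0, 2],
     [4, 2, 0]]
n = len(F)

def qap_cost(p):
    return sum(F[i][j] * D[p[i]][p[j]] for i in range(n) for j in range(n))

best = min(permutations(range(n)), key=qap_cost)
print(best, qap_cost(best))
```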


Computer-aided chemical engineering | 2009

Implementation of a Convexification Technique for Signomial Functions

Andreas Lundell; Tapio Westerlund

In this paper, an implementation of a global optimization framework for mixed integer nonlinear programming (MINLP) problems containing signomial functions is described. In the implementation, the global optimal solution to a MINLP problem is found by solving a sequence of convex relaxed subproblems overestimating the original problem. The described solver utilizes the General Algebraic Modeling System (GAMS) to solve the subproblems, which in each iteration are made tighter until the global optimal solution is found.


Computer-aided chemical engineering | 2012

Finding an optimized set of transformations for convexifying nonconvex MINLP problems

Andreas Lundell; Tapio Westerlund

In this paper, we describe a method for obtaining sets of transformations for reformulating a mixed integer nonlinear programming (MINLP) problem containing nonconvex twice-differentiable (C2) functions into a convex MINLP problem in an extended variable space. The method for obtaining the transformations is based on solving a mixed integer linear programming (MILP) problem given the structure of the nonconvex MINLP problem. The solution of the MILP problem renders a minimal set of transformations convexifying the nonconvex problem. This technique is implemented as part of the α-signomial global optimization algorithm (αSGO), a global optimization algorithm for nonconvex MINLP problems.
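A much-simplified view of the transformation-selection problem (a set-cover toy, not the actual MILP from the paper) is that each nonconvex term must involve at least one transformed variable, and one wants a minimum-cardinality set of variables to transform. For a tiny instance this can be brute-forced:

```python
from itertools import combinations

# Simplified transformation selection: each nonconvex term must contain at
# least one transformed variable. Finding a minimum-cardinality variable set
# is a set-cover-style problem; the paper formulates a much richer MILP,
# here we brute-force an invented tiny instance.
terms = [{"x1", "x2"}, {"x2", "x3"}, {"x3", "x4"}, {"x1", "x4"}]
variables = sorted(set().union(*terms))

def covers(subset):
    return all(t & subset for t in terms)

best = None
for k in range(len(variables) + 1):
    for cand in combinations(variables, k):
        if covers(set(cand)):
            best = set(cand)   # first cover found at the smallest k
            break
    if best is not None:
        break
print("minimal transformation set:", sorted(best))
```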


Journal of Global Optimization | 2018

Reformulations for utilizing separability when solving convex MINLP problems

Jan Kronqvist; Andreas Lundell; Tapio Westerlund

Several deterministic methods for convex mixed integer nonlinear programming generate a polyhedral approximation of the feasible region, and utilize this approximation to obtain trial solutions. Such methods are, e.g., outer approximation, the extended cutting plane method and the extended supporting hyperplane method. In order to obtain the optimal solution and verify global optimality, these methods often require a quite accurate polyhedral approximation. In case the nonlinear functions are convex and separable to some extent, it is possible to obtain a tighter approximation by using a lifted polyhedral approximation, which can be achieved by reformulating the problem. We prove that under mild assumptions, it is possible to obtain tighter linear approximations for a type of functions referred to as almost additively separable. It is also shown that solvers can, by a simple reformulation, benefit from the tighter approximation, and a numerical comparison demonstrates the potential of the reformulation. The reformulation technique can also be combined with other known transformations to make it applicable to some nonseparable convex functions. By using a power transform and a logarithmic transform, the reformulation technique can, for example, be applied to p-norms and some convex signomial functions, and the benefits of combining these transforms with the reformulation technique are illustrated with some numerical examples.
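The effect of a lifted polyhedral approximation can be demonstrated on a tiny separable example (invented for the demo, not from the paper): with tangent cuts generated at two points, letting each term carry its own epigraph variable yields a strictly tighter lower bound than cuts on the aggregated function.

```python
import math

# Lifted vs. aggregated polyhedral approximation of the separable convex
# function f(x) = e^x + e^(-x), with tangent cuts generated at x = -1 and 1.
def cut(h, dh, a, x):        # first-order (tangent) cut of h at point a
    return h(a) + dh(a) * (x - a)

f1, df1 = math.exp, math.exp
f2, df2 = (lambda x: math.exp(-x)), (lambda x: -math.exp(-x))
points = [-1.0, 1.0]
x = 0.0

# Aggregated: cuts on f itself; the polyhedral bound is the max over cuts.
agg = max(cut(lambda z: f1(z) + f2(z),
              lambda z: df1(z) + df2(z), a, x) for a in points)

# Lifted: a separate epigraph variable per term lets each term pick its own
# best cut, so cuts from different generation points combine.
lifted = (max(cut(f1, df1, a, x) for a in points)
          + max(cut(f2, df2, a, x) for a in points))

true_val = f1(x) + f2(x)
assert agg < lifted < true_val   # the lifted approximation is tighter
print(round(agg, 4), round(lifted, 4), round(true_val, 4))
```

Here the aggregated bound at x = 0 is 2/e while the lifted bound is 4/e, illustrating why the lifted reformulation needs fewer cuts for the same accuracy.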

Collaboration


Dive into Andreas Lundell's collaborations.

Top Co-Authors

Axel Nyberg

Åbo Akademi University
