Publications


Featured researches published by Tamás Terlaky.


Discrete Applied Mathematics | 1997

A Monotonic Build-Up Simplex Algorithm for Linear Programming

Joost P. Warners; Tamás Terlaky; C. Roos; Benjamin Jansen

$\sqrt{n}$


Archive | 1993

Some generalizations of the criss-cross method for quadratic programming

Benjamin Jansen; C. Roos; Tamás Terlaky; J.-Ph. Vial


Mathematics of Operations Research | 2010

Central Path Curvature and Iteration-Complexity for Redundant Klee–Minty Cubes

Miguel A. Goberna; Tamás Terlaky; Maxim I. Todorov

$|\log\varepsilon|)$ or $O\!\left((1+M^{2})\,n\,|\log\varepsilon|\right)$, depending on the updating scheme for the lower bound.


Discrete Applied Mathematics | 2008

How good are interior point methods? Klee–Minty cubes tighten iteration-complexity bounds

Tamás Terlaky; Anthony Vannelli; Hu Zhang

Recently the authors introduced the notions of self-regular functions and self-regular proximity functions and used them in the design and analysis of interior-point methods (IPMs) for linear and semidefinite optimization (LO and SDO). In this paper, we consider an extension of these concepts to second-order conic optimization (SOCO). This nontrivial extension requires the development of various new tools. Versatile properties of general analytical functions associated with the second-order cone are exploited. Based on the so-called self-regular proximity functions, new primal-dual Newton methods for solving SOCO problems are proposed. It will be shown that these new large-update IPMs for SOCO enjoy polynomial
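For orientation, here is a minimal sketch of what a self-regular proximity looks like in the simpler LO setting; the notation is illustrative and not the authors' SOCO formulation. Centrality is measured by a kernel $\psi$ applied coordinate-wise to the scaled primal-dual variables, with $p$ the growth degree and $q$ the barrier degree referred to below:

\[
  v_i = \sqrt{\frac{x_i s_i}{\mu}}, \qquad
  \Psi(v) = \sum_{i=1}^{n} \psi(v_i),
\]
\[
  \psi(1) = \psi'(1) = 0, \qquad
  \nu_1\!\left(t^{\,p-1} + t^{-1-q}\right) \;\le\; \psi''(t) \;\le\; \nu_2\!\left(t^{\,p-1} + t^{-1-q}\right) \quad \text{for all } t > 0,
\]

for some constants $\nu_2 \ge \nu_1 > 0$, so $\psi$ grows polynomially (of degree $p+1$) for large $t$ and acts as a barrier as $t \to 0$.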


Linear Algebra and its Applications | 1991

A potential reduction approach to the frequency assignment problem

Emil Klafszky; Tamás Terlaky

$\mathcal{O}\!\left(\max\{p,q\}\, N^{(q+1)/(2q)} \log\frac{N}{\varepsilon}\right)$


Journal of Global Optimization | 2010

Interior-Point Methodology for Linear Programming: Duality, Sensitivity Analysis and Computational Aspects

Tibor Illés; Marianna Nagy; Tamás Terlaky

iteration bounds analogous to those of their LO and SDO cousins, where $N$ is the number of constraining cones and $p, q$ are constants, the so-called growth degree and barrier degree of the corresponding proximity. Our analysis allows us to choose not only a constant $q$ but also a $q$ as large as $\log N$. In this case, our new algorithm has the best known


European Journal of Operational Research | 2001

Sensitivity Analysis in Linear Semi-Infinite Programming via Partitions

Tamás Terlaky

$\mathcal{O}\!\left(N^{1/2}\log N\,\log\frac{N}{\varepsilon}\right)$


Mathematical Programming | 1997

On routing in VLSI design and communication networks

Benjamin Jansen; Kees Roos; Tamás Terlaky; Akiko Yoshise

iteration bound for large-update IPMs.
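To see how the two $\mathcal{O}$-bounds quoted above fit together, substitute $q = \log N$ into the first one (assuming $p \le q$, so that $\max\{p,q\} = \log N$):

\[
  \mathcal{O}\!\left(\max\{p,q\}\, N^{(q+1)/(2q)} \log\frac{N}{\varepsilon}\right)
  = \mathcal{O}\!\left(\log N \cdot N^{1/2}\, N^{1/(2\log N)} \log\frac{N}{\varepsilon}\right)
  = \mathcal{O}\!\left(N^{1/2}\log N\,\log\frac{N}{\varepsilon}\right),
\]

since $N^{1/(2\log N)}$ is bounded by a constant.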


Archive | 1996

The role of pivoting in proving some fundamental theorems of linear algebra

Benjamin Jansen; C. Roos; Tamás Terlaky

In this paper, we propose a method for linear programming with the property that, starting from an initial non-central point, it generates iterates that simultaneously get closer to optimality and closer to centrality. The iterates follow paths that in the limit are tangential to the central path. Together with the convergence analysis, we provide a general framework which enables us to analyze various primal-dual algorithms in the literature in a short and uniform way.
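As a rough illustration of the kind of step such a method takes, here is the generic primal-dual Newton system for LP with a target vector $w$ in place of the usual centering term (a sketch of the standard setting, not necessarily the exact system analyzed in the paper):

\[
\begin{aligned}
  A\,\Delta x &= 0,\\
  A^{\mathsf{T}}\Delta y + \Delta s &= 0,\\
  S\,\Delta x + X\,\Delta s &= w - Xs,
\end{aligned}
\]

where $X = \operatorname{diag}(x)$ and $S = \operatorname{diag}(s)$. The choice $w = \mu e$ recovers the usual central-path step; moving the targets $w$ toward the central path while shrinking them produces iterates that approach optimality and centrality simultaneously.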


Discrete and Computational Geometry | 2009

A polynomial path-following interior point algorithm for general linear complementarity problems

Antoine Deza; Tamás Terlaky; Yuriy Zinchenko

We propose a new class of primal-dual methods for linear optimization (LO). By using some new analysis tools, we prove that the large-update method for LO based on the new search direction has a polynomial complexity of $O\!\left(n^{4/(4+\rho)}\log\frac{n}{\varepsilon}\right)$ iterations, where $\rho \in [0, 2]$ is a parameter used in the system defining the search direction. If $\rho = 0$, our results reproduce the well-known complexity of the standard primal-dual Newton method for LO. At each iteration, our algorithm needs only to solve a linear equation system. An extension of the algorithms to semidefinite optimization is also presented.
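As a quick consistency check on the bound quoted above (reading the exponent as $4/(4+\rho)$): setting $\rho = 0$ gives

\[
  O\!\left(n^{4/(4+0)}\log\frac{n}{\varepsilon}\right) = O\!\left(n\log\frac{n}{\varepsilon}\right),
\]

the well-known iteration bound of the standard large-update primal-dual Newton method that the abstract refers to, while $\rho = 2$ gives $O\!\left(n^{2/3}\log\frac{n}{\varepsilon}\right)$.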
