M. El Ghami
University of Bergen
Publications
Featured research published by M. El Ghami.
SIAM Journal on Optimization | 2005
Yanqin Bai; M. El Ghami; C. Roos
Recently, so-called self-regular barrier functions for primal-dual interior-point methods (IPMs) for linear optimization were introduced. Each such barrier function is determined by its (univariate) self-regular kernel function. We introduce a new class of kernel functions. The class is defined by some simple conditions on the kernel function and its derivatives. These properties enable us to derive many new and tight estimates that greatly simplify the analysis of IPMs based on these kernel functions. In both the algorithm and its analysis we use a single neighborhood of the central path; the neighborhood naturally depends on the kernel function. An important conclusion is that inverse functions of suitable restrictions of the kernel function and its first derivative more or less determine the behavior of the corresponding IPMs. Based on the new estimates we present a simple and unified computational scheme for the complexity analysis of kernel functions in the new class. We apply this scheme to seven specific kernel functions. Some of these functions are self-regular, and others are not. One of the functions differs from the others, and from all self-regular functions, in the sense that its growth term is linear. Iteration bounds for both large- and small-update methods are derived. It is shown that small-update methods based on the new kernel functions all have the same complexity as the classical primal-dual IPM, namely O(\sqrt{n}\log\frac{n}{\varepsilon}). For large-update methods the best obtained bound is O(\sqrt{n}(\log n)\log\frac{n}{\varepsilon}), which until now has been the best known bound for such methods.
SIAM Journal on Optimization | 2002
Yanqin Bai; M. El Ghami; C. Roos
Journal of Computational and Applied Mathematics | 2012
M. El Ghami; Z. A. Guennoun; S. Bouali; Trond Steihaug
Optimization Methods & Software | 2002
Yanqin Bai; C. Roos; M. El Ghami
RAIRO-Operations Research | 2008
M. El Ghami; C. Roos
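The two iteration bounds quoted in the abstract above can be compared numerically. A minimal sketch, ignoring constant factors; the O(n log(n/eps)) figure used as the classical large-update baseline is standard background for logarithmic-barrier methods, not taken from the abstract:

```python
import math

# Iteration-bound expressions, up to constant factors.
# n = problem dimension, eps = target accuracy.

def small_update_bound(n, eps):
    # O(sqrt(n) * log(n/eps)) -- small-update complexity from the abstract
    return math.sqrt(n) * math.log(n / eps)

def large_update_bound(n, eps):
    # O(sqrt(n) * (log n) * log(n/eps)) -- best large-update bound quoted above
    return math.sqrt(n) * math.log(n) * math.log(n / eps)

def classical_large_update_bound(n, eps):
    # O(n * log(n/eps)) -- classical large-update baseline (assumed background)
    return n * math.log(n / eps)

n, eps = 10**4, 1e-6
print(small_update_bound(n, eps))            # ~ 2.3e3
print(large_update_bound(n, eps))            # ~ 2.1e4
print(classical_large_update_bound(n, eps))  # ~ 2.3e5
```

For n = 10^4 the quoted large-update bound is roughly an order of magnitude below the classical O(n log(n/eps)) baseline, which is the point of the kernel-function analysis.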
RAIRO-Operations Research | 2009
M. El Ghami; Y. Q. Bai; C. Roos
We introduce a new barrier-type function which is not a barrier function in the usual sense: it has a finite value at the boundary of the feasible region. Despite this, the iteration bound of a large-update interior-point method based on this function is shown to be O(\sqrt{n}(\log n)\log\frac{n}{\varepsilon}), which is as good as the currently best known bound for large-update methods. The recently introduced property of exponential convexity for the kernel function underlying the barrier function, as well as the strong convexity of the kernel function, are crucial in the analysis.
International Symposium on Computers and Communications | 2008
M. El Ghami; I. Ivanov; H. Melissen; C. Roos; Trond Steihaug
Journal of Optimization Theory and Applications | 2008
Y. Q. Bai; Goran Lesaja; C. Roos; G.Q. Wang; M. El Ghami
Journal of Computational and Applied Mathematics | 2009
M. El Ghami; I. Ivanov; J.B.M. Melissen; C. Roos; Trond Steihaug
In this paper, we present a new barrier function for primal-dual interior-point methods in linear optimization. The proposed kernel function has a trigonometric barrier term. It is shown that for large-update methods, interior-point methods based on this function achieve a significantly improved iteration bound. For small-update methods, the iteration bound matches the best currently known bound for primal-dual interior-point methods.
RAIRO-Operations Research | 2010
M. El Ghami; Trond Steihaug
In this article we present a generic primal-dual interior-point algorithm for linear optimization in which the search direction depends on a univariate kernel function that is also used as a proximity measure in the analysis of the algorithm. We present some powerful tools for the analysis of the algorithm under the assumption that the kernel function satisfies three mild, easy-to-check conditions (namely exponential convexity, superconvexity, and monotonicity of the second derivative). The approach is demonstrated by introducing a new kernel function and showing that the corresponding large-update algorithm improves the iteration complexity by a factor of n^{1/4} compared with the classical method, which is based on the logarithmic barrier function.
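Two of these conditions can be illustrated on the classical logarithmic kernel psi(t) = (t^2 - 1)/2 - log t. A minimal numerical sketch, assuming the usual definitions (exponential convexity via psi(sqrt(t1*t2)) <= (psi(t1) + psi(t2))/2, and "monotonicity of the second derivative" taken to mean psi'' is decreasing); this is an illustration with the classical kernel, not the new kernel function introduced in the article:

```python
import math

# Classical logarithmic kernel psi(t) = (t^2 - 1)/2 - log(t), t > 0.
def psi(t):
    return (t * t - 1) / 2 - math.log(t)

# Second derivative, computed by hand: psi''(t) = 1 + 1/t^2.
def psi2(t):
    return 1 + 1 / (t * t)

ts = [0.1 * k for k in range(1, 50)]  # sample grid in (0, 5)

# Exponential convexity: psi(sqrt(t1*t2)) <= (psi(t1) + psi(t2)) / 2.
exp_convex = all(
    psi(math.sqrt(t1 * t2)) <= (psi(t1) + psi(t2)) / 2 + 1e-12
    for t1 in ts for t2 in ts
)

# Monotonicity of the second derivative: psi'' decreasing on the grid.
monotone = all(psi2(a) >= psi2(b) for a, b in zip(ts, ts[1:]))

print(exp_convex, monotone)  # both hold for the logarithmic kernel
```

For this kernel both checks pass, consistent with the fact that the logarithmic barrier satisfies the conditions the generic analysis relies on; a candidate new kernel would be screened the same way before attempting the complexity analysis.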