Mariya Ishteva
Vrije Universiteit Brussel
Publications
Featured research published by Mariya Ishteva.
SIAM Journal on Matrix Analysis and Applications | 2011
Mariya Ishteva; Pierre-Antoine Absil; Sabine Van Huffel; Lieven De Lathauwer
Higher-order tensors are used in many application fields, such as statistics, signal processing, and scientific computing. Efficient and reliable algorithms for manipulating these multi-way arrays are thus required. In this paper, we focus on the best rank-(R1,R2,R3) approximation of third-order tensors. We propose a new iterative algorithm based on the trust-region scheme. The tensor approximation problem is expressed as a minimization of a cost function on a product of three Grassmann manifolds. We apply the Riemannian trust-region scheme, using the truncated conjugate-gradient method for solving the trust-region subproblem. Making use of second-order information of the cost function, superlinear convergence is achieved. If the stopping criterion of the subproblem is chosen adequately, the local convergence rate is quadratic. We compare this new method with the well-known higher-order orthogonal iteration method and discuss the advantages over Newton-type methods.
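For readers who want to experiment, below is a minimal NumPy sketch of the higher-order orthogonal iteration (HOOI), the baseline method the paper compares against; it is not the authors' Riemannian trust-region algorithm, and the tensor size, ranks, and iteration count are arbitrary choices.

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding: move `mode` to the front, flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def hooi(T, ranks, n_iter=50):
    """Higher-order orthogonal iteration for a rank-(R1,R2,R3) approximation."""
    # Initialize with the leading left singular vectors of each unfolding (truncated HOSVD).
    U = [np.linalg.svd(unfold(T, n), full_matrices=False)[0][:, :r]
         for n, r in enumerate(ranks)]
    for _ in range(n_iter):
        for n in range(3):
            # Project all modes except n, then update U[n] from the dominant subspace.
            Y = T
            for m in range(3):
                if m != n:
                    Y = np.moveaxis(np.tensordot(U[m].T, Y, axes=(1, m)), 0, m)
            U[n] = np.linalg.svd(unfold(Y, n), full_matrices=False)[0][:, :ranks[n]]
    # Core tensor, then the rank-(R1,R2,R3) approximation.
    G = T
    for m in range(3):
        G = np.moveaxis(np.tensordot(U[m].T, G, axes=(1, m)), 0, m)
    A = G
    for m in range(3):
        A = np.moveaxis(np.tensordot(U[m], A, axes=(1, m)), 0, m)
    return A, U, G

T = np.random.rand(6, 7, 8)
A, U, G = hooi(T, ranks=(2, 2, 2))
print(np.linalg.norm(T - A) / np.linalg.norm(T))  # relative error (large for a random tensor)
```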
Knowledge and Information Systems | 2014
Ramakrishnan Kannan; Mariya Ishteva; Haesun Park
Matrix factorization has been widely utilized as a latent factor model for solving the recommender system problem using collaborative filtering. For a recommender system, all the ratings in the rating matrix are bounded within a predetermined range. In this paper, we propose a new improved matrix factorization approach for such a rating matrix, called Bounded Matrix Factorization (BMF), which imposes a lower and an upper bound on every estimated missing element of the rating matrix. We present an efficient algorithm to solve BMF based on the block coordinate descent method. We show that our algorithm is scalable for large matrices with missing elements on multicore systems with low memory. We present substantial experimental results illustrating that the proposed method outperforms state-of-the-art algorithms for recommender systems, such as stochastic gradient descent, alternating least squares with regularization, SVD++, and Bias-SVD, on real-world datasets such as Jester, MovieLens, Book-Crossing, Online Dating, and Netflix.
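The paper's block coordinate descent algorithm is not reproduced here, but a much-simplified sketch of the core idea, bounding every estimated entry of the rating matrix, might look as follows; the toy ratings, mask, bounds [1, 5], and plain gradient steps are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
lo, hi = 1.0, 5.0                                   # assumed rating bounds
R = rng.integers(1, 6, size=(20, 15)).astype(float) # toy rating matrix
M = rng.random(R.shape) < 0.5                       # mask of observed entries
k, lr = 3, 0.01                                     # latent dimension, step size (arbitrary)

W = rng.random((R.shape[0], k))
H = rng.random((k, R.shape[1]))

for _ in range(2000):
    P = np.clip(W @ H, lo, hi)    # bound every estimated entry, the key idea of BMF
    E = np.where(M, P - R, 0.0)   # residual on observed entries only
    W, H = W - lr * E @ H.T, H - lr * W.T @ E  # simple gradient steps on both factors

P = np.clip(W @ H, lo, hi)
print("observed-entry RMSE:", np.sqrt(((P - R)[M] ** 2).mean()))
```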
Numerical Algorithms | 2009
Mariya Ishteva; Lieven De Lathauwer; Pierre-Antoine Absil; Sabine Van Huffel
An increasing number of applications are based on the manipulation of higher-order tensors. In this paper, we derive a differential-geometric Newton method for computing the best rank-(R1,R2,R3) approximation of a third-order tensor. The generalization to tensors of order higher than three is straightforward. We illustrate the fast quadratic convergence of the algorithm in a neighborhood of the solution and compare it with the known higher-order orthogonal iteration (De Lathauwer et al., SIAM J Matrix Anal Appl 21(4):1324–1342, 2000). Algorithms of this kind are useful for many problems.
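For intuition about the underlying optimization problem, here is a small sketch of the cost function involved: for column-orthonormal factors, maximizing the norm of the projected core tensor is equivalent to minimizing the approximation error, and Newton-type methods optimize this over a product of three Grassmann manifolds. The shapes and ranks below are arbitrary.

```python
import numpy as np

def project(T, Us):
    """Core tensor T x_1 U1^T x_2 U2^T x_3 U3^T for column-orthonormal Us."""
    G = T
    for mode, U in enumerate(Us):
        G = np.moveaxis(np.tensordot(U.T, G, axes=(1, mode)), 0, mode)
    return G

rng = np.random.default_rng(1)
T = rng.random((5, 6, 7))
Us = [np.linalg.qr(rng.random((n, 2)))[0] for n in T.shape]  # random orthonormal bases
G = project(T, Us)
# For orthonormal factors, ||T - approx||^2 = ||T||^2 - ||G||^2, so the best
# subspaces are those maximizing ||G||.
print(np.linalg.norm(G), "<=", np.linalg.norm(T))
```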
SIAM Journal on Matrix Analysis and Applications | 2013
Mariya Ishteva; Pierre-Antoine Absil; Paul Van Dooren
The problem discussed in this paper is the symmetric best low multilinear rank approximation of third-order symmetric tensors. We propose an algorithm based on Jacobi rotations, for which symmetry is preserved at each iteration. Two numerical examples are provided indicating the need for such algorithms. An important part of the paper consists of proving that our algorithm converges to stationary points of the objective function. This can be considered an advantage of the proposed algorithm over existing symmetry-preserving algorithms in the literature.
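The tensor algorithm itself is not reproduced here, but the classical matrix case conveys the flavor: Jacobi rotations applied to a symmetric matrix are orthogonal similarity transformations, so symmetry is preserved at every step. A minimal sketch of cyclic Jacobi sweeps, with an arbitrary test matrix, follows.

```python
import numpy as np

def jacobi_sweeps(A, n_sweeps=10):
    """Cyclic Jacobi rotations diagonalizing a symmetric matrix.
    Each rotation is orthogonal, so symmetry is preserved at every step."""
    A = A.copy()
    n = A.shape[0]
    for _ in range(n_sweeps):
        for p in range(n - 1):
            for q in range(p + 1, n):
                if abs(A[p, q]) < 1e-12:
                    continue
                # Rotation angle that zeroes the (p, q) entry.
                theta = 0.5 * np.arctan2(2 * A[p, q], A[q, q] - A[p, p])
                c, s = np.cos(theta), np.sin(theta)
                J = np.eye(n)
                J[p, p] = J[q, q] = c
                J[p, q], J[q, p] = s, -s
                A = J.T @ A @ J
    return A

A = np.random.rand(5, 5)
A = (A + A.T) / 2                                   # symmetrize
D = jacobi_sweeps(A)
print(np.round(np.sort(np.diag(D)), 6))
print(np.round(np.sort(np.linalg.eigvalsh(A)), 6))  # should match
```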
Neural Computation | 2009
Pierre-Antoine Absil; Mariya Ishteva; Lieven De Lathauwer; Sabine Van Huffel
Newton's method for solving the matrix equation F(X) = AX - XX^T AX = 0 runs up against the fact that its zeros are not isolated. This is due to a symmetry of F by the action of the orthogonal group. We show how differential-geometric techniques can be exploited to remove this symmetry and obtain a geometric Newton algorithm that finds the zeros of F. The geometric Newton method does not suffer from the degeneracy issue that stands in the way of the original Newton method.
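Assuming the standard form of Oja's vector field, F(X) = AX - XX^T AX, a quick numerical check illustrates why the zeros are not isolated: any orthonormal basis X of an invariant subspace of A is a zero, and so is XQ for every orthogonal Q.

```python
import numpy as np

def F(X, A):
    """Oja's vector field: F(X) = A X - X X^T A X."""
    return A @ X - X @ X.T @ A @ X

rng = np.random.default_rng(2)
A = rng.random((6, 6))
A = (A + A.T) / 2                        # symmetric matrix
w, V = np.linalg.eigh(A)
X = V[:, :3]                             # orthonormal basis of an invariant subspace
print(np.linalg.norm(F(X, A)))           # ~0: X is a zero of F

Q = np.linalg.qr(rng.random((3, 3)))[0]  # arbitrary orthogonal matrix
print(np.linalg.norm(F(X @ Q, A)))       # ~0 as well: the zeros are not isolated
```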
SIAM Journal on Matrix Analysis and Applications | 2015
Philippe Dreesen; Mariya Ishteva; Johan Schoukens
We present a method to decompose a set of multivariate real polynomials into linear combinations of univariate polynomials in linear forms of the input variables. The method proceeds by collecting the first-order information of the polynomials in a set of sampling points, which is captured by the Jacobian matrix evaluated at the sampling points. The canonical polyadic decomposition of the three-way tensor of Jacobian matrices directly returns the unknown linear relations, as well as the necessary information to reconstruct the univariate polynomials. The conditions under which this decoupling procedure works are discussed, and the method is illustrated on several numerical examples.
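A hedged sketch of the pipeline, using two toy polynomials that decouple by construction and the parafac routine from the tensorly library (an assumption; any CPD implementation would do):

```python
import numpy as np
from tensorly.decomposition import parafac  # one possible CPD routine

# Two toy polynomials in (x, y) that secretly decouple:
# f1 = (x + 2y)^2, f2 = (x + 2y)^2 + (3x - y)^3, so the linear forms are
# u = x + 2y and v = 3x - y.
def jacobian(x, y):
    u, v = x + 2 * y, 3 * x - y
    return np.array([
        [2 * u,                2 * u * 2],
        [2 * u + 3 * v**2 * 3, 2 * u * 2 - 3 * v**2],
    ])

rng = np.random.default_rng(3)
pts = rng.random((20, 2))                               # sampling points
J = np.stack([jacobian(x, y) for x, y in pts], axis=2)  # 2 x 2 x 20 tensor

# The CPD factors recover (up to column order and scaling) the output mixing
# matrix, the input linear forms, and the derivative values at the points.
weights, (W, V, G) = parafac(J, rank=2)
print(V / V[0])  # columns ~ proportional to [1, 2] and [3, -1]
```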
Recent Advances in Optimization and its Applications in Engineering (invited overview paper) | 2010
Mariya Ishteva; Pierre-Antoine Absil; Sabine Van Huffel; Lieven De Lathauwer
This paper deals with the best low multilinear rank approximation of higher-order tensors. Given a tensor, we are looking for another tensor, as close as possible to the given one and with bounded multilinear rank. Higher-order tensors are used in higher-order statistics, signal processing, telecommunications, and many other fields. In particular, the best low multilinear rank approximation is used as a tool for dimensionality reduction and signal subspace estimation.
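As a concrete instance of such dimensionality reduction, here is a short sketch of the truncated higher-order SVD, one standard (suboptimal but effective) way to obtain a low multilinear rank compression; the sizes and ranks are arbitrary.

```python
import numpy as np

def truncated_hosvd(T, ranks):
    """Truncated higher-order SVD: per-mode dominant subspaces, then the core."""
    Us, G = [], T
    for mode, r in enumerate(ranks):
        # Mode unfolding and its leading left singular vectors.
        M = np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)
        U = np.linalg.svd(M, full_matrices=False)[0][:, :r]
        Us.append(U)
        G = np.moveaxis(np.tensordot(U.T, G, axes=(1, mode)), 0, mode)
    return G, Us

T = np.random.rand(30, 40, 50)
G, Us = truncated_hosvd(T, ranks=(5, 5, 5))
# The compressed representation stores the core plus the three factors:
stored = G.size + sum(U.size for U in Us)
print(f"compression: {stored} numbers instead of {T.size}")
```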
international conference on data mining | 2012
Ramakrishnan Kannan; Mariya Ishteva; Haesun Park
Matrix low rank approximations such as non-negative matrix factorization (NMF) have been successfully used to solve many data mining tasks. In this paper, we propose a new matrix low rank approximation called Bounded Matrix Low Rank Approximation (BMA), which imposes a lower and an upper bound on every element of a lower rank matrix that best approximates a given matrix with missing elements. This new approximation models many real-world problems, such as recommender systems, and performs better than other methods, such as the singular value decomposition (SVD) or NMF. We present an efficient algorithm to solve BMA based on the coordinate descent method. BMA differs from NMF in that it imposes bounds on the approximation itself rather than on each of the low rank factors. We show that our algorithm is scalable for large matrices with missing elements on multicore systems with low memory. We present substantial experimental results illustrating that the proposed method outperforms state-of-the-art algorithms for recommender systems, such as stochastic gradient descent, alternating least squares with regularization, SVD++, and Bias-SVD, on real-world datasets such as Jester, MovieLens, Book-Crossing, Online Dating, and Netflix.
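A tiny illustration of the distinction drawn above, with made-up factors: NMF-style constraints act on the individual factors, whereas BMA bounds the approximation itself.

```python
import numpy as np

rng = np.random.default_rng(4)
W, H = rng.normal(size=(4, 2)), rng.normal(size=(2, 5))
lo, hi = 1.0, 5.0  # assumed bounds

bma_style = np.clip(W @ H, lo, hi)  # BMA: bound the approximation itself
print(bma_style.min() >= lo and bma_style.max() <= hi)  # True

nmf_style = np.abs(W) @ np.abs(H)   # NMF: nonnegative factors, entries >= 0,
print(nmf_style.min(), nmf_style.max())  # but no upper bound is enforced
```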
IFAC Proceedings Volumes | 2014
Maarten Schoukens; Koen Tiels; Mariya Ishteva; Johan Schoukens
SIAM Journal on Matrix Analysis and Applications | 2014
Mariya Ishteva; Konstantin Usevich; Ivan Markovsky