Pierre Comon
Centre national de la recherche scientifique
Publications
Featured research published by Pierre Comon.
Signal Processing | 1994
Pierre Comon
The independent component analysis (ICA) of a random vector consists of searching for a linear transformation that minimizes the statistical dependence between its components. In order to define suitable search criteria, the expansion of mutual information is utilized as a function of cumulants of increasing orders. An efficient algorithm is proposed, which allows the computation of the ICA of a data matrix within a polynomial time. The concept of ICA may actually be seen as an extension of the principal component analysis (PCA), which can only impose independence up to the second order and, consequently, defines directions that are orthogonal. Potential applications of ICA include data analysis and compression, Bayesian detection, localization of sources, and blind identification and deconvolution.
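As a rough illustration of this contrast-based view (a minimal sketch, not the algorithm proposed in the paper), the snippet below whitens a two-channel mixture with PCA, which only removes dependence up to second order, and then searches for the residual rotation maximizing a fourth-order cumulant (kurtosis) contrast; the source distributions, mixing matrix, and grid search over the rotation angle are choices made for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two independent non-Gaussian sources, linearly mixed (mixing matrix assumed unknown).
n = 5000
S = np.vstack([rng.uniform(-1.0, 1.0, n),      # sub-Gaussian source
               rng.laplace(0.0, 1.0, n)])      # super-Gaussian source
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])
X = A @ S

# PCA / whitening step: removes second-order dependence only.
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(X))
Z = np.diag(d ** -0.5) @ E.T @ X               # whitened observations

def contrast(theta):
    """Sum of squared fourth-order marginal cumulants after a rotation by theta."""
    G = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    Y = G @ Z
    kurt = (Y ** 4).mean(axis=1) - 3.0 * (Y ** 2).mean(axis=1) ** 2
    return np.sum(kurt ** 2)

# ICA step: search the remaining rotation for maximal fourth-order independence.
thetas = np.linspace(0.0, np.pi / 2, 1000)
best = thetas[np.argmax([contrast(t) for t in thetas])]
G = np.array([[np.cos(best), -np.sin(best)],
              [np.sin(best),  np.cos(best)]])
Y = G @ Z                                      # estimated independent components
```

As with any ICA method, the recovered components Y match the original sources only up to permutation and scaling.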
Proceedings of the IEEE | 1990
Pierre Comon; Gene H. Golub
In various applications it is necessary to keep track of a low-rank approximation of a covariance matrix, R(t), that varies slowly with time. It is convenient to track the left singular vectors associated with the largest singular values of the triangular factor, L(t), of its Cholesky factorization; such algorithms are referred to as square-root algorithms. The drawback of the eigenvalue decomposition (EVD) or the singular value decomposition (SVD) is usually the volume of computation they require. Various numerical methods for carrying out this task are surveyed, and it is shown why this heavy computational burden is questionable in numerous situations and should be reconsidered. Indeed, the complexity per eigenpair is generally a quadratic function of the problem size, whereas faster algorithms with linear complexity exist. Finally, in order to choose among the large and loosely defined set of available techniques, comparisons based on computer simulations in a relevant signal processing context are made.
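To make the tracking task concrete, here is a simple baseline sketch (not one of the square-root algorithms surveyed in the paper): the covariance is updated exponentially from each snapshot, and a single orthogonal-iteration (QR) step per update follows its dominant r-dimensional subspace; the function name, forgetting factor, and toy data model are invented for this example. Forming R(t) explicitly makes each update cost O(n^2 r), which is exactly the quadratic burden that the faster linear-complexity trackers avoid.

```python
import numpy as np

def track_dominant_subspace(X, r, beta=0.98):
    """Track an r-dimensional dominant subspace of the exponentially weighted
    covariance R(t) = beta * R(t-1) + x_t x_t^T, using one orthogonal-iteration
    (QR) step per snapshot.  X is a (T, n) array with one snapshot per row;
    returns a list of (n, r) orthonormal bases, one per time step."""
    T, n = X.shape
    R = np.zeros((n, n))
    Q = np.linalg.qr(np.random.default_rng(0).standard_normal((n, r)))[0]
    bases = []
    for x in X:
        R = beta * R + np.outer(x, x)   # rank-one covariance update
        Q, _ = np.linalg.qr(R @ Q)      # one power/QR step toward the top-r eigenvectors
        bases.append(Q.copy())
    return bases

# Toy usage: signals confined to a fixed 2-D subspace of R^8, plus weak noise.
rng = np.random.default_rng(1)
T, n, r = 400, 8, 2
U = np.linalg.qr(rng.standard_normal((n, r)))[0]
X = np.array([U @ (3.0 * rng.standard_normal(r)) + 0.1 * rng.standard_normal(n)
              for _ in range(T)])
bases = track_dominant_subspace(X, r)
```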
SIAM Journal on Matrix Analysis and Applications | 2008
Pierre Comon; Gene H. Golub; Lek-Heng Lim; Bernard Mourrain
A symmetric tensor is a higher order generalization of a symmetric matrix. In this paper, we study various properties of symmetric tensors in relation to a decomposition into a symmetric sum of outer products of vectors. A rank-1 order-k tensor is the outer product of k nonzero vectors. Any symmetric tensor can be decomposed into a linear combination of rank-1 tensors, each of which is symmetric or not. The rank of a symmetric tensor is the minimal number of rank-1 tensors that is necessary to reconstruct it. The symmetric rank is obtained when the constituting rank-1 tensors are imposed to be themselves symmetric. It is shown that rank and symmetric rank are equal in a number of cases and that they always exist in an algebraically closed field. We will discuss the notion of the generic symmetric rank, which, due to the work of Alexander and Hirschowitz [J. Algebraic Geom., 4 (1995), pp. 201-222], is now known for any values of dimension and order. We will also show that the set of symmetric tensors of symmetric rank at most r is not closed unless r = 1.
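As a small numerical companion to these definitions (purely illustrative, not taken from the paper), the sketch below builds a symmetric order-3 tensor as a linear combination of r symmetric rank-1 terms v ⊗ v ⊗ v, so its symmetric rank is at most r, and checks symmetry under all index permutations; the helper names are ad hoc.

```python
import numpy as np
from itertools import permutations

def sym_rank1(v):
    """Symmetric rank-1 order-3 tensor v ⊗ v ⊗ v."""
    return np.einsum('i,j,k->ijk', v, v, v)

def is_symmetric(T, tol=1e-12):
    """True if the order-3 tensor T is invariant under all index permutations."""
    return all(np.allclose(T, np.transpose(T, p), atol=tol)
               for p in permutations(range(3)))

rng = np.random.default_rng(0)
n, r = 4, 3

# T = sum_i lambda_i * v_i ⊗ v_i ⊗ v_i : symmetric, with symmetric rank at most r.
V = rng.standard_normal((r, n))
lam = rng.standard_normal(r)
T = sum(l * sym_rank1(v) for l, v in zip(lam, V))

print(is_symmetric(T))                                 # True
a, b, c = rng.standard_normal((3, n))
print(is_symmetric(np.einsum('i,j,k->ijk', a, b, c)))  # False for a generic rank-1 term
```

The last check illustrates that a generic rank-1 term a ⊗ b ⊗ c is not itself symmetric, which is what makes the distinction between rank and symmetric rank meaningful.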
Signal Processing | 1996
Pierre Comon; Bernard Mourrain
IEEE Transactions on Neural Networks | 2010
Vicente Zarzoso; Pierre Comon
IEEE Signal Processing Letters | 1996
Pierre Comon
SIAM Journal on Matrix Analysis and Applications | 2008
Myriam Rajih; Pierre Comon; Richard Harshman
IEEE Signal Processing Magazine | 2014
Pierre Comon
IEEE Signal Processing Magazine | 2008
Amar Kachenoura; Laurent Albera; Lotfi Senhadji; Pierre Comon
IEEE Transactions on Signal Processing | 2005
Pascal Chevalier; Laurent Albera; Anne Ferreol; Pierre Comon