
Publication


Featured research published by Pando G. Georgiev.


IEEE Transactions on Neural Networks | 2005

Sparse component analysis and blind source separation of underdetermined mixtures

Pando G. Georgiev; Fabian J. Theis; Andrzej Cichocki

In this letter, we solve the problem of identifying matrices S ∈ ℝ^{n×N} and A ∈ ℝ^{m×n} knowing only their product X = AS, under conditions expressed either in terms of A and the sparsity of S (identifiability conditions) or in terms of X (sparse component analysis (SCA) conditions). We present algorithms for such identification and illustrate them by examples.
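
As a rough illustration of the source-recovery half of the problem, here is a minimal numpy sketch, not the authors' published algorithm: assuming A is known and each column of S has at most m - 1 nonzeros (one form of the identifiability conditions), each column of S can be recovered by brute-force support enumeration, which is feasible only for small n. All function names here are ours:

```python
import itertools
import numpy as np

def recover_sparse_column(A, x, max_nonzeros=None, tol=1e-8):
    """Recover s with x = A @ s, assuming s has at most m - 1 nonzeros.

    Brute-force support enumeration: a toy sketch for small n,
    not the identification algorithm of the paper.
    """
    m, n = A.shape
    k = m - 1 if max_nonzeros is None else max_nonzeros
    for size in range(1, k + 1):
        for support in itertools.combinations(range(n), size):
            cols = list(support)
            sub = A[:, cols]                           # m x size submatrix
            s_sub, *_ = np.linalg.lstsq(sub, x, rcond=None)
            if np.linalg.norm(sub @ s_sub - x) < tol:  # exact fit found
                s = np.zeros(n)
                s[cols] = s_sub
                return s
    raise ValueError("no sufficiently sparse representation found")

# Example: 3 sensors, 4 sources; the true column has 2 nonzeros.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))
s_true = np.array([0.0, 1.5, 0.0, -2.0])
print(recover_sparse_column(A, A @ s_true))            # ~ s_true
```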


Set-Valued Analysis | 1996

Second-order subdifferentials of C^{1,1} functions and optimality conditions

Pando G. Georgiev; Nadia Zlateva

We present second-order subdifferentials of Clarke's type for C^{1,1} functions defined on Banach spaces with separable duals. One of them extends the generalized Hessian matrix of such functions on ℝ^n considered by J.-B. Hiriart-Urruty, J.-J. Strodiot and V. H. Nguyen. Various properties of these subdifferentials are proved, and second-order optimality conditions (necessary and sufficient) for constrained minimization problems with C^{1,1} data are obtained.
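
For orientation, the finite-dimensional object being extended is the generalized Hessian of Hiriart-Urruty, Strodiot and Nguyen: for f of class C^{1,1} on ℝ^n (so ∇f is locally Lipschitz and hence, by Rademacher's theorem, twice differentiable almost everywhere), it is the convex hull of limits of nearby true Hessians:

```latex
\partial^{2} f(x) \;=\; \operatorname{co}\Bigl\{\, \lim_{k\to\infty} \nabla^{2} f(x_{k}) \;:\; x_{k} \to x,\ \nabla^{2} f(x_{k}) \text{ exists} \,\Bigr\}.
```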


Archive | 2007

Sparse Component Analysis: A New Tool for Data Mining

Pando G. Georgiev; Fabian J. Theis; Andrzej Cichocki; Hovagim Bakardjian

In many practical data-mining problems the data X under consideration (given as an m × N matrix) has the form X = AS, where the matrices A and S, of dimensions m × n and n × N respectively (often called the mixing matrix or dictionary and the source matrix), are unknown (m ≤ n < N). We formulate conditions (SCA conditions) under which A and S can be recovered uniquely (up to scaling and permutation), where S is sparse in the sense that each column of S has at least one zero element; we call this the Sparse Component Analysis (SCA) problem. We present new algorithms for identification of the mixing matrix (under the SCA conditions) and for source recovery (under the identifiability conditions). The methods are illustrated with examples showing good performance of the algorithms. Typical examples are EEG and fMRI data sets, in which the SCA algorithm allows us to detect some features of the brain signals. Special attention is given to applying the method to the transposed system X^T = S^T A^T, exploiting the sparseness of the mixing matrix A in appropriate situations. We note that the sparseness conditions can be obtained with some preprocessing methods, and no independence conditions are imposed on the source signals (in contrast to Independent Component Analysis). We applied the method to fMRI data sets of dimension 128 × 128 × 98 and to EEG data sets from a 256-channel EEG machine.
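
The transposition device mentioned above rests on the elementary identity

```latex
X = AS \quad\Longleftrightarrow\quad X^{\mathsf{T}} = S^{\mathsf{T}} A^{\mathsf{T}},
```

so running SCA on X^T interchanges the roles of the two factors: sparseness of the mixing matrix A can then be exploited exactly as sparseness of S is in the direct problem.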


Set-Valued Analysis | 1997

Submonotone Mappings in Banach Spaces and Applications

Pando G. Georgiev

The notions of ‘submonotone’ and ‘strictly submonotone’ mapping, introduced by J. Spingarn in ℝ^n, are extended in a natural way to arbitrary Banach spaces. Several results about monotone operators are proved for submonotone and strictly submonotone ones: Rockafellar's result on local boundedness of monotone operators; Kenderov's result on almost-everywhere single-valuedness and upper semicontinuity of monotone operators in Asplund spaces; minimality (as w*-cusco mappings) of maximal strictly submonotone mappings, etc. It is shown that subdifferentials of various classes of nonconvex functions, defined as pointwise suprema of quasi-differentiable functions, possess submonotonicity properties. Results about generic differentiability of such functions are obtained (among them new generalizations of a theorem of Ekeland and Lebourg). Applications are given to the properties of the distance function in a Banach space with a uniformly Gâteaux differentiable norm.
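
For readers unfamiliar with the terminology, one common way to state Spingarn's submonotonicity of a multivalued operator T at a point x₀ (paraphrased here; the paper works with the natural Banach-space extension) is:

```latex
\liminf_{\substack{x \to x_0,\; x \ne x_0 \\ x^{*} \in T(x),\; x_0^{*} \in T(x_0)}}
\frac{\langle x^{*} - x_0^{*},\, x - x_0 \rangle}{\| x - x_0 \|} \;\ge\; 0,
```

with strict submonotonicity requiring the analogous inequality when both base points vary near x₀.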


Information Sciences, Signal Processing and their Applications | 2001

Blind source separation via symmetric eigenvalue decomposition

Pando G. Georgiev; Andrzej Cichocki

We propose a sufficient condition for separation of colored source signals with temporal structure, stating that separation is possible if the source signals have different auto-correlation functions. We show that the problem of blind source separation of uncorrelated colored signals can be converted to a symmetric eigenvalue problem for a special covariance matrix Z(b) = Σ_{i=1}^{L} b(p_i) R_z(p_i), depending on an L-dimensional parameter b, provided this matrix has distinct eigenvalues. We prove that the parameters b for which this holds form an open subset of ℝ^L whose complement has Lebesgue measure zero. A robust orthogonalization of the mixing matrix is used, which is not sensitive to white noise. We propose a new one-step algorithm, based on non-smooth optimization theory, which disperses the eigenvalues of the matrix Z(b), providing sufficient distance between them.
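
For illustration, a minimal numpy sketch of the eigenvalue-decomposition route to separation, in the spirit of classical AMUSE-type methods: it uses a single lag instead of the parametrized combination Z(b) and omits the robust orthogonalization, so it is a baseline sketch rather than the paper's algorithm:

```python
import numpy as np

def bss_via_symmetric_evd(X, lag=1):
    """Separate uncorrelated colored sources from mixtures X = A @ S.

    Whiten, then eigendecompose a symmetrized time-lagged covariance;
    works when the lagged auto-correlations of the sources differ.
    """
    X = X - X.mean(axis=1, keepdims=True)
    # Whitening from the zero-lag covariance.
    R0 = X @ X.T / X.shape[1]
    d, E = np.linalg.eigh(R0)
    W = E @ np.diag(d ** -0.5) @ E.T               # whitening matrix
    Z = W @ X
    # Symmetrized covariance at the chosen lag.
    Rl = Z[:, lag:] @ Z[:, :-lag].T / (Z.shape[1] - lag)
    Rl = 0.5 * (Rl + Rl.T)
    # Distinct eigenvalues => eigenvectors recover the remaining rotation.
    _, U = np.linalg.eigh(Rl)
    return U.T @ Z                                  # sources, up to order/sign

# Example: two AR(1) sources with different poles, randomly mixed.
rng = np.random.default_rng(1)
T = 20000
s = np.zeros((2, T))
for t in range(1, T):
    s[:, t] = np.array([0.9, 0.3]) * s[:, t - 1] + rng.standard_normal(2)
X = rng.standard_normal((2, 2)) @ s
S_hat = bss_via_symmetric_evd(X)
```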


European Journal of Operational Research | 2013

Robust aspects of solutions in deterministic multiple objective linear programming

Pando G. Georgiev; Panos M. Pardalos

We study robustness of linear multiple objective problems in the sense of post-optimal analysis: conditions under which a given efficient solution remains efficient when the criteria/objective matrix undergoes alterations such as the addition or removal of criteria, a convex combination with another criteria matrix, or small perturbations of its entries. We provide a necessary and sufficient condition for robustness in a verifiable form and give two formulae for computing the radius of robustness.
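
For reference, the standard (not paper-specific) notion underlying the discussion: with feasible set P and criteria matrix C, a solution is efficient when no feasible point weakly improves all criteria and strictly improves at least one,

```latex
x \in P \text{ is efficient} \quad\Longleftrightarrow\quad \nexists\, y \in P:\; C y \ge C x,\ \; C y \ne C x .
```

The radius of robustness can then be thought of as the largest perturbation size (in a chosen matrix norm) within which every perturbed criteria matrix leaves x efficient; the precise definition and the two formulae are in the paper.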


Transactions of the American Mathematical Society | 2003

Integration of multivalued operators and cyclic submonotonicity

Aris Daniilidis; Pando G. Georgiev; Jean-Paul Penot

We introduce a notion of cyclic submonotonicity for multivalued operators from a Banach space X to its dual. We show that if the Clarke subdifferential of a locally Lipschitz function is strictly submonotone on an open subset U of X, then it is also maximal cyclically submonotone on U, and, conversely, that every maximal cyclically submonotone operator on U is the Clarke subdifferential of a locally Lipschitz function, which is unique up to a constant if U is connected. In finite dimensions these functions are exactly the lower-C^1 functions considered by Spingarn and Rockafellar.
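
The classical background notion being relaxed here is Rockafellar's cyclic monotonicity: T is cyclically monotone when, for every cycle x₀, x₁, …, x_{n+1} = x₀ and every selection x_i* ∈ T(x_i),

```latex
\sum_{i=0}^{n} \langle x_i^{*},\, x_{i+1} - x_i \rangle \;\le\; 0 .
```

Rockafellar's theorem characterizes maximal cyclically monotone operators as subdifferentials of convex functions; the result above is the locally Lipschitz analogue, with cyclic submonotonicity (roughly, the same inequality up to an error that vanishes with the size of the cycle) replacing cyclic monotonicity.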


Proceedings of the American Mathematical Society | 2005

Parametric Borwein-Preiss variational principle and applications

Pando G. Georgiev

A parametric version of the Borwein-Preiss smooth variational principle is presented, which states that, under suitable assumptions on a given convex function depending on a parameter, the minimum point of a smooth convex perturbation of it depends continuously on the parameter. Some applications are given: existence of a Nash equilibrium and of a solution of a variational inequality for a system of partially convex functions, perturbed by arbitrarily small smooth convex perturbations, when one of the functions has a non-compact domain; a parametric version of the Kuhn-Tucker theorem, which contains a parametric smooth variational principle with constraints; and existence of a continuous selection of a subdifferential mapping depending on a parameter. The tool for proving this parametric smooth variational principle is a lemma about continuous ε-minimizers of quasi-convex functions depending on a parameter, which is of independent interest since it allows direct proofs of Ky Fan's minimax inequality, minimax equalities for quasi-convex functions, Sion's minimax theorem, etc.
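
For context, a common statement of the non-parametric Borwein-Preiss principle, paraphrased roughly: if f : X → ℝ ∪ {+∞} is lower semicontinuous and bounded below, p ≥ 1, ε > 0, λ > 0, and f(x₀) < inf_X f + ε, then there exist a point v with ‖v − x₀‖ ≤ λ and f(v) ≤ f(x₀), and a perturbation built from a convex series of p-th powers of norms,

```latex
\Delta(x) \;=\; \sum_{i=1}^{\infty} \mu_i \, \| x - x_i \|^{p},
\qquad \mu_i \ge 0, \quad \sum_{i=1}^{\infty} \mu_i = 1, \quad x_i \to v,
```

such that f + (ε/λ^p)Δ attains its minimum at v. The paper's contribution is to let f depend on a parameter and show that such minimizers can be chosen to depend continuously on it.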


EURASIP Journal on Advances in Signal Processing | 2007

Robust sparse component analysis based on a generalized Hough transform

Fabian J. Theis; Pando G. Georgiev; Andrzej Cichocki

An algorithm called Hough SCA is presented for recovering the mixing matrix A in X = AS, where X is a multivariate observed signal, possibly of lower dimension than the unknown sources S. The sources are assumed to be sparse in the sense that, at every time instant, S has fewer nonzero elements than the dimension of X. The presented algorithm performs a global search for hyperplane clusters within the mixture space by gathering possible hyperplane parameters within a Hough accumulator tensor. This renders the algorithm immune to the many local minima typically exhibited by the corresponding cost function. In contrast to previous approaches, Hough SCA is linear in the sample number and independent of the source dimension, as well as robust against noise and outliers. Experiments demonstrate the flexibility of the proposed algorithm.
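
A toy two-dimensional illustration of the hyperplane-clustering idea (hyperplanes through the origin are lines, parametrized by a single angle); the paper's Hough accumulator tensor generalizes this voting scheme to higher dimensions. The function name and parameters below are ours, not the paper's:

```python
import numpy as np

def hough_lines_through_origin(X, n_bins=360, top_k=2):
    """Find dominant line directions among the columns of X (2 x N).

    Toy 2-D Hough sketch: each data point votes for the angle of
    the line through the origin on which it lies.
    """
    angles = np.arctan2(X[1], X[0]) % np.pi            # direction in [0, pi)
    hist, edges = np.histogram(angles, bins=n_bins, range=(0.0, np.pi))
    peaks = np.argsort(hist)[-top_k:]                  # most-voted angle bins
    centers = (edges[peaks] + edges[peaks + 1]) / 2
    return np.stack([np.cos(centers), np.sin(centers)])  # 2 x top_k directions

# Example: sparse 2-source mixtures concentrate on lines spanned by A's columns.
rng = np.random.default_rng(2)
A = np.array([[1.0, 0.2], [0.3, 1.0]])
S = rng.laplace(size=(2, 5000))
S[rng.random(S.shape) < 0.5] = 0.0                     # make the sources sparse
X = A @ S
print(hough_lines_through_origin(X))                   # ~ normalized columns of A
```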


International Symposium on Circuits and Systems | 2004

Beyond ICA: robust sparse signal representations

A. Cichocki; Yuanqing Li; Pando G. Georgiev; S. Amari

In many applications it is necessary to decompose observed signals or data in such a way that the components have some special properties or structure, such as statistical independence, sparsity, smoothness, nonnegativity, prescribed statistical distributions and/or specific temporal structure. In this paper we discuss cost functions whose minimization solves such problems, and we present new properties that characterize optimal solutions for sparse representations. In particular, we discuss robust cost functions for finding sparse representations of noisy signals. Furthermore, we discuss sub-band decomposition preprocessing to relax independence conditions on the source signals.
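
As a concrete, hypothetical instance of a robust sparsity-promoting cost (an illustration of the idea, not a cost function from the paper), one can combine a Huber loss on the residual with an ℓ1 penalty and minimize by subgradient descent for a fixed dictionary A:

```python
import numpy as np

def huber_grad(r, delta=1.0):
    """Gradient of the Huber loss: linear tails make it robust to outliers."""
    return np.where(np.abs(r) <= delta, r, delta * np.sign(r))

def robust_sparse_code(A, x, lam=0.1, lr=0.01, steps=2000):
    """Minimize huber(A @ s - x) + lam * ||s||_1 by subgradient descent.

    A sketch of a robust sparse-representation cost, not the
    algorithm of the paper.
    """
    s = np.zeros(A.shape[1])
    for _ in range(steps):
        g = A.T @ huber_grad(A @ s - x) + lam * np.sign(s)
        s -= lr * g
    return s

# Example: the large recovered coefficients should sit near the true support.
rng = np.random.default_rng(3)
A = rng.standard_normal((20, 50))
s_true = np.zeros(50)
s_true[[3, 17]] = [2.0, -1.5]
x = A @ s_true
x[0] += 10.0                                   # a gross outlier in one sensor
print(np.nonzero(np.abs(robust_sparse_code(A, x)) > 0.5)[0])
```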

Collaboration


Dive into Pando G. Georgiev's collaborations.

Top Co-Authors

Andrzej Cichocki (Warsaw University of Technology)
Aris Daniilidis (Autonomous University of Barcelona)
Hovagim Bakardjian (RIKEN Brain Science Institute)