Publications


Featured research published by Juha Karhunen.


Journal of Mathematical Analysis and Applications | 1985

On stochastic approximation of the eigenvectors and eigenvalues of the expectation of a random matrix

Erkki Oja; Juha Karhunen

In applications of signal processing and pattern recognition, eigenvectors and eigenvalues of the statistical mean of a random matrix sequence are needed. Iterative methods are suggested and analyzed, in which no sample moments are used. Convergence is shown by stochastic approximation theory.
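The core idea, tracking an eigenvector from a stream of noisy matrix samples without accumulating sample moments, can be sketched as follows. This is a minimal NumPy illustration in the spirit of the paper, not the authors' exact algorithm: a normalized stochastic power-iteration step applied to one random matrix sample at a time.

```python
import numpy as np

def stochastic_eigvec(samples, lr=0.01, seed=0):
    """Stochastic approximation of the dominant eigenvector of the
    expectation of a random matrix sequence: one noisy sample per
    step, no sample moments accumulated."""
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(samples[0].shape[0])
    w /= np.linalg.norm(w)
    for A in samples:
        w = w + lr * (A @ w)        # stochastic power-iteration step
        w /= np.linalg.norm(w)      # normalization replaces moment estimates
    return w

rng = np.random.default_rng(1)
M = np.diag([5.0, 2.0, 1.0])        # unknown mean matrix
samples = [M + 0.1 * rng.standard_normal((3, 3)) for _ in range(2000)]
w = stochastic_eigvec(samples)
print(np.round(np.abs(w), 3))       # converges near the eigenvector (1, 0, 0)
```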


IEEE Transactions on Neural Networks | 1997

A class of neural networks for independent component analysis

Juha Karhunen; Erkki Oja; Liuyue Wang; Ricardo Vigário; Jyrki Joutsensalo

Independent component analysis (ICA) is a recently developed, useful extension of standard principal component analysis (PCA). The ICA model is utilized mainly in blind separation of unknown source signals from their linear mixtures. In this application only the source signals which correspond to the coefficients of the ICA expansion are of interest. In this paper, we propose neural structures related to multilayer feedforward networks for performing complete ICA. The basic ICA network consists of whitening, separation, and basis vector estimation layers. It can be used for both blind source separation and estimation of the basis vectors of ICA. We consider learning algorithms for each layer, and modify our previous nonlinear PCA type algorithms so that their separation capabilities are greatly improved. The proposed class of networks yields good results in test examples with both artificial and real-world data.
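The whitening layer of such a network can be made concrete. The following is a minimal NumPy sketch (an illustration, not the paper's network): zero-mean data are linearly transformed so that the output covariance is the identity, the standard preprocessing step before the separation layer.

```python
import numpy as np

def whiten(X):
    """Whitening layer: decorrelate zero-mean data so that the
    sample covariance becomes the identity.
    X has shape (n_signals, n_samples)."""
    Xc = X - X.mean(axis=1, keepdims=True)
    C = Xc @ Xc.T / Xc.shape[1]            # sample covariance
    d, E = np.linalg.eigh(C)               # eigendecomposition
    V = E @ np.diag(d ** -0.5) @ E.T       # symmetric whitening matrix
    return V @ Xc, V

rng = np.random.default_rng(0)
S = rng.laplace(size=(2, 5000))            # independent non-Gaussian sources
A = np.array([[1.0, 0.6], [0.4, 1.0]])     # unknown mixing matrix
Z, V = whiten(A @ S)
print(np.round(Z @ Z.T / Z.shape[1], 3))   # sample covariance of Z is the identity
```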


Neural Networks | 1994

Representation and separation of signals using nonlinear PCA type learning

Juha Karhunen; Jyrki Joutsensalo

A class of nonlinear PCA (principal component analysis) type learning algorithms is derived by minimizing a general statistical signal representation error. Another related algorithm is derived from a nonlinear feature extraction criterion. Several known algorithms emerge as special cases of these optimization approaches that provide useful information on the properties of the algorithms. By taking into account higher-order statistics, nonlinear algorithms are often able to separate component signals from their mixture. This is not possible with linear principal component subspace estimation algorithms. A suitably chosen nonlinearity makes the results more robust against various types of noise. Estimation of noisy sinusoids is used as a demonstration example.
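A rough way to see how minimizing a nonlinear representation error can favor separating solutions: with a tanh nonlinearity and sub-Gaussian sources, the error E||x - W g(W^T x)||^2 is smaller at a separating (axis-aligned) solution than at a mixing rotation. The sketch below is an illustrative numerical check under these assumptions, not the paper's derivation.

```python
import numpy as np

def repr_error(W, X):
    """Nonlinear PCA representation error E ||x - W g(W^T x)||^2 with g = tanh."""
    Y = np.tanh(W.T @ X)
    return np.mean(np.sum((X - W @ Y) ** 2, axis=0))

rng = np.random.default_rng(0)
# two independent sub-Gaussian (uniform, unit-variance) sources
S = rng.uniform(-np.sqrt(3.0), np.sqrt(3.0), size=(2, 100_000))

W_sep = np.eye(2)                            # separating (axis-aligned) solution
c, s = np.cos(np.pi / 4), np.sin(np.pi / 4)
W_mix = np.array([[c, -s], [s, c]])          # 45-degree mixing rotation

# the representation error is smaller at the separating solution
print(repr_error(W_sep, S), repr_error(W_mix, S))
```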


Neural Networks | 1995

Generalizations of principal component analysis, optimization problems, and neural networks

Juha Karhunen; Jyrki Joutsensalo

We derive and discuss various generalizations of neural PCA (principal component analysis) type learning algorithms containing nonlinearities using an optimization-based approach. Standard PCA arises as an optimal solution to several different information representation problems. We argue that this is essentially because the solution is based on second-order statistics only. If the respective optimization problems are generalized for nonquadratic criteria so that higher-order statistics are taken into account, their solutions will in general be different. The solutions define in a natural way several meaningful extensions of PCA and give a solid foundation for them. In this framework, we study more closely generalizations of the problems of variance maximization and mean-square error minimization. For these problems, we derive gradient-type neural learning algorithms both for symmetric and hierarchic PCA-type networks. As an important special case, Sanger's well-known generalized Hebbian algorithm (GHA) is shown to emerge from natural optimization problems.
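Sanger's generalized Hebbian algorithm, mentioned above as a special case, is simple to state. Below is a minimal sketch with illustrative hyperparameters, not taken from the paper:

```python
import numpy as np

def gha_step(W, x, lr):
    """One step of Sanger's generalized Hebbian algorithm (GHA):
    W += lr * (y x^T - LT[y y^T] W), where y = W x and LT keeps the
    lower-triangular part. Rows of W tend toward the leading
    eigenvectors of E[x x^T], in order."""
    y = W @ x
    return W + lr * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)

rng = np.random.default_rng(0)
C = np.diag([4.0, 2.0, 0.5])                 # covariance with a clear eigengap
X = rng.multivariate_normal(np.zeros(3), C, size=20000)
W = 0.1 * rng.standard_normal((2, 3))        # learn the top 2 components
for x in X:
    W = gha_step(W, x, lr=0.002)
print(np.round(W, 2))    # rows approximate the top eigenvectors (up to sign)
```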


Neural Computation | 2002

An unsupervised ensemble learning method for nonlinear dynamic state-space models

Harri Valpola; Juha Karhunen

A Bayesian ensemble learning method is introduced for unsupervised extraction of dynamic processes from noisy data. The data are assumed to be generated by an unknown nonlinear mapping from unknown factors. The dynamics of the factors are modeled using a nonlinear state-space model. The nonlinear mappings in the model are represented using multilayer perceptron networks. The proposed method is computationally demanding, but it allows the use of higher-dimensional nonlinear latent variable models than other existing approaches. Experiments with chaotic data show that the new method is able to blindly estimate the factors and the dynamic process that generated the data. It clearly outperforms currently available nonlinear prediction techniques in this very difficult test problem.
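The generative model the method assumes can be written down compactly: a nonlinear state transition plus a nonlinear observation mapping, each corrupted by noise. The sketch below only simulates data from such a model, with simple tanh mappings standing in for the multilayer perceptrons; the paper's actual contribution, the Bayesian ensemble-learning inference, is not shown.

```python
import numpy as np

def simulate_nss(T, state_dim=3, obs_dim=5, noise=0.1, seed=0):
    """Generate data from a toy nonlinear state-space model:
    s(t) = tanh(A s(t-1)) + process noise,
    x(t) = tanh(B s(t)) + observation noise."""
    rng = np.random.default_rng(seed)
    A = 0.9 * rng.standard_normal((state_dim, state_dim)) / np.sqrt(state_dim)
    B = rng.standard_normal((obs_dim, state_dim)) / np.sqrt(state_dim)
    s = np.zeros(state_dim)
    S, X = [], []
    for _ in range(T):
        s = np.tanh(A @ s) + noise * rng.standard_normal(state_dim)
        S.append(s)
        X.append(np.tanh(B @ s) + noise * rng.standard_normal(obs_dim))
    return np.array(S), np.array(X)

S, X = simulate_nss(1000)
print(S.shape, X.shape)   # (1000, 3) (1000, 5)
```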


International Journal of Neural Systems | 2004

Advances in Blind Source Separation (BSS) and Independent Component Analysis (ICA) for Nonlinear Mixtures

Christian Jutten; Juha Karhunen

In this paper, we review recent advances in blind source separation (BSS) and independent component analysis (ICA) for nonlinear mixing models. After a general introduction to BSS and ICA, we discuss in more detail uniqueness and separability issues, presenting some new results. A fundamental difficulty in the nonlinear BSS problem, and even more so in the nonlinear ICA problem, is that they provide non-unique solutions without extra constraints, which are often implemented by using a suitable regularization. In this paper, we explore two possible approaches. The first one is based on structural constraints. Especially, post-nonlinear mixtures are an important special case, where a nonlinearity is applied to linear mixtures. For such mixtures, the ambiguities are essentially the same as for the linear ICA or BSS problems. The second approach uses Bayesian inference methods for estimating the best statistical parameters, under almost unconstrained models in which priors can be easily added. In the latter part of the paper, various separation techniques proposed for post-nonlinear mixtures and general nonlinear mixtures are reviewed.
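The post-nonlinear structural constraint discussed above is easy to make concrete: each observed channel is an invertible scalar nonlinearity applied to one output of a linear mixture. A minimal generator sketch, with nonlinearities chosen purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
S = rng.uniform(-1, 1, size=(2, 1000))        # independent sources
A = np.array([[1.0, 0.5], [0.3, 1.0]])        # linear mixing stage

# post-nonlinear stage: one invertible scalar nonlinearity per channel
f = [np.tanh, lambda v: v + 0.2 * v ** 3]
linear_part = A @ S
X = np.vstack([f[i](linear_part[i]) for i in range(2)])
print(X.shape)   # (2, 1000)
```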


Neurocomputing | 1999

Neural networks for blind separation with unknown number of sources

Andrzej Cichocki; Juha Karhunen; Włodzimierz Kasprzak; Ricardo Vigário

Blind source separation problems have recently drawn a lot of attention in unsupervised neural learning. In the current approaches, the number of sources is typically assumed to be known in advance, but this does not usually hold in practical applications. In this paper, various neural network architectures and associated adaptive learning algorithms are discussed for handling the cases where the number of sources is unknown. These techniques include estimation of the number of sources, redundancy removal among the outputs of the networks, and extraction of the sources one at a time. Validity and performance of the described approaches are demonstrated by extensive computer simulations for natural image and magnetoencephalographic (MEG) data.
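One common way to estimate the number of sources, when there are more sensors than sources, is to count the dominant eigenvalues of the data covariance above the noise floor. The sketch below illustrates that idea with a simple threshold; it is an assumption-laden toy, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)
n_src, n_obs, T = 3, 6, 5000
S = rng.laplace(size=(n_src, T))                     # unknown sources
A = rng.standard_normal((n_obs, n_src))              # unknown mixing matrix
X = A @ S + 0.05 * rng.standard_normal((n_obs, T))   # noisy mixtures

# eigenvalues of the data covariance: n_src dominant ones, the rest
# sit near the sensor-noise floor
eig = np.sort(np.linalg.eigvalsh(X @ X.T / T))[::-1]
n_est = int(np.sum(eig > 10 * eig[-1]))              # threshold on noise floor
print(n_est)
```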


Neurocomputing | 1998

The Nonlinear PCA Criterion in Blind Source Separation: Relations with Other Approaches

Juha Karhunen; Petteri Pajunen; Erkki Oja

We present new results on the nonlinear principal component analysis (PCA) criterion in blind source separation (BSS). We derive the criterion in a form that allows easy comparisons with other BSS and independent component analysis (ICA) contrast functions like cumulants, Bussgang criteria, and information theoretic contrasts. This clarifies how the nonlinearity should be chosen optimally. We also discuss the connections of the nonlinear PCA learning rule with the Bell-Sejnowski algorithm and the adaptive EASI algorithm. Furthermore, we show that a nonlinear PCA criterion can be minimized using least-squares approaches, leading to computationally efficient and fast converging algorithms. The paper shows that nonlinear PCA is a versatile starting point for deriving different kinds of algorithms for blind signal processing problems.


International Journal of Neural Systems | 1999

An Experimental Comparison of Neural Algorithms for Independent Component Analysis and Blind Separation

Xavier Giannakopoulos; Juha Karhunen; Erkki Oja

In this paper, we compare the performance of five prominent neural or adaptive algorithms designed for Independent Component Analysis (ICA) and blind source separation (BSS). In the first part of the study, we use artificial data for comparing the accuracy, convergence speed, computational load, and other relevant properties of the algorithms. In the second part, the algorithms are applied to three different real-world data sets. The task is either blind source separation or finding interesting directions in the data for visualisation purposes. We develop criteria for selecting the most meaningful basis vectors of ICA and measuring the quality of the results. The comparison reveals characteristic differences between the studied ICA algorithms. The most important conclusions of our comparison are the robustness of the ICA algorithms with respect to modest modeling imperfections, and the superiority of fixed-point algorithms in terms of computational load.


International Conference on Acoustics, Speech, and Signal Processing (ICASSP) | 1997

Applications of neural blind separation to signal and image processing

Juha Karhunen; Aapo Hyvärinen; Ricardo Vigário; Jarmo Hurri; Erkki Oja

In blind source separation one tries to separate statistically independent unknown source signals from their linear mixtures without knowing the mixing coefficients. Such techniques are currently studied actively both in statistical signal processing and unsupervised neural learning. We apply neural blind separation techniques developed in our laboratory to the extraction of features from natural images and to the separation of medical EEG signals. The new analysis method yields features that describe the underlying data better than, for example, classical principal component analysis. We also discuss difficulties related to real-world applications of blind signal processing.

Collaboration


Top co-authors of Juha Karhunen:

Erkki Oja (Helsinki University of Technology)
Jyrki Joutsensalo (Information Technology University)
Antti Honkela (Helsinki Institute for Information Technology)
Harri Valpola (Helsinki University of Technology)
Petteri Pajunen (Helsinki University of Technology)