Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Kaare Brandt Petersen is active.

Publication


Featured research published by Kaare Brandt Petersen.


IEEE Signal Processing Magazine | 2013

Kernel Multivariate Analysis Framework for Supervised Subspace Learning: A Tutorial on Linear and Kernel Multivariate Methods

Jerónimo Arenas-García; Kaare Brandt Petersen; Gustavo Camps-Valls; Lars Kai Hansen

Feature extraction and dimensionality reduction are important tasks in many fields of science dealing with signal processing and analysis. The relevance of these techniques is increasing as current sensory devices are developed with ever higher resolution, and problems involving multimodal data sources become more common. A plethora of feature extraction methods are available in the literature collectively grouped under the field of multivariate analysis (MVA). This article provides a uniform treatment of several methods: principal component analysis (PCA), partial least squares (PLS), canonical correlation analysis (CCA), and orthonormalized PLS (OPLS), as well as their nonlinear extensions derived by means of the theory of reproducing kernel Hilbert spaces (RKHSs). We also review their connections to other methods for classification and statistical dependence estimation and introduce some recent developments to deal with the extreme cases of large-scale and low-sized problems. To illustrate the wide applicability of these methods in both classification and regression problems, we analyze their performance in a benchmark of publicly available data sets and pay special attention to specific real applications involving audio processing for music genre prediction and hyperspectral satellite image processing for Earth and climate monitoring.
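As a rough illustration of the kernel extension the tutorial covers, a minimal kernel PCA can be sketched in NumPy. The RBF kernel helper, the double-centering step, and the two-ring toy data below are a generic sketch of the technique, not code from the paper:

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Pairwise squared distances turned into a Gaussian (RBF) kernel matrix
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-gamma * d2)

def kernel_pca(X, n_components=2, gamma=1.0):
    """Minimal kernel PCA: center the kernel matrix in feature space,
    then take its leading eigenvectors as nonlinear components."""
    K = rbf_kernel(X, gamma)
    n = K.shape[0]
    one = np.ones((n, n)) / n
    Kc = K - one @ K - K @ one + one @ K @ one   # double centering
    vals, vecs = np.linalg.eigh(Kc)              # ascending eigenvalues
    idx = np.argsort(vals)[::-1][:n_components]  # keep the largest ones
    vals, vecs = vals[idx], vecs[:, idx]
    # Projections of the training points onto the kernel principal axes
    return vecs * np.sqrt(np.maximum(vals, 0))

rng = np.random.default_rng(0)
# Two concentric rings: not linearly separable in the input space
t = rng.uniform(0, 2 * np.pi, 100)
r = np.repeat([1.0, 3.0], 50)
X = np.c_[r * np.cos(t), r * np.sin(t)] + 0.05 * rng.standard_normal((100, 2))
Z = kernel_pca(X, n_components=2, gamma=0.5)
```

The same kernel-matrix machinery underlies the kernel variants of PLS, CCA, and OPLS discussed in the article; only the objective solved on the centered kernel changes.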


Neural Computation | 2005

On the Slow Convergence of EM and VBEM in Low-Noise Linear Models

Kaare Brandt Petersen; Ole Winther; Lars Kai Hansen

We analyze convergence of the expectation maximization (EM) and variational Bayes EM (VBEM) schemes for parameter estimation in noisy linear models. The analysis shows that both schemes are inefficient in the low-noise limit. The linear model with additive noise includes as special cases independent component analysis, probabilistic principal component analysis, factor analysis, and Kalman filtering. Hence, the results are relevant for many practical applications.
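The low-noise slowdown analyzed in the paper is easy to reproduce in a toy scalar version of the linear model. The example below (x = a·s + noise with known noise variance, EM estimating the mixing coefficient a) is an illustrative sketch, not the paper's derivation:

```python
import numpy as np

def em_iterations(sigma2, a_true=2.0, n=2000, tol=1e-6, max_iter=10000, seed=0):
    """EM for the scalar linear model x = a*s + noise, with s ~ N(0, 1)
    and noise ~ N(0, sigma2), sigma2 known. Returns (a_hat, #iterations)."""
    rng = np.random.default_rng(seed)
    s = rng.standard_normal(n)
    x = a_true * s + np.sqrt(sigma2) * rng.standard_normal(n)
    a = 0.5                                     # deliberately poor start
    for it in range(1, max_iter + 1):
        # E-step: posterior moments of the latent source s given x
        post_var = sigma2 / (a**2 + sigma2)
        post_mean = a * x / (a**2 + sigma2)
        # M-step: closed-form update of the mixing coefficient
        a_new = (x @ post_mean) / (post_mean @ post_mean + n * post_var)
        if abs(a_new - a) < tol:
            return a_new, it
        a = a_new
    return a, max_iter

a_fast, fast = em_iterations(sigma2=1.0)    # moderate noise: quick
a_slow, slow = em_iterations(sigma2=1e-3)   # low-noise limit: very slow
```

The E-step step size shrinks with the posterior variance sigma2 / (a² + sigma2), so as sigma2 approaches zero the EM updates become vanishingly small, which is the inefficiency the paper quantifies.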


Digital Signal Processing | 2007

Bayesian independent component analysis: Variational methods and non-negative decompositions

Ole Winther; Kaare Brandt Petersen

In this paper, we present an empirical Bayesian framework for independent component analysis. The framework provides estimates of the sources, the mixing matrix, and the noise parameters, and is flexible with respect to the choice of source prior and the number of sources and sensors. At the core of the method are two mean field techniques, variational Bayes and the expectation consistent framework, and the cost functions of these methods are optimized using the adaptive overrelaxed expectation maximization (EM) algorithm and the easy gradient recipe. The entire framework, implemented in a Matlab toolbox, is demonstrated for non-negative decompositions and compared with non-negative matrix factorization.
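For context, the classical multiplicative-update NMF that such non-negative decompositions are compared against can be sketched in a few lines of NumPy. The Lee-Seung updates below are the standard baseline algorithm, not the paper's variational Bayes method:

```python
import numpy as np

def nmf(V, rank, n_iter=500, seed=0):
    """Lee-Seung multiplicative updates for V ~ W @ H with W, H >= 0,
    minimizing the squared Euclidean reconstruction error."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.uniform(0.1, 1.0, (m, rank))
    H = rng.uniform(0.1, 1.0, (rank, n))
    eps = 1e-12                              # avoid division by zero
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + eps)  # update H, stays non-negative
        W *= (V @ H.T) / (W @ H @ H.T + eps)  # update W, stays non-negative
    return W, H

# Recover a rank-2 non-negative factorization of a synthetic mixture
rng = np.random.default_rng(1)
W0 = rng.uniform(0, 1, (30, 2))
H0 = rng.uniform(0, 1, (2, 40))
V = W0 @ H0
W, H = nmf(V, rank=2)
err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
```

Unlike this point-estimate baseline, the Bayesian framework in the paper also delivers noise estimates and source posteriors.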


Neural Computation | 2007

State-Space Models: From the EM Algorithm to a Gradient Approach

Rasmus Kongsgaard Olsson; Kaare Brandt Petersen; Tue Lehn-Schiøler

Slow convergence is observed in the EM algorithm for linear state-space models. We propose to circumvent the problem by applying any off-the-shelf quasi-Newton-type optimizer, which operates on the gradient of the log-likelihood function. Such an algorithm is a practical alternative because the exact gradient of the log-likelihood function can be computed by recycling components of the expectation-maximization (EM) algorithm. We demonstrate the efficiency of the proposed method in three relevant instances of the linear state-space model. At high signal-to-noise ratios, where EM is particularly prone to converge slowly, we show that gradient-based learning results in a sizable reduction of computation time.
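The gradient-based alternative can be sketched on a scalar state-space model: a Kalman filter evaluates the exact log-likelihood via the prediction-error decomposition, and an off-the-shelf quasi-Newton optimizer maximizes it. Here SciPy's BFGS with finite-difference gradients stands in for the paper's exact recycled-EM gradient; the model and parameter values are illustrative:

```python
import numpy as np
from scipy.optimize import minimize

def neg_loglik(a, x, q=0.1, r=0.1):
    """Negative log-likelihood of the scalar state-space model
    z_t = a*z_{t-1} + N(0, q),  x_t = z_t + N(0, r),
    computed by a Kalman filter (prediction-error decomposition)."""
    a = np.atleast_1d(a)[0]
    mu, P = 0.0, 1.0                        # broad prior on the initial state
    ll = 0.0
    for xt in x:
        mu, P = a * mu, a**2 * P + q        # predict
        S = P + r                           # innovation variance
        ll += -0.5 * (np.log(2 * np.pi * S) + (xt - mu)**2 / S)
        K = P / S                           # Kalman gain
        mu, P = mu + K * (xt - mu), (1 - K) * P   # update
    return -ll

# Simulate data from a known AR(1) state, then refit a by quasi-Newton
rng = np.random.default_rng(0)
a_true, q, r, T = 0.8, 0.1, 0.1, 500
z = np.zeros(T)
for t in range(1, T):
    z[t] = a_true * z[t - 1] + np.sqrt(q) * rng.standard_normal()
x = z + np.sqrt(r) * rng.standard_normal(T)

res = minimize(neg_loglik, x0=[0.2], args=(x,), method="BFGS")
a_hat = res.x[0]
```

A handful of BFGS iterations typically suffices here, whereas EM takes small steps precisely in the high signal-to-noise regime the abstract highlights.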


Neurocomputing | 2007

Flexible and efficient implementations of Bayesian independent component analysis

Ole Winther; Kaare Brandt Petersen

In this paper we present an empirical Bayes method for flexible and efficient independent component analysis (ICA). The method is flexible with respect to choice of source prior, dimensionality and constraints of the mixing matrix (unconstrained or non-negativity), and structure of the noise covariance matrix. Parameter optimization is handled by variants of the expectation maximization (EM) algorithm: overrelaxed adaptive EM and the easy gradient recipe. These retain the simplicity of EM while converging faster. The required expectations over the source posterior, the sufficient statistics, are estimated with mean field methods: variational and the expectation consistent (EC) framework. We describe the derivation of the EC framework for ICA in detail and give empirical results demonstrating the improved performance. The paper is accompanied by the publicly available Matlab toolbox icaMF.


International Symposium on Music Information Retrieval (ISMIR) | 2006

Mel Frequency Cepstral Coefficients: An Evaluation of Robustness of MP3 Encoded Music

Sigurdur Sigurdsson; Kaare Brandt Petersen; Tue Lehn-Schiøler


Neural Information Processing Systems (NIPS) | 2006

Sparse Kernel Orthonormalized PLS for feature extraction in large data sets

Jerónimo Arenas-García; Kaare Brandt Petersen; Lars Kai Hansen


International Workshop on Machine Learning for Signal Processing (MLSP) | 2007

Unveiling Music Structure via PLSA Similarity Fusion

Jerónimo Arenas-García; Anders Meng; Kaare Brandt Petersen; Tue Lehn-Schiøler; Lars Kai Hansen; Jan Larsen


Kernel Methods for Remote Sensing Data Analysis | 2009

Kernel Multivariate Analysis in Remote Sensing Feature Extraction

Jerónimo Arenas-García; Kaare Brandt Petersen


International Conference on Acoustics, Speech, and Signal Processing (ICASSP) | 2005

The EM algorithm in independent component analysis

Kaare Brandt Petersen; Ole Winther

Collaboration


Dive into Kaare Brandt Petersen's collaborations.

Top Co-Authors

Lars Kai Hansen, Technical University of Denmark
Tue Lehn-Schiøler, Technical University of Denmark
Ole Winther, Technical University of Denmark
Jan Larsen, Technical University of Denmark
Anders Meng, Technical University of Denmark
Rasmus Kongsgaard Olsson, Technical University of Denmark
Sigurdur Sigurdsson, Technical University of Denmark
Sune Lehmann, Technical University of Denmark