Publication


Featured research published by Robert Jenssen.


IEEE Transactions on Pattern Analysis and Machine Intelligence | 2010

Kernel Entropy Component Analysis

Robert Jenssen

We introduce kernel entropy component analysis (kernel ECA) as a new method for data transformation and dimensionality reduction. Kernel ECA reveals structure relating to the Rényi entropy of the input space data set, estimated via a kernel matrix using Parzen windowing. This is achieved by projections onto a subset of entropy-preserving kernel principal component analysis (kernel PCA) axes. In general, this subset need not correspond to the top eigenvalues of the kernel matrix, in contrast to dimensionality reduction using kernel PCA. We show that kernel ECA may produce strikingly different transformed data sets compared to kernel PCA, with a distinct angle-based structure. A new spectral clustering algorithm utilizing this structure is developed with positive results. Furthermore, kernel ECA is shown to be a useful alternative for pattern denoising.
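The selection rule described in the abstract — ranking kernel PCA axes by their contribution to the Rényi entropy estimate rather than by eigenvalue size alone — can be sketched in a few lines of NumPy. This is a minimal illustration, not the author's implementation; the Gaussian width `sigma` is a placeholder choice:

```python
import numpy as np

def kernel_eca(X, n_components, sigma=1.0):
    """Minimal kernel ECA sketch: choose kernel PCA axes by their
    entropy contribution lambda_i * (1^T e_i)^2, not by eigenvalue."""
    sq = np.sum(X ** 2, axis=1)
    # Gaussian (Parzen) kernel matrix, left uncentered as in kernel ECA
    K = np.exp(-(sq[:, None] + sq[None, :] - 2 * X @ X.T) / (2 * sigma ** 2))
    lam, E = np.linalg.eigh(K)                  # eigenvalues in ascending order
    contrib = lam * (E.sum(axis=0) ** 2)        # entropy contribution per axis
    idx = np.argsort(contrib)[::-1][:n_components]
    # Project as in kernel PCA, but onto the entropy-selected axes
    return E[:, idx] * np.sqrt(np.clip(lam[idx], 0.0, None))
```

Summing the per-axis contributions recovers the overall unnormalized entropy estimate 1ᵀK1, which is what makes the selected subset "entropy preserving".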


Pattern Recognition | 2003

Independent component analysis for texture segmentation

Robert Jenssen; Torbjørn Eltoft

Independent component analysis (ICA) of textured images is presented as a computational technique for creating a new data-dependent filter bank for use in texture segmentation. We show that the ICA filters are able to capture the inherent properties of textured images. The new filters are similar to Gabor filters, but seem to be richer in the sense that their frequency responses may be more complex. These properties enable us to use the ICA filter bank to create energy features for effective texture segmentation. Our experiments using multi-textured images show that the ICA filter bank yields segmentation results similar to or better than those of the Gabor filter bank.
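As a rough illustration of the pipeline (not the authors' code — the patch size, sample counts, nonlinearity, and synthetic test image are all placeholder choices), a data-dependent filter bank can be learned by whitening random image patches and running a symmetric fixed-point ICA iteration:

```python
import numpy as np

def ica_filter_bank(image, patch=8, n_filters=8, n_patches=1000,
                    iters=100, seed=0):
    """Learn filters from random patches of a single-channel image via
    symmetric fixed-point ICA (tanh nonlinearity) on whitened patches."""
    rng = np.random.RandomState(seed)
    h, w = image.shape
    r = rng.randint(0, h - patch, n_patches)
    c = rng.randint(0, w - patch, n_patches)
    P = np.stack([image[i:i + patch, j:j + patch].ravel()
                  for i, j in zip(r, c)])
    P = P - P.mean(axis=0)
    # PCA whitening, keeping n_filters directions
    U, s, Vt = np.linalg.svd(P, full_matrices=False)
    Z = (P @ Vt[:n_filters].T) / (s[:n_filters] / np.sqrt(len(P)))
    W = rng.randn(n_filters, n_filters)
    for _ in range(iters):
        G = np.tanh(Z @ W.T)
        # fixed-point update: w <- E[z g(w.z)] - E[g'(w.z)] w
        W = (G.T @ Z) / len(Z) - (1 - G ** 2).mean(axis=0)[:, None] * W
        U2, _, Vt2 = np.linalg.svd(W)          # symmetric decorrelation
        W = U2 @ Vt2
    # Back-project the unmixing rows to pixel space: the learned filters
    filters = (W * (np.sqrt(len(P)) / s[:n_filters])) @ Vt[:n_filters]
    return filters.reshape(n_filters, patch, patch)
```

Filter responses would then be rectified and smoothed into energy features per pixel, as the abstract describes.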


Journal of The Franklin Institute-engineering and Applied Mathematics | 2006

The Cauchy–Schwarz divergence and Parzen windowing: Connections to graph theory and Mercer kernels

Robert Jenssen; Jose C. Principe; Deniz Erdogmus; Torbjørn Eltoft

This paper contributes a tutorial-level discussion of some interesting properties of the recent Cauchy–Schwarz (CS) divergence measure between probability density functions. This measure brings together elements from several different machine learning fields, namely information theory, graph theory, and Mercer kernel and spectral theory. These connections are revealed when estimating the CS divergence non-parametrically using the Parzen window technique for density estimation. An important consequence of these connections is that they enhance our understanding of the different machine learning schemes relative to each other.
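Concretely, the CS divergence is D_CS(p, q) = −log(∫pq / √(∫p² ∫q²)), and its Parzen-window plug-in estimate reduces to sums of kernel evaluations (convolving two Gaussian windows of width σ gives a Gaussian of variance 2σ²). The sketch below is a toy illustration, not tied to the paper's experiments:

```python
import numpy as np

def cs_divergence(X, Y, sigma=1.0):
    """Parzen-window estimate of the Cauchy-Schwarz divergence between
    the densities underlying sample sets X and Y (Gaussian windows)."""
    s2 = 2.0 * sigma ** 2   # convolving two sigma-Gaussians doubles the variance
    def cross(A, B):
        # mean kernel evaluation; the Gaussian normalizer cancels in the ratio
        d2 = (np.sum(A ** 2, 1)[:, None] + np.sum(B ** 2, 1)[None, :]
              - 2 * A @ B.T)
        return np.mean(np.exp(-d2 / (2.0 * s2)))
    return -np.log(cross(X, Y) / np.sqrt(cross(X, X) * cross(Y, Y)))
```

Because the estimate is itself a Cauchy–Schwarz ratio of the Parzen densities, it is nonnegative and vanishes when the two sample sets coincide.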


Pattern Recognition | 2008

Mean shift spectral clustering

Umut Ozertem; Deniz Erdogmus; Robert Jenssen

In recent years there has been growing interest in clustering methods that stem from the spectral decomposition of the data affinity matrix and that perform well in a wide variety of situations. However, a complete theoretical understanding of these methods in terms of the underlying data distributions is still lacking. In this paper, we propose a spectral-clustering-based mode-merging method for mean shift as a theoretically well-founded approach that enables a probabilistic interpretation of affinity-based clustering through kernel density estimation. This connection also allows principled kernel optimization and enables the use of anisotropic, variable-size kernels to match local data structures. We demonstrate the proposed algorithm's performance on image segmentation applications and compare its clustering results with the well-known mean shift and normalized cut algorithms.
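For readers unfamiliar with the baseline: plain Gaussian mean shift moves every point to the kernel-weighted mean of the data, i.e. uphill on the Parzen density estimate, until it settles at a mode; points sharing a mode form a cluster. A minimal sketch with a fixed bandwidth (the paper instead merges modes spectrally and allows anisotropic, variable-size kernels):

```python
import numpy as np

def mean_shift_modes(X, sigma=0.5, iters=100):
    """Plain Gaussian mean shift: repeatedly replace each point by the
    kernel-weighted mean of the data, climbing the KDE toward a mode."""
    Z = X.astype(float).copy()
    for _ in range(iters):
        d2 = np.sum((Z[:, None, :] - X[None, :, :]) ** 2, axis=2)
        W = np.exp(-d2 / (2.0 * sigma ** 2))
        Z = (W @ X) / W.sum(axis=1, keepdims=True)
    return Z
```

Grouping the converged points by mode gives the clustering; the abstract's contribution is to merge such modes with a spectral criterion rather than simple proximity.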


Pattern Recognition | 2007

Information cut for clustering using a gradient descent approach

Robert Jenssen; Deniz Erdogmus; Kenneth E. Hild; Jose C. Principe; Torbjørn Eltoft

We introduce a new graph cut for clustering which we call the Information Cut. It is derived using Parzen windowing to estimate an information theoretic distance measure between probability density functions. We propose to optimize the Information Cut using a gradient descent-based approach. Our algorithm has several advantages compared to many other graph-based methods in terms of determining an appropriate affinity measure, computational complexity, memory requirements, and coping with different data scales. We show that our method may produce clustering and image segmentation results comparable to or better than those of state-of-the-art graph-based methods.


Signal Processing Systems | 2006

Some Equivalences between Kernel Methods and Information Theoretic Methods

Robert Jenssen; Torbjørn Eltoft; Deniz Erdogmus; Jose C. Principe

In this paper, we discuss some equivalences between two recently introduced statistical learning schemes, namely Mercer kernel methods and information theoretic methods. We show that Parzen window-based estimators for some information theoretic cost functions are also cost functions in a corresponding Mercer kernel space. The Mercer kernel is directly related to the Parzen window. Furthermore, we analyze a classification rule based on an information theoretic criterion, and show that this corresponds to a linear classifier in the kernel space. By introducing a weighted Parzen window density estimator, we also formulate the support vector machine in this information theoretic perspective.
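The core identity is easy to verify numerically: the Parzen-style "information potential" (1/N²)Σᵢⱼ k(xᵢ, xⱼ) equals the squared norm of the mean of the feature-space images of the data. The snippet below illustrates the identity using a quadratic kernel, chosen only because its feature map can be written out explicitly (the paper works with Parzen windows such as the Gaussian):

```python
import numpy as np

rng = np.random.RandomState(0)
X = rng.randn(50, 2)

# Explicit feature map of the quadratic kernel k(x, y) = (x . y)^2 in 2-D:
# phi(x) = [x1^2, x2^2, sqrt(2) x1 x2]
Phi = np.stack([X[:, 0] ** 2, X[:, 1] ** 2,
                np.sqrt(2.0) * X[:, 0] * X[:, 1]], axis=1)

K = (X @ X.T) ** 2                    # Mercer kernel matrix
info_potential = K.mean()             # (1/N^2) sum_ij k(x_i, x_j)
mean_map_norm2 = np.sum(Phi.mean(axis=0) ** 2)   # ||(1/N) sum_i phi(x_i)||^2
```

The two quantities agree to machine precision, which is the sense in which an information theoretic cost function is simultaneously a cost function in the Mercer kernel space.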


Energy Minimization Methods in Computer Vision and Pattern Recognition | 2005

Optimizing the Cauchy–Schwarz PDF distance for information theoretic, non-parametric clustering

Robert Jenssen; Deniz Erdogmus; Kenneth E. Hild; Jose C. Principe; Torbjørn Eltoft

This paper addresses the problem of efficient information theoretic, non-parametric data clustering. We develop a procedure for adapting the cluster memberships of the data patterns in order to maximize the recent Cauchy–Schwarz (CS) probability density function (pdf) distance measure. Each pdf corresponds to a cluster. The CS distance is estimated analytically and non-parametrically by means of the Parzen window technique for density estimation. The resulting form of the cost function makes it possible to develop an efficient adaptation procedure based on constrained gradient descent, using stochastic approximation of the gradients. The computational complexity of the algorithm is O(MN), M ≪ N, where N is the total number of data patterns and M is the number of data patterns used in the stochastic approximation. We show that the new algorithm is capable of performing well on several odd-shaped and irregular data sets.


IEEE Geoscience and Remote Sensing Letters | 2012

Kernel Entropy Component Analysis for Remote Sensing Image Clustering

Luis Gómez-Chova; Robert Jenssen; Gustavo Camps-Valls

This letter proposes kernel entropy component analysis for clustering remote sensing data. The method generates nonlinear features that reveal structure related to the Rényi entropy of the input space data set. Unlike other kernel feature-extraction methods, the top eigenvalues and eigenvectors of the kernel matrix are not necessarily chosen. The data are mapped with a distinct angular structure, which is exploited to derive a new angle-based spectral clustering algorithm operating on the mapped data. An out-of-sample extension of the method is also presented to deal with test data. We focus on cloud screening from Medium Resolution Imaging Spectrometer images. Several images are considered to account for the high variability of the problem. The good results obtained show the suitability of the proposed method.


Pattern Recognition | 2006

Spectral feature projections that maximize Shannon mutual information with class labels

Umut Ozertem; Deniz Erdogmus; Robert Jenssen

Determining optimal subspace projections that can maintain task-relevant information in the data is an important problem in machine learning and pattern recognition. In this paper, we propose a nonparametric nonlinear subspace projection technique that maximally maintains class separability under the Shannon mutual information (MI) criterion. Employing kernel density estimates for nonparametric estimation of MI makes possible an interesting marriage of kernel density estimation-based information theoretic methods and kernel machines, which have the ability to determine nonparametric nonlinear solutions for difficult problems in machine learning. Significant computational savings are achieved by translating the definition of the desired projection into the kernel-induced feature space, which leads to an analytical solution.


Neurocomputing | 2008

A new information theoretic analysis of sum-of-squared-error kernel clustering

Robert Jenssen; Torbjørn Eltoft

The contribution of this paper is to provide a new input space analysis of the properties of sum-of-squared-error K-means clustering performed in a Mercer kernel feature space. Such an analysis has been missing until now, even though kernel K-means has been popular in the clustering literature. Our derivation extends the theory of traditional K-means from properties of mean vectors to information theoretic properties of Parzen window estimated probability density functions (pdfs). In particular, Euclidean distance-based kernel K-means is shown to maximize an integrated squared error divergence measure between cluster pdfs and the overall pdf of the data, while a cosine similarity-based approach maximizes a Cauchy–Schwarz divergence measure. Furthermore, the iterative rules which assign data points to clusters in order to maximize these criteria are shown to depend on the cluster pdfs evaluated at the data points, in addition to the Rényi entropies of the clusters. Bayes' rule is shown to be a special case.
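The quantity kernel K-means iterates on is the feature-space distance ||φ(x) − m_c||², which expands entirely in kernel evaluations: K(x,x) − (2/|c|) Σ_{j∈c} K(x,xⱼ) + (1/|c|²) Σ_{j,l∈c} K(xⱼ,xₗ). A minimal sketch built on that expansion (illustrative only; the initial labels and the kernel are placeholder choices, not the paper's setup):

```python
import numpy as np

def kernel_kmeans(K, init_labels, n_clusters, iters=50):
    """Kernel K-means on a precomputed kernel matrix K, using the
    expansion of the feature-space distance in kernel evaluations."""
    labels = np.asarray(init_labels).copy()
    diag = np.diag(K)
    n = K.shape[0]
    for _ in range(iters):
        D = np.full((n, n_clusters), np.inf)
        for c in range(n_clusters):
            idx = labels == c
            if idx.any():
                # ||phi(x) - m_c||^2 = K_xx - 2 mean_j K_xj + mean_jl K_jl
                D[:, c] = (diag
                           - 2.0 * K[:, idx].mean(axis=1)
                           + K[np.ix_(idx, idx)].mean())
        new = D.argmin(axis=1)
        if np.array_equal(new, labels):
            break
        labels = new
    return labels
```

Note that the assignment rule touches the data only through K, which is what makes the paper's input-space reinterpretation in terms of Parzen pdfs possible.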

Collaboration


Dive into Robert Jenssen's collaborations.

Top Co-Authors

Antonello Rizzi

Sapienza University of Rome


Arthur Revhaug

University Hospital of North Norway
