Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Harald Stögbauer is active.

Publication


Featured research published by Harald Stögbauer.


Physical Review E | 2004

Estimating Mutual Information

Alexander Kraskov; Harald Stögbauer; Peter Grassberger

We present two classes of improved estimators for mutual information M(X,Y), from samples of random points distributed according to some joint probability density mu(x,y). In contrast to conventional estimators based on binnings, they are based on entropy estimates from k-nearest-neighbor distances. This means that they are data efficient (with k=1 we resolve structures down to the smallest possible scales), adaptive (the resolution is higher where data are more numerous), and have minimal bias. Indeed, the bias of the underlying entropy estimates is mainly due to nonuniformity of the density at the smallest resolved scale, giving typically systematic errors which scale as functions of k/N for N points. Numerically, we find that both families become exact for independent distributions, i.e., the estimator M(X,Y) vanishes (up to statistical fluctuations) if mu(x,y)=mu(x)mu(y). This holds for all tested marginal distributions and for all dimensions of x and y. In addition, we give estimators for redundancies between more than two random variables. We compare our algorithms in detail with existing algorithms. Finally, we demonstrate the usefulness of our estimators for assessing the actual independence of components obtained from independent component analysis (ICA), for improving ICA, and for estimating the reliability of blind source separation.
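
As a concrete illustration of the estimator family sketched in the abstract, the following is a minimal Python version of the k-nearest-neighbor approach, assuming the max-norm and a brute-force neighbor search; the function name, the choice k=3, and the quick test are illustrative, not the authors' reference implementation (MILCA).

```python
# A minimal sketch of a k-nearest-neighbor mutual information estimator in the
# spirit of the paper's first estimator: distances are measured in the max-norm,
# neighbors are found by brute force, and digamma terms combine the neighbor
# counts. Names and parameters are illustrative only.
import numpy as np
from scipy.special import digamma

def knn_mutual_information(x, y, k=3):
    """Estimate M(X,Y) from N paired samples using k-nearest-neighbor distances."""
    x = np.asarray(x, dtype=float).reshape(len(x), -1)
    y = np.asarray(y, dtype=float).reshape(len(y), -1)
    n = len(x)
    # Pairwise max-norm distances within each marginal space.
    dx = np.max(np.abs(x[:, None, :] - x[None, :, :]), axis=-1)
    dy = np.max(np.abs(y[:, None, :] - y[None, :, :]), axis=-1)
    dz = np.maximum(dx, dy)                # joint-space distance
    np.fill_diagonal(dz, np.inf)           # a point is not its own neighbor
    eps = np.sort(dz, axis=1)[:, k - 1]    # distance to the k-th neighbor
    # Count marginal neighbors strictly closer than eps around each point.
    np.fill_diagonal(dx, np.inf)
    np.fill_diagonal(dy, np.inf)
    nx = np.sum(dx < eps[:, None], axis=1)
    ny = np.sum(dy < eps[:, None], axis=1)
    # psi(k) + psi(N) - <psi(nx + 1) + psi(ny + 1)>; for independent data this
    # fluctuates around zero and can come out slightly negative.
    return digamma(k) + digamma(n) - np.mean(digamma(nx + 1) + digamma(ny + 1))

# Quick check: dependent versus independent Gaussian samples.
rng = np.random.default_rng(0)
a = rng.normal(size=2000)
b = 0.8 * a + 0.6 * rng.normal(size=2000)
print(knn_mutual_information(a, b))                      # clearly positive
print(knn_mutual_information(a, rng.normal(size=2000)))  # close to zero
```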


Analytical Chemistry | 2006

Monte Carlo Algorithm for Least Dependent Non-Negative Mixture Decomposition

Sergey A. Astakhov; Harald Stögbauer; Alexander Kraskov; Peter Grassberger

We propose a simulated annealing algorithm (stochastic non-negative independent component analysis, SNICA) for blind decomposition of linear mixtures of non-negative sources with non-negative coefficients. The demixing is based on a Metropolis-type Monte Carlo search for least dependent components, with the mutual information between recovered components as a cost function and their non-negativity as a hard constraint. Elementary moves are shears in two-dimensional subspaces and rotations in three-dimensional subspaces. The algorithm is geared toward decomposing signals whose probability densities peak at zero, the case typical in analytical spectroscopy and multivariate curve resolution. The decomposition performance on large samples of synthetic mixtures and experimental data is much better than that of traditional blind source separation methods based on independent component analysis (MILCA, FastICA, RADICAL) and chemometrics techniques (SIMPLISMA, ALS, BTEM).
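
The abstract describes a Metropolis search with an MI cost and a non-negativity constraint; the toy sketch below restricts this to two components to show the structure of the loop. The histogram plug-in MI cost, the use of shear moves only (no 3D rotations), the linear cooling schedule, and all names and parameters are simplifying assumptions of this sketch, not SNICA itself.

```python
# A toy sketch of the Metropolis-type search described above, restricted to two
# components. Assumptions not taken from the paper: a crude histogram plug-in MI
# as the dependence cost (SNICA uses a k-nearest-neighbor MI estimator), only
# 2D shear moves, and a simple linear cooling schedule.
import numpy as np

def histogram_mi(a, b, bins=32):
    """Plug-in mutual information from a 2D histogram (stand-in cost function)."""
    pxy, _, _ = np.histogram2d(a, b, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

def snica_like_demix(x, n_steps=5000, t0=0.1, rng=None):
    """Search for least dependent non-negative components of a 2 x N mixture x."""
    rng = np.random.default_rng(0) if rng is None else rng
    s = x.copy()                                   # current source estimate
    cost = histogram_mi(s[0], s[1])
    for step in range(n_steps):
        t = max(t0 * (1.0 - step / n_steps), 1e-12)   # linear cooling
        i, j = rng.permutation(2)                  # pick the shear direction
        move = np.eye(2)
        move[i, j] = rng.normal(scale=0.05)        # elementary shear move
        candidate = move @ s
        if np.any(candidate < 0):                  # non-negativity as a hard constraint
            continue
        new_cost = histogram_mi(candidate[0], candidate[1])
        # Metropolis rule: accept improvements, occasionally accept worse moves.
        if new_cost < cost or rng.random() < np.exp(-(new_cost - cost) / t):
            s, cost = candidate, new_cost
    return s, cost
```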


International Conference on Independent Component Analysis and Signal Separation | 2004

Reliability of ICA Estimates with Mutual Information

Harald Stögbauer; Ralph G. Andrzejak; Alexander Kraskov; Peter Grassberger

Obtaining the most independent components from a mixture (under a chosen model) is only the first part of an ICA analysis. After that, it is necessary to measure the actual dependency between the components and the reliability of the decomposition. We have to identify one- and multidimensional components (i.e., clusters of mutually dependent components) or channels which are too close to Gaussians to be reliably separated. For the determination of the dependencies we use a new, highly accurate mutual information (MI) estimator. The variability of the MI under remixing provides a measure of stability: a rapid growth of the MI under remixing identifies stable components, whereas a low variability identifies unreliable components. The method is illustrated on artificial datasets, and its usefulness for real-world applications is shown on biomedical data.
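
The remixing test lends itself to a short sketch: rotate a pair of recovered components by increasing angles and watch how their pairwise MI changes. The function names, the angle grid, the threshold, and the pluggable `mi` argument are assumptions of this sketch; the paper relies on its high-accuracy k-nearest-neighbor MI estimator.

```python
# A minimal sketch of the remixing test described above, under simplifying
# assumptions: pairs of ICA components are remixed by plane rotations of
# increasing angle and the change of a pairwise dependence measure is recorded.
# Any MI estimate can be passed in through the `mi` argument.
import numpy as np

def remixing_curve(c1, c2, mi, angles=None):
    """Return MI as a function of the remixing angle for one component pair."""
    if angles is None:
        angles = np.linspace(0.0, np.pi / 4, 16)
    curve = []
    for phi in angles:
        a = np.cos(phi) * c1 + np.sin(phi) * c2    # remixed component pair
        b = -np.sin(phi) * c1 + np.cos(phi) * c2
        curve.append(mi(a, b))
    return np.asarray(curve)

def looks_reliable(c1, c2, mi, threshold=0.1):
    """Heuristic reading of the curve: a rapid MI increase under remixing marks a
    stably separated pair; a nearly flat curve marks an unreliable one. The
    threshold is illustrative only."""
    curve = remixing_curve(c1, c2, mi)
    return (curve.max() - curve[0]) > threshold
```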


Physical Review E | 2004

Least-dependent-component analysis based on mutual information

Harald Stögbauer; Alexander Kraskov; Sergey A. Astakhov; Peter Grassberger


EPL | 2005

Hierarchical clustering using mutual information

Alexander Kraskov; Harald Stögbauer; Ralph G. Andrzejak; Peter Grassberger


Physical Review E | 2004

Measure profile surrogates: A method to validate the performance of epileptic seizure prediction algorithms

Thomas Kreuz; Ralph G. Andrzejak; Florian Mormann; Alexander Kraskov; Harald Stögbauer; Christian E. Elger; Klaus Lehnertz; Peter Grassberger


Physical Review E | 2003

Bivariate surrogate techniques: Necessity, strengths, and caveats

Ralph G. Andrzejak; Alexander Kraskov; Harald Stögbauer; Florian Mormann; Thomas Kreuz


arXiv: Quantitative Methods | 2003

Hierarchical Clustering Based on Mutual Information

Alexander Kraskov; Harald Stögbauer; Ralph G. Andrzejak; Peter Grassberger


Physical Review Letters | 2004

Comment on "Linguistic analysis of the human heartbeat using frequency and rank order statistics".

Alexander Kraskov; Walter Nadler; Harald Stögbauer; Peter Grassberger


Physical Review E | 2011

Erratum: Estimating mutual information [Phys. Rev. E 69, 066138 (2004)]

Alexander Kraskov; Harald Stögbauer; Peter Grassberger

Collaboration


Dive into Harald Stögbauer's collaborations.

Top Co-Authors

Walter Nadler
Forschungszentrum Jülich

Thomas Kreuz
University of California

Albert C. Yang
Beth Israel Deaconess Medical Center