Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Mikkel N. Schmidt is active.

Publication


Featured research published by Mikkel N. Schmidt.


International Conference on Independent Component Analysis and Signal Separation | 2006

Nonnegative matrix factor 2-D deconvolution for blind single channel source separation

Mikkel N. Schmidt; Morten Mørup

We present a novel method for blind separation of instruments in single channel polyphonic music based on a non-negative matrix factor 2-D deconvolution algorithm. The method is an extension of NMFD recently introduced by Smaragdis [1]. Using a model which is convolutive in both time and frequency, we factorize a spectrogram representation of music into components corresponding to individual instruments. Based on this factorization, we separate the instruments using spectrogram masking. The proposed algorithm has applications in computational auditory scene analysis, music information retrieval, and automatic music transcription.
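To make the model structure concrete, here is a minimal NumPy sketch of the 2-D convolutive reconstruction and the spectrogram-masking separation step described above. It is an illustrative stand-in, not the paper's multiplicative-update learning algorithm; the array shapes and function names are assumptions.

```python
import numpy as np

def shift_down(W, phi):
    """Shift rows of W down by phi bins (pitch shift on a log-frequency axis)."""
    if phi == 0:
        return W.copy()
    out = np.zeros_like(W)
    out[phi:, :] = W[:-phi, :]
    return out

def shift_right(H, tau):
    """Shift columns of H right by tau frames (time shift)."""
    if tau == 0:
        return H.copy()
    out = np.zeros_like(H)
    out[:, tau:] = H[:, :-tau]
    return out

def nmf2d_reconstruct(W, H):
    """Model spectrogram: Lambda = sum_{tau,phi} shift_down(W[tau], phi) @ shift_right(H[phi], tau).

    W: array (T_w, F, K)  -- time-convolutive spectral templates
    H: array (Phi, K, T)  -- pitch-convolutive activations
    """
    T_w, F, K = W.shape
    Phi, _, T = H.shape
    Lam = np.zeros((F, T))
    for tau in range(T_w):
        for phi in range(Phi):
            Lam += shift_down(W[tau], phi) @ shift_right(H[phi], tau)
    return Lam

def separate_by_masking(V, W, H, eps=1e-12):
    """Soft-mask each component's contribution out of the observed spectrogram V."""
    K = W.shape[2]
    parts = np.stack([nmf2d_reconstruct(W[:, :, [k]], H[:, [k], :]) for k in range(K)])
    total = parts.sum(axis=0) + eps
    return parts / total * V  # per-instrument masked spectrograms
```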


International Conference on Independent Component Analysis and Signal Separation | 2009

Bayesian Non-negative Matrix Factorization

Mikkel N. Schmidt; Ole Winther; Lars Kai Hansen

We present a Bayesian treatment of non-negative matrix factorization (NMF), based on a normal likelihood and exponential priors, and derive an efficient Gibbs sampler to approximate the posterior density of the NMF factors. On a chemical brain imaging data set, we show that this improves interpretability by providing uncertainty estimates. We discuss how the Gibbs sampler can be used for model order selection by estimating the marginal likelihood, and compare with the Bayesian information criterion. For computing the maximum a posteriori estimate we present an iterated conditional modes algorithm that rivals existing state-of-the-art NMF algorithms on an image feature extraction problem.
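A toy Gibbs sweep under the stated model (normal likelihood, exponential priors) might look as follows; the column-wise rectified-normal updates, the rate parameters lam_w and lam_h, and the flat prior on the noise variance are assumptions of this sketch rather than details taken from the paper.

```python
import numpy as np
from scipy.stats import truncnorm

def sample_rectified_normal(mu, sigma):
    """Draw from a Normal(mu, sigma^2) truncated to [0, inf)."""
    a = (0.0 - mu) / sigma
    return truncnorm.rvs(a, np.inf, loc=mu, scale=sigma)

def gibbs_nmf(V, K, n_iter=500, lam_w=1.0, lam_h=1.0, rng=None):
    """Toy Gibbs sampler for V ~ Normal(W @ H, sigma2) with Exponential priors on W, H.

    Sweeps each component, sampling columns of W and rows of H from their
    rectified-normal full conditionals, then samples sigma2 from its
    inverse-gamma conditional (flat prior on the variance assumed).
    """
    rng = np.random.default_rng(rng)
    F, T = V.shape
    W = rng.exponential(1.0, (F, K))
    H = rng.exponential(1.0, (K, T))
    sigma2 = 1.0
    samples = []
    for _ in range(n_iter):
        for k in range(K):
            # residual with component k removed
            R = V - W @ H + np.outer(W[:, k], H[k, :])
            # full conditional of column W[:, k] (elements independent given the rest)
            denom = (H[k, :] ** 2).sum() + 1e-12
            mu = (R @ H[k, :] - sigma2 * lam_w) / denom
            W[:, k] = sample_rectified_normal(mu, np.sqrt(sigma2 / denom))
            # full conditional of row H[k, :]
            denom = (W[:, k] ** 2).sum() + 1e-12
            mu = (W[:, k] @ R - sigma2 * lam_h) / denom
            H[k, :] = sample_rectified_normal(mu, np.sqrt(sigma2 / denom))
        resid = V - W @ H
        sigma2 = 1.0 / rng.gamma(F * T / 2.0, 2.0 / (resid ** 2).sum())
        samples.append((W.copy(), H.copy(), sigma2))
    return samples
```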


International Workshop on Machine Learning for Signal Processing | 2007

Wind Noise Reduction using Non-Negative Sparse Coding

Mikkel N. Schmidt; Jan Larsen; Fu-Tien Hsiao

We introduce a new speaker-independent method for reducing wind noise in single-channel recordings of noisy speech. The method is based on non-negative sparse coding and relies on a wind noise dictionary which is estimated from an isolated noise recording. We estimate the parameters of the model and discuss their sensitivity. We then compare the algorithm with the classical spectral subtraction method and the Qualcomm-ICSI-OGI noise reduction method. We optimize the sound quality in terms of signal-to-noise ratio and provide results on a noisy speech recognition task.
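The core idea, learning a noise dictionary from an isolated wind recording and keeping it fixed while factorizing the noisy speech spectrogram, can be sketched as below. This sketch uses plain Euclidean multiplicative updates and omits the sparsity penalty of the paper's sparse coding formulation; dictionary sizes and function names are illustrative.

```python
import numpy as np

def nmf(V, K, n_iter=200, W_fixed=None, rng=None):
    """Multiplicative-update NMF (Euclidean cost). If W_fixed is given, those
    leading dictionary columns are kept frozen (e.g. a pre-trained noise dictionary)."""
    rng = np.random.default_rng(rng)
    F, T = V.shape
    n_fixed = 0 if W_fixed is None else W_fixed.shape[1]
    W = rng.random((F, K)) + 1e-3
    if W_fixed is not None:
        W[:, :n_fixed] = W_fixed
    H = rng.random((K, T)) + 1e-3
    eps = 1e-12
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W_new = W * (V @ H.T) / (W @ H @ H.T + eps)
        W_new[:, :n_fixed] = W[:, :n_fixed]      # keep noise atoms frozen
        W = W_new
    return W, H

def denoise(V_noise, V_noisy, K_noise=16, K_speech=32):
    """1) Learn a wind-noise dictionary on an isolated noise spectrogram.
    2) Factorize the noisy-speech spectrogram with the noise atoms frozen and
       extra speech atoms appended, then recover speech by soft masking."""
    W_noise, _ = nmf(V_noise, K_noise)
    W, H = nmf(V_noisy, K_noise + K_speech, W_fixed=W_noise)
    speech_hat = W[:, K_noise:] @ H[K_noise:, :]
    total = W @ H + 1e-12
    return speech_hat / total * V_noisy          # masked speech spectrogram
```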


Computational Intelligence and Neuroscience | 2008

Nonnegative matrix factorization with Gaussian process priors

Mikkel N. Schmidt; Hans Laurberg

We present a general method for including prior knowledge in a nonnegative matrix factorization (NMF), based on Gaussian process priors. We assume that the nonnegative factors in the NMF are linked by a strictly increasing function to an underlying Gaussian process specified by its covariance function. This allows us to find NMF decompositions that agree with our prior knowledge of the distribution of the factors, such as sparseness, smoothness, and symmetries. The method is demonstrated with an example from chemical shift brain imaging.
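A minimal prior-predictive sketch of the construction, assuming a squared-exponential covariance and exp as the strictly increasing link (the paper's choices may differ):

```python
import numpy as np

def se_kernel(x, y, ell=1.0, sf=1.0):
    """Squared-exponential covariance k(x, y) = sf^2 exp(-(x - y)^2 / (2 ell^2))."""
    d = x[:, None] - y[None, :]
    return sf ** 2 * np.exp(-0.5 * (d / ell) ** 2)

def sample_nonneg_factor(t, ell=5.0, link=np.exp, rng=None):
    """Draw a smooth nonnegative factor: a GP sample pushed through a strictly
    increasing link function (exp here, as a stand-in for the paper's link)."""
    rng = np.random.default_rng(rng)
    K = se_kernel(t, t, ell=ell) + 1e-8 * np.eye(len(t))
    g = rng.multivariate_normal(np.zeros(len(t)), K)
    return link(g)

# Prior predictive check: smooth, nonnegative rows for H in V ~ W @ H.
t = np.arange(100.0)
H = np.stack([sample_nonneg_factor(t, ell=5.0, rng=s) for s in range(4)])
assert (H >= 0).all()
```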


Signal Processing Systems | 2011

Unmixing of Hyperspectral Images using Bayesian Non-negative Matrix Factorization with Volume Prior

Morten Arngren; Mikkel N. Schmidt; Jan Larsen

Hyperspectral imaging can be used in assessing the quality of foods by decomposing the image into constituents such as protein, starch, and water. Observed data can be considered a mixture of underlying characteristic spectra (endmembers), and estimating the constituents and their abundances requires efficient algorithms for spectral unmixing. We present a Bayesian spectral unmixing algorithm employing a volume constraint and propose an inference procedure based on Gibbs sampling. We evaluate the method on synthetic and real hyperspectral data of wheat kernels. Results show that our method performs as well as or better than existing volume-constrained methods. Further, our method gives credible intervals for the endmembers and abundances, which allows us to assess the confidence of the results.
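A hedged sketch of the underlying linear mixing model with a simplex-volume penalty on the endmembers is given below; it only evaluates a MAP-style objective, whereas the paper places a volume prior and samples the full posterior with Gibbs sampling. The penalty weight gamma and the variable names are assumptions.

```python
import numpy as np
from math import factorial

def simplex_volume(E):
    """Volume of the simplex spanned by M endmembers (rows of E, each a spectrum).

    The endmembers are projected onto their (M-1)-dimensional affine span via SVD,
    then the standard determinant formula for simplex volume is applied."""
    M = E.shape[0]
    X = E - E.mean(axis=0)
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    P = X @ Vt[: M - 1].T                      # M x (M-1) coordinates
    D = P[1:] - P[0]                           # edge vectors from the first vertex
    return abs(np.linalg.det(D)) / factorial(M - 1)

def neg_log_posterior(X, A, E, sigma2=1.0, gamma=10.0):
    """Unnormalized negative log posterior for the linear mixing model X ~ A @ E
    (A: nonnegative abundances with rows summing to one, E: endmember spectra),
    with a prior term penalizing large endmember-simplex volume."""
    resid = X - A @ E
    return (resid ** 2).sum() / (2 * sigma2) + gamma * simplex_volume(E)
```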


International Workshop on Machine Learning for Signal Processing | 2011

Infinite multiple membership relational modeling for complex networks

Morten Mørup; Mikkel N. Schmidt; Lars Kai Hansen

Learning latent structure in complex networks has become an important problem fueled by many types of networked data originating from practically all fields of science. In this paper, we propose a new non-parametric Bayesian multiple-membership latent feature model for networks. Contrary to existing multiple-membership models, which scale quadratically in the number of vertices, the proposed model scales linearly in the number of links, admitting multiple-membership analysis in large-scale networks. We demonstrate a connection between the single-membership relational model and multiple-membership models, and show on “real”-size benchmark network data that accounting for multiple memberships improves the learning of latent structure as measured by link prediction, while explicitly accounting for multiple memberships results in a more compact representation of the latent structure of networks.
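As a generative illustration of multiple-membership network modeling, the sketch below draws binary latent features from an Indian buffet process and links nodes that share active features. The particular link function, P(A_ij = 1) = 1 - exp(-z_i^T W z_j), is a common nonnegative-feature choice used here as a stand-in; it is not necessarily the paper's exact parameterization.

```python
import numpy as np

def sample_ibp(N, alpha=2.0, rng=None):
    """Indian buffet process: binary multiple-membership features for N nodes."""
    rng = np.random.default_rng(rng)
    features = []                                # one length-N column per latent feature
    for i in range(N):
        for col in features:
            m_k = col[:i].sum()                  # previous nodes with this feature
            col[i] = rng.random() < m_k / (i + 1)
        for _ in range(rng.poisson(alpha / (i + 1))):
            col = np.zeros(N, dtype=int)
            col[i] = 1
            features.append(col)
    return np.array(features).T if features else np.zeros((N, 0), dtype=int)

def sample_network(Z, scale=1.0, rng=None):
    """Generative sketch: nodes sharing more active features link more often."""
    rng = np.random.default_rng(rng)
    N, K = Z.shape
    W = rng.gamma(1.0, scale, (K, K))            # nonnegative feature-interaction weights
    P = 1.0 - np.exp(-(Z @ W @ Z.T))             # link probabilities
    A = (rng.random((N, N)) < P).astype(int)
    return np.triu(A, 1) + np.triu(A, 1).T       # undirected, no self-links
```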


IEEE Signal Processing Magazine | 2013

Nonparametric Bayesian modeling of complex networks: an introduction

Mikkel N. Schmidt; Morten Mørup

Modeling structure in complex networks using Bayesian nonparametrics makes it possible to specify flexible model structures and infer the adequate model complexity from the observed data. This article provides a gentle introduction to nonparametric Bayesian modeling of complex networks: using an infinite mixture model as a running example, we go through the steps of deriving the model as an infinite limit of a finite parametric model, inferring the model parameters by Markov chain Monte Carlo, and checking the model's fit and predictive performance. We explain how advanced nonparametric models for complex networks can be derived and point out relevant literature.
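The running example, an infinite mixture model for networks, can be sketched generatively as a Chinese restaurant process partition of the nodes with Beta-Bernoulli link probabilities between blocks. Hyperparameter values and function names below are illustrative assumptions.

```python
import numpy as np

def crp_partition(N, alpha=1.0, rng=None):
    """Chinese restaurant process: cluster assignments for N nodes."""
    rng = np.random.default_rng(rng)
    z = np.zeros(N, dtype=int)
    counts = [1]
    for i in range(1, N):
        probs = np.array(counts + [alpha], dtype=float)
        probs /= probs.sum()
        k = rng.choice(len(probs), p=probs)
        if k == len(counts):
            counts.append(1)                     # open a new cluster
        else:
            counts[k] += 1
        z[i] = k
    return z

def sample_irm_network(N, alpha=1.0, a=1.0, b=1.0, rng=None):
    """Generative sketch of an infinite-mixture (relational) network model:
    a CRP partition of the nodes, Beta(a, b) link probabilities between blocks,
    and Bernoulli edges given the block assignments."""
    rng = np.random.default_rng(rng)
    z = crp_partition(N, alpha, rng)
    K = z.max() + 1
    rho = rng.beta(a, b, (K, K))
    rho = np.triu(rho) + np.triu(rho, 1).T       # symmetric block probabilities
    P = rho[z[:, None], z[None, :]]
    A = np.triu((rng.random((N, N)) < P).astype(int), 1)
    return A + A.T, z
```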


International Workshop on Machine Learning for Signal Processing | 2009

Bayesian nonnegative matrix factorization with volume prior for unmixing of hyperspectral images

Morten Arngren; Mikkel N. Schmidt; Jan Larsen

In hyperspectral image analysis the objective is to unmix a set of acquired pixels into pure spectral signatures (endmembers) and corresponding fractional abundances. Non-negative matrix factorization (NMF) methods have received a lot of attention for this unmixing process. Many of these NMF-based unmixing algorithms rely on sparsity regularization encouraging pure spectral endmembers, but this is not optimal for certain applications, such as foods, where abundances are not sparse. The pixels will theoretically lie on a simplex, and hence the endmembers can be estimated as the vertices of the smallest enclosing simplex. In this context we present a Bayesian framework employing a volume constraint for the NMF algorithm, where the posterior distribution is sampled numerically using a Gibbs sampling procedure. We evaluate the method on synthetic and real hyperspectral data of wheat kernels.
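The geometric picture, that mixed pixels lie inside the simplex whose vertices are the endmembers, can be illustrated with a few lines of synthetic data (all sizes and names below are illustrative):

```python
import numpy as np

# Abundances drawn from a Dirichlet are nonnegative and sum to one, so every
# generated pixel lies inside the simplex spanned by the endmember spectra.
rng = np.random.default_rng(0)
M, B, P = 3, 100, 500                            # endmembers, spectral bands, pixels
E = rng.random((M, B))                           # endmember spectra (simplex vertices)
A = rng.dirichlet(np.ones(M), size=P)            # fractional abundances (simplex points)
X = A @ E + 0.01 * rng.standard_normal((P, B))   # observed hyperspectral pixels
assert np.allclose(A.sum(axis=1), 1.0) and (A >= 0).all()
```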


European Conference on Optical Communication | 2012

Nonlinear impairment compensation using expectation maximization for dispersion managed and unmanaged PDM 16-QAM transmission

Darko Zibar; Ole Winther; Niccolo Franceschi; Robert Borkowski; Antonio Caballero; Valeria Arlunno; Mikkel N. Schmidt; Neil Guerrero Gonzales; Bangning Mao; Yabin Ye; Knud J. Larsen; Idelfonso Tafur Monroy

In this paper, we show numerically and experimentally that the expectation maximization (EM) algorithm is a powerful tool in combating system impairments such as fibre nonlinearities, in-phase and quadrature (I/Q) modulator imperfections, and laser linewidth. The EM algorithm is an iterative algorithm that can be used to compensate for impairments which leave an imprint on the signal constellation, i.e. rotation and distortion of the constellation points. The EM is especially effective for combating non-linear phase noise (NLPN), because NLPN severely distorts the signal constellation and this distortion can be tracked by the EM. The gain in nonlinear system tolerance for the system under consideration is shown to depend on the transmission scenario. We show experimentally that for a dispersion managed polarization multiplexed 16-QAM system at 14 Gbaud, a gain in nonlinear system tolerance of up to 3 dB can be obtained. For a dispersion unmanaged system, this gain reduces to 0.5 dB.
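Conceptually, the EM step amounts to fitting a Gaussian mixture to the received constellation so that the learned means and covariances absorb rotation and distortion of the constellation points. The sketch below uses scikit-learn's GaussianMixture on a toy 16-QAM signal as a stand-in for the paper's receiver-side implementation; the noise levels and initialization are assumptions.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Ideal 16-QAM constellation (power normalization omitted for brevity).
levels = np.array([-3.0, -1.0, 1.0, 3.0])
ideal = np.array([[i, q] for i in levels for q in levels])

def em_demodulate(received, n_iter=100):
    """Fit a 16-component Gaussian mixture to the received symbols so that the
    means/covariances track constellation rotation and distortion (e.g. from
    nonlinear phase noise), then decide symbols by posterior probability."""
    X = np.column_stack([received.real, received.imag])
    gmm = GaussianMixture(n_components=16, covariance_type="full",
                          means_init=ideal, max_iter=n_iter)
    gmm.fit(X)
    return gmm.predict(X), gmm

# Toy usage: ideal symbols with a constant phase rotation plus AWGN.
rng = np.random.default_rng(0)
tx = ideal[rng.integers(0, 16, 5000)]
rx = (tx[:, 0] + 1j * tx[:, 1]) * np.exp(1j * 0.1) \
     + 0.2 * (rng.standard_normal(5000) + 1j * rng.standard_normal(5000))
labels, model = em_demodulate(rx)
```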


International Conference on Machine Learning | 2009

Function factorization using warped Gaussian processes

Mikkel N. Schmidt

We introduce a new approach to non-linear regression called function factorization, which is suitable for problems where an output variable can reasonably be modeled by a number of multiplicative interaction terms between non-linear functions of the inputs. The idea is to approximate a complicated function on a high-dimensional space by a sum of products of simpler functions on lower-dimensional subspaces. Function factorization can be seen as a generalization of matrix and tensor factorization methods, in which the data are approximated by a sum of outer products of vectors. We present a non-parametric Bayesian approach to function factorization where the priors over the factorizing functions are warped Gaussian processes, and we do inference using Hamiltonian Markov chain Monte Carlo. We demonstrate the superior predictive performance of the method on a food science data set compared to Gaussian process regression and tensor factorization using PARAFAC and GEMANOVA models.
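A generative sketch of the model on a 2-D grid, assuming squared-exponential GP covariances and exp as the warping function (stand-ins for the paper's warped GP construction), is shown below; the paper fits such factor functions to data with Hamiltonian Markov chain Monte Carlo rather than drawing them from the prior.

```python
import numpy as np

def se_cov(x, ell=1.0):
    """Squared-exponential covariance matrix for inputs x."""
    d = x[:, None] - x[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

def sample_function_factorization(x, y, K=2, warp=np.exp, rng=None):
    """Generative sketch of function factorization on a grid:
    f(x, y) = sum_k u_k(x) * v_k(y), where each factor function is a Gaussian
    process draw pushed through a warping function."""
    rng = np.random.default_rng(rng)
    Kx = se_cov(x, ell=2.0) + 1e-8 * np.eye(len(x))
    Ky = se_cov(y, ell=2.0) + 1e-8 * np.eye(len(y))
    U = warp(rng.multivariate_normal(np.zeros(len(x)), Kx, size=K))  # K x len(x)
    V = warp(rng.multivariate_normal(np.zeros(len(y)), Ky, size=K))  # K x len(y)
    return U.T @ V                                # f evaluated on the grid

x = np.linspace(0, 10, 50)
y = np.linspace(0, 10, 40)
F = sample_function_factorization(x, y, K=2, rng=1)   # 50 x 40 surface
```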

Collaboration


Dive into Mikkel N. Schmidt's collaboration.

Top Co-Authors

Morten Mørup
Technical University of Denmark

Tue Herlau
Technical University of Denmark

Jan Larsen
Technical University of Denmark

Kristoffer Jon Albers
Technical University of Denmark

Lars Kai Hansen
Technical University of Denmark

Ole Winther
Technical University of Denmark

Rasmus Røge
Technical University of Denmark