Publications

Featured research published by Stephan Mandt.


Computer Vision and Pattern Recognition | 2017

Factorized Variational Autoencoders for Modeling Audience Reactions to Movies

Zhiwei Deng; Rajitha Navarathna; Peter W. Carr; Stephan Mandt; Yisong Yue; Iain A. Matthews; Greg Mori

Matrix and tensor factorization methods are often used for finding underlying low-dimensional patterns from noisy data. In this paper, we study non-linear tensor factorization methods based on deep variational autoencoders. Our approach is well-suited for settings where the relationship between the latent representation to be learned and the raw data representation is highly complex. We apply our approach to a large dataset of facial expressions of movie-watching audiences (over 16 million faces). Our experiments show that compared to conventional linear factorization methods, our method achieves better reconstruction of the data, and further discovers interpretable latent factors.
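The paper's full model is a deep variational autoencoder with a factorized latent space; as a minimal sketch of the underlying idea, nonlinear low-rank factorization fit by gradient descent, here is a toy version with a tanh "decoder" standing in for the deep network (all sizes and the data-generating process are illustrative, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the audience-reaction data: a noisy rank-2 matrix
# passed through a nonlinearity (all sizes here are illustrative).
U_true = rng.normal(size=(30, 2))
V_true = rng.normal(size=(40, 2))
X = np.tanh(U_true @ V_true.T) + 0.05 * rng.normal(size=(30, 40))

# Nonlinear factorization x_ij ≈ tanh(u_i · v_j), fit by gradient descent.
U = 0.1 * rng.normal(size=(30, 2))
V = 0.1 * rng.normal(size=(40, 2))
lr = 0.2
baseline = np.mean(X**2)            # error of predicting all zeros
for _ in range(2000):
    P = np.tanh(U @ V.T)            # current reconstruction
    R = P - X                       # residual
    G = R * (1.0 - P**2)            # backprop through the tanh
    U -= lr * (G @ V) / X.shape[1]
    V -= lr * (G.T @ U) / X.shape[0]

mse = np.mean((np.tanh(U @ V.T) - X) ** 2)
print(mse < baseline)
```

A linear factorization (dropping the tanh) would have to spend extra rank approximating the nonlinearity; letting the decoder be nonlinear is the contrast the abstract draws, though the paper additionally makes the model variational and interpretable.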


New Journal of Physics | 2015

Stochastic differential equations for quantum dynamics of spin-boson networks

Stephan Mandt; Darius Sadri; Andrew Houck; Hakan E. Türeci

The quantum dynamics of open many-body systems poses a challenge for computational approaches. Here we develop a stochastic scheme based on the positive P phase-space representation to study the nonequilibrium dynamics of coupled spin-boson networks that are driven and dissipative. Such problems are at the forefront of experimental research in cavity and solid state realizations of quantum optics, as well as cold atom physics, trapped ions and superconducting circuits. We demonstrate and test our method on a driven, dissipative two-site system, each site involving a spin coupled to a photonic mode, with photons hopping between the sites, where we find good agreement with Monte Carlo Wavefunction simulations. In addition to numerically reproducing features recently observed in an experiment [Phys. Rev. X 4, 031043 (2014)], we also predict a novel steady state quantum dynamical phase transition for an asymmetric configuration of drive and dissipation.
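The positive-P equations themselves are model-specific, but the computational backbone of such a scheme is Euler–Maruyama integration of stochastic differential equations with observables averaged over many trajectories. A hedged sketch on a single damped, driven mode (an Ornstein–Uhlenbeck stand-in, not the paper's spin-boson equations):

```python
import numpy as np

rng = np.random.default_rng(1)

# Euler–Maruyama integration of dα = (−γ α + f) dt + σ dW for many
# trajectories; stands in for the model-specific positive-P equations.
gamma, f, sigma = 1.0, 0.5, 0.2
dt, steps, ntraj = 0.01, 2000, 4000

alpha = np.zeros(ntraj)
for _ in range(steps):
    dW = rng.normal(scale=np.sqrt(dt), size=ntraj)
    alpha += (-gamma * alpha + f) * dt + sigma * dW

# Observables are trajectory averages; the steady-state mean is f/γ = 0.5
# and the stationary variance is σ²/(2γ) = 0.02.
mean_alpha = alpha.mean()
print(round(mean_alpha, 2))
```

Total integration time (20 relaxation times 1/γ) ensures the ensemble has reached steady state before averaging; in the positive-P setting the drift and noise terms are complex and coupled across sites, but the loop structure is the same.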


Machine Learning | 2017

Sparse probit linear mixed model

Stephan Mandt; Florian Wenzel; Shinichi Nakajima; John P. Cunningham; Christoph Lippert; Marius Kloft

Linear mixed models (LMMs) are important tools in statistical genetics. When used for feature selection, they allow one to find a sparse set of genetic traits that best predict a continuous phenotype of interest, while simultaneously correcting for various confounding factors such as age, ethnicity, and population structure. Formulated as models for linear regression, LMMs have been restricted to continuous phenotypes. We introduce the sparse probit linear mixed model (Probit-LMM), which generalizes the LMM modeling paradigm to binary phenotypes. As a technical challenge, the model no longer possesses a closed-form likelihood function. In this paper, we present a scalable approximate inference algorithm that lets us fit the model to high-dimensional data sets. We show on three real-world examples from different domains that in the setup of binary labels, our algorithm leads to higher prediction accuracy and selects features that are less correlated with the confounding factors.
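At the core of the model is the probit link, P(y = 1) = Φ(x·w). The sketch below fits a plain probit regression by gradient ascent on a toy dataset; the paper's actual model additionally includes correlated noise terms for the confounders and a sparsity-inducing prior, and its scalable approximate inference algorithm is not reproduced here.

```python
import numpy as np
from math import erf, sqrt, pi

rng = np.random.default_rng(2)

def ncdf(z):
    # Standard normal CDF: the probit link.
    return 0.5 * (1.0 + np.vectorize(erf)(z / sqrt(2.0)))

def npdf(z):
    # Standard normal density.
    return np.exp(-0.5 * z * z) / sqrt(2.0 * pi)

# Toy binary-phenotype data drawn from a probit model (no confounders).
n, w_true = 400, np.array([1.5, -1.0, 0.5])
X = rng.normal(size=(n, 3))
y = (rng.uniform(size=n) < ncdf(X @ w_true)).astype(float)

# Maximum-likelihood fit by gradient ascent on the probit log-likelihood.
w = np.zeros(3)
for _ in range(500):
    z = X @ w
    p = np.clip(ncdf(z), 1e-6, 1 - 1e-6)
    # d/dz of the log-likelihood: φ(z)(y − Φ(z)) / (Φ(z)(1 − Φ(z)))
    dz = npdf(z) * (y - p) / (p * (1 - p))
    w += 0.05 * (X.T @ dz) / n

acc = np.mean((ncdf(X @ w) > 0.5) == (y > 0.5))
print(w.round(2), round(acc, 2))
```

The probit log-likelihood is concave in w, so plain gradient ascent suffices on this toy problem; the paper's difficulty arises because marginalizing the correlated-noise component destroys this closed form.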


European Conference on Machine Learning | 2016

Huber-Norm Regularization for Linear Prediction Models

Oleksandr Zadorozhnyi; Gunthard Benecke; Stephan Mandt; Tobias Scheffer; Marius Kloft

In order to avoid overfitting, it is common practice to regularize linear prediction models using squared or absolute-value norms of the model parameters. In this article we study a new method of regularization: Huber-norm regularization imposes a combination of ℓ1-norm and ℓ2-norm regularization on the model parameters.
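A coordinate-wise Huber penalty behaves quadratically (ℓ2-like) for small weights and linearly (ℓ1-like) for large ones. A small sketch, with the threshold δ = 1 chosen purely for illustration (the paper's exact weighting of the two regimes is not reproduced here):

```python
import numpy as np

def huber_norm(w, delta=1.0):
    # Coordinate-wise Huber penalty: quadratic (ℓ2-like) for |w_j| ≤ δ,
    # linear (ℓ1-like) beyond, so the two norms are blended smoothly.
    a = np.abs(w)
    return np.sum(np.where(a <= delta, 0.5 * a**2, delta * (a - 0.5 * delta)))

w = np.array([0.2, -0.5, 3.0])
# Quadratic branch: 0.5·0.2² + 0.5·0.5² = 0.02 + 0.125; linear branch: 1·(3 − 0.5) = 2.5
print(huber_norm(w))
```

Both the penalty and its derivative are continuous at |w_j| = δ (the quadratic branch has slope δ there, matching the linear branch), which is what makes the blend smooth rather than a hard switch between the two norms.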


Physical Review A | 2013

Relaxation towards negative temperatures in bosonic systems: Generalized Gibbs ensembles and beyond integrability

Stephan Mandt; Adrian E. Feiguin; Salvatore R. Manmana


Physical Review A | 2014

Damping of Bloch oscillations: Variational solutions of the Boltzmann equation beyond linear response

Stephan Mandt


Journal of Machine Learning Research | 2017

Stochastic Gradient Descent as Approximate Bayesian Inference

Stephan Mandt; Matthew D. Hoffman; David M. Blei


Neural Information Processing Systems | 2016

Exponential Family Embeddings

Maja R. Rudolph; Francisco J. R. Ruiz; Stephan Mandt; David M. Blei


Neural Information Processing Systems | 2014

Smoothed Gradients for Stochastic Variational Inference

Stephan Mandt; David M. Blei


International Conference on Machine Learning | 2016

A variational analysis of stochastic gradient algorithms

Stephan Mandt; Matthew D. Hoffman; David M. Blei

Collaboration

Dive into Stephan Mandt's collaborations.

Top Co-Authors

Cheng Zhang
Royal Institute of Technology

Marius Kloft
Humboldt University of Berlin

Florian Wenzel
Humboldt University of Berlin

Hedvig Kjellström
Royal Institute of Technology

Yisong Yue
California Institute of Technology