Aaron Klein
University of Freiburg
Publications
Featured research published by Aaron Klein.
European Conference on Computer Vision | 2018
Eddy Ilg; Özgün Çiçek; Silvio Galesso; Aaron Klein; Osama Makansi; Frank Hutter; Thomas Brox
Optical flow estimation can be formulated as an end-to-end supervised learning problem, which yields estimates with a superior accuracy-runtime tradeoff compared to alternative methodologies. In this paper, we make such networks estimate their local uncertainty about the correctness of their prediction, which is vital information when building decisions on top of the estimates. For the first time we compare several strategies and techniques to estimate uncertainty in a large-scale computer vision task like optical flow estimation. Moreover, we introduce a new network architecture and loss function that enforce complementary hypotheses and provide uncertainty estimates efficiently with a single forward pass and without the need for sampling or ensembles. We demonstrate the quality of the uncertainty estimates, which is clearly above previous confidence measures on optical flow and allows for interactive frame rates.
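The idea of obtaining uncertainty estimates in a single forward pass, without sampling or ensembles, can be illustrated with a generic heteroscedastic loss: the network outputs both a prediction and a per-sample log-variance, and the loss down-weights errors where the predicted variance is high while penalizing inflated variance. This is a minimal sketch of that general technique in numpy, not the paper's multi-hypothesis architecture; the function name and toy values are illustrative.

```python
import numpy as np

def heteroscedastic_nll(pred, log_var, target):
    """Gaussian negative log-likelihood with a predicted per-sample variance.

    Minimizing this trains a model to output both an estimate `pred` and a
    log-variance `log_var` acting as an uncertainty measure: squared errors
    are down-weighted where predicted variance is high, while the additive
    log-variance term discourages inflating the variance everywhere.
    """
    sq_err = (pred - target) ** 2
    return np.mean(0.5 * np.exp(-log_var) * sq_err + 0.5 * log_var)

# A confident (low-variance) wrong prediction costs more than an
# equally wrong but uncertain (high-variance) one.
confident = heteroscedastic_nll(np.array([2.0]), np.array([0.0]), np.array([0.0]))
uncertain = heteroscedastic_nll(np.array([2.0]), np.array([2.0]), np.array([0.0]))
assert uncertain < confident
```

Because the uncertainty is a direct network output, evaluating it costs nothing beyond the usual forward pass, which is what makes interactive frame rates possible.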
Electronic Journal of Statistics | 2017
Aaron Klein; Stefan Falkner; Simon Bartels; Philipp Hennig; Frank Hutter
Bayesian optimization has become a successful tool for optimizing the hyperparameters of machine learning algorithms, such as support vector machines or deep neural networks. Despite its success, for large datasets, training and validating a single configuration often takes hours, days, or even weeks, which limits the achievable performance. To accelerate hyperparameter optimization, we propose a generative model for the validation error as a function of training set size, which is learned during the optimization process and allows exploring promising configurations on small subsets and extrapolating their performance to the full dataset. We construct a Bayesian optimization procedure, dubbed Fabolas, which models loss and training time as a function of dataset size and automatically trades off high information gain about the global optimum against computational cost. Experiments optimizing support vector machines and deep neural networks show that Fabolas often finds high-quality solutions 10 to 100 times faster than other state-of-the-art Bayesian optimization methods or the recently proposed bandit strategy Hyperband.
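The core premise — that validation error on small training subsets is predictive of error on the full dataset — can be sketched with a simple parametric learning curve. The snippet below fits a power law err(s) = a · s^(-b) to subset results and extrapolates to the full dataset size; this only conveys the spirit of modeling loss as a function of dataset size, whereas Fabolas itself uses a Bayesian (Gaussian-process) model with principled uncertainty. Function name and data are illustrative.

```python
import numpy as np

def extrapolate_error(sizes, errors, full_size):
    """Fit a power-law learning curve err(s) = a * s**(-b) to validation
    errors measured on training subsets, then extrapolate to `full_size`.

    Taking logs makes the fit linear: log(err) = log(a) - b * log(s),
    so an ordinary least-squares line fit suffices.
    """
    slope, log_a = np.polyfit(np.log(sizes), np.log(errors), 1)
    # slope corresponds to -b in err = a * s**(-b)
    return np.exp(log_a) * full_size ** slope

# Synthetic learning curve following err(s) = 2 * s**(-0.3).
sizes = np.array([100, 1000, 10000])
errors = 2.0 * sizes ** -0.3
est = extrapolate_error(sizes, errors, 100000)
assert est < errors[-1]  # more data should give lower predicted error
```

Evaluating configurations on cheap subsets and extrapolating like this is what lets the optimizer discard poor configurations without ever paying for a full training run.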
Neural Information Processing Systems | 2015
Matthias Feurer; Aaron Klein; Katharina Eggensperger; Jost Tobias Springenberg; Manuel Blum; Frank Hutter
Neural Information Processing Systems | 2016
Jost Tobias Springenberg; Aaron Klein; Stefan Falkner; Frank Hutter
International Conference on Artificial Intelligence and Statistics | 2017
Aaron Klein; Stefan Falkner; Simon Bartels; Philipp Hennig; Frank Hutter
International Conference on Machine Learning | 2016
Hector Mendoza; Aaron Klein; Matthias Feurer; Jost Tobias Springenberg; Frank Hutter
International Conference on Learning Representations | 2017
Aaron Klein; Stefan Falkner; Jost Tobias Springenberg; Frank Hutter
arXiv | 2018
Eddy Ilg; Özgün Çiçek; Silvio Galesso; Aaron Klein; Osama Makansi; Frank Hutter; Thomas Brox
International Conference on Machine Learning | 2018
Stefan Falkner; Aaron Klein; Frank Hutter
arXiv: Learning | 2018
Arber Zela; Aaron Klein; Stefan Falkner; Frank Hutter