
Publication


Featured research published by Nikolaus Hansen.


Parallel Problem Solving from Nature | 2008

A Simple Modification in CMA-ES Achieving Linear Time and Space Complexity

Raymond Ros; Nikolaus Hansen

This paper proposes a simple modification of the Covariance Matrix Adaptation Evolution Strategy (CMA-ES) for high-dimensional objective functions, reducing the internal time and space complexity from quadratic to linear. The covariance matrix is constrained to be diagonal and the resulting algorithm, sep-CMA-ES, samples each coordinate independently. Because the model complexity is reduced, the learning rate for the covariance matrix can be increased. Consequently, on essentially separable functions, sep-CMA-ES significantly outperforms CMA-ES. For dimensions larger than a hundred, even on the non-separable Rosenbrock function, sep-CMA-ES needs fewer function evaluations than CMA-ES.
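The diagonal-covariance idea described in the abstract can be sketched in a few lines (a minimal illustration, not the authors' reference implementation; function and variable names are invented here):

```python
import numpy as np

def sample_sep(mean, sigma, diag_c, rng):
    """Sample one candidate with a diagonal covariance matrix.

    With C restricted to diag(diag_c), each coordinate is drawn
    independently, so sampling (and the covariance update) costs
    O(n) per candidate instead of the quadratic cost of storing
    and applying a full covariance matrix.
    """
    n = len(mean)
    z = rng.standard_normal(n)                 # independent standard normals
    return mean + sigma * np.sqrt(diag_c) * z  # per-coordinate scaling only

rng = np.random.default_rng(0)
x = sample_sep(np.zeros(5), 0.5, np.ones(5), rng)
```

Restricting C to a diagonal is exactly what makes the method fast on separable problems and is why the learning rate can be increased: far fewer parameters are being estimated.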


Machine Learning | 2009

Efficient covariance matrix update for variable metric evolution strategies

Thorsten Suttorp; Nikolaus Hansen; Christian Igel

Randomized direct search algorithms for continuous domains, such as evolution strategies, are basic tools in machine learning. They are especially needed when the gradient of an objective function (e.g., loss, energy, or reward function) cannot be computed or estimated efficiently. Application areas include supervised and reinforcement learning as well as model selection. These randomized search strategies often rely on normally distributed additive variations of candidate solutions. In order to efficiently search in non-separable and ill-conditioned landscapes the covariance matrix of the normal distribution must be adapted, amounting to a variable metric method. Consequently, covariance matrix adaptation (CMA) is considered state-of-the-art in evolution strategies. In order to sample the normal distribution, the adapted covariance matrix needs to be decomposed, requiring in general Θ(n³) operations, where n is the search space dimension. We propose a new update mechanism which can replace a rank-one covariance matrix update and the computationally expensive decomposition of the covariance matrix. The newly developed update rule reduces the computational complexity of the rank-one covariance matrix adaptation to Θ(n²) without resorting to outdated distributions. We derive new versions of the elitist covariance matrix adaptation evolution strategy (CMA-ES) and the multi-objective CMA-ES. These algorithms are equivalent to the original procedures except that the update step for the variable metric distribution scales better in the problem dimension. We also introduce a simplified variant of the non-elitist CMA-ES with the incremental covariance matrix update and investigate its performance. Apart from the reduced time complexity of the distribution update, the algebraic computations involved in all new algorithms are simpler compared to the original versions. The new update rule improves the performance of the CMA-ES for large scale machine learning problems in which the objective function can be evaluated fast.
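The general mechanism the abstract describes, updating a factor of the covariance matrix directly so that no fresh decomposition is needed after a rank-one change, can be sketched as follows. This is an illustrative rank-one factor update under the standard identity C' = αC + βvvᵀ, not the paper's exact derivation; names are invented here, and the stated O(n²) cost assumes the factor is kept triangular so the solve is a triangular solve:

```python
import numpy as np

def rank_one_factor_update(A, v, alpha, beta):
    """Update a factor A with C = A @ A.T so that the returned A'
    satisfies A' @ A'.T = alpha * C + beta * outer(v, v).

    This sidesteps re-decomposing the covariance matrix after a
    rank-one adaptation step. v is assumed to be of the form A @ w
    (as in CMA, where the update direction is generated through A);
    w is recovered by a linear solve, O(n^2) if A is triangular.
    """
    w = np.linalg.solve(A, v)      # w = A^{-1} v
    w2 = w @ w                     # squared norm of w
    s = np.sqrt(alpha)
    k = s / w2 * (np.sqrt(1.0 + (beta / alpha) * w2) - 1.0)
    return s * A + k * np.outer(v, w)
```

One can check algebraically that (sA + k v wᵀ)(sA + k v wᵀ)ᵀ expands to αC + βvvᵀ, which is what makes the update exact rather than an approximation with outdated distributions.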


Symposium on Experimental and Efficient Algorithms | 2009

Experimental Comparisons of Derivative Free Optimization Algorithms

A. Auger; Nikolaus Hansen; J. M. Perez Zerpa; Raymond Ros; M. Schoenauer

In this paper, the performance of the quasi-Newton BFGS algorithm, the NEWUOA derivative-free optimizer, the Covariance Matrix Adaptation Evolution Strategy (CMA-ES), the Differential Evolution (DE) algorithm and Particle Swarm Optimizers (PSO) is compared experimentally on benchmark functions reflecting important challenges encountered in real-world optimization problems. In particular, the dependence of performance on the conditioning of the problem and on the rotational invariance of the algorithms is investigated.
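The two properties the comparison probes, conditioning and rotational invariance, can be illustrated with a generic rotated ellipsoid test function (a sketch only; the condition number and random rotation below are illustrative, not the paper's exact test suite):

```python
import numpy as np

def rotated_ellipsoid(n, cond=1e6, seed=0):
    """Return f(x) = (R x)^T D (R x) with condition number `cond`.

    A rotation-invariant algorithm behaves identically whether or
    not the random rotation R is applied; coordinate-wise methods
    typically degrade on the rotated (non-separable) version.
    """
    rng = np.random.default_rng(seed)
    # random orthogonal matrix via QR decomposition
    r, _ = np.linalg.qr(rng.standard_normal((n, n)))
    scales = cond ** (np.arange(n) / (n - 1))  # eigenvalues from 1 to cond
    def f(x):
        y = r @ x
        return float(np.sum(scales * y * y))
    return f

f = rotated_ellipsoid(10)
```

Varying `cond` from 1 to very large values sweeps the problem from the sphere to a severely ill-conditioned ellipsoid, which is how conditioning dependence is typically exposed.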


Parallel Problem Solving from Nature | 2008

Adaptive Encoding: How to Render Search Coordinate System Invariant

Nikolaus Hansen

This paper describes Adaptive Encoding, a method for rendering search independent of the coordinate system. Adaptive Encoding is applicable to any iterative search algorithm and employs incremental changes of the representation of solutions. One attractive way to change the representation in the continuous domain is derived from Covariance Matrix Adaptation (CMA). In this case, adaptive encoding recovers the CMA evolution strategy when applied to an evolution strategy with cumulative step-size control. Consequently, adaptive encoding provides the means to apply CMA-like representation changes to any search algorithm in the continuous domain. Experimental results confirm the expectation that CMA-based adaptive encoding will generally speed up a typical evolutionary algorithm on non-separable, ill-conditioned problems by orders of magnitude.
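The core mechanics of such a representation change can be sketched generically: the search algorithm works in encoded coordinates y while the objective is always evaluated at x = B y. This is an illustrative sketch with invented names, and B is held fixed here, whereas in the paper the encoding is adapted incrementally by a CMA-like rule:

```python
import numpy as np

def encode_objective(f, B):
    """Wrap an objective so a search algorithm operates in encoded
    coordinates y, while f is always evaluated at x = B @ y.

    Changing B changes the representation of solutions without
    modifying the underlying search algorithm at all.
    """
    return lambda y: f(B @ y)

# A coordinate-wise algorithm applied to g sees a well-scaled problem
# whenever B compensates for the objective's bad conditioning.
f = lambda x: x[0]**2 + 1e6 * x[1]**2
B = np.diag([1.0, 1e-3])     # illustrative fixed encoding
g = encode_objective(f, B)
```

In the encoded space, g is close to the sphere function, which is why a good encoding can speed up an otherwise coordinate-dependent algorithm dramatically.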


Genetic and Evolutionary Computation Conference | 2009

Benchmarking the (1+1)-CMA-ES on the BBOB-2009 function testbed

Anne Auger; Nikolaus Hansen

The (1+1)-CMA-ES is an adaptive stochastic algorithm for the optimization of objective functions defined on a continuous search space in a black-box scenario. In this paper, an independent restart version of the (1+1)-CMA-ES is implemented and benchmarked on the BBOB-2009 noise-free testbed. The maximum number of function evaluations per run is set to 10⁴ times the search space dimension. The algorithm solves 23, 13 and 12 of 24 functions in dimension 2, 10 and 40, respectively.
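An independent-restart wrapper of the kind used here can be sketched as follows (a generic sketch, assuming a single-run optimizer `minimize_once` that reports its own evaluation count; this is not the benchmarked implementation):

```python
import numpy as np

def restarted_minimize(minimize_once, dim, budget_factor=10**4):
    """Independent-restart wrapper: rerun a stochastic optimizer
    from scratch until a total budget of budget_factor * dim
    function evaluations is spent, keeping the best result found.

    minimize_once(dim, max_evals) must return (x, f(x), evals_used).
    """
    budget = budget_factor * dim
    used, best_x, best_f = 0, None, np.inf
    while used < budget:
        x, fval, evals = minimize_once(dim, budget - used)
        used += max(evals, 1)          # guard against zero-eval runs
        if fval < best_f:
            best_x, best_f = x, fval
    return best_x, best_f
```

Because the restarts are independent (no state carried over), a run that stagnates in one basin does not poison later runs, which is what makes this a fair way to benchmark a local stochastic method on multimodal testbed functions.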


Archive | 2010

Real-Parameter Black-Box Optimization Benchmarking 2009: Experimental Setup

Nikolaus Hansen; Anne Auger; Raymond Ros


Archive | 2008

Real-Parameter Black-Box Optimization Benchmarking 2009: Noiseless Functions Definitions

Nikolaus Hansen; Raymond Ros; Anne Auger


Archive | 2010

BBOB 2009: Comparison Tables of All Algorithms on All Noiseless Functions

Anne Auger; Steffen Finck; Nikolaus Hansen; Raymond Ros


Archive | 2007

PSO Facing Non-Separable and Ill-Conditioned Problems

Nikolaus Hansen; Raymond Ros; Nikolas Mauny; Marc Schoenauer; Anne Auger


Archive | 2011

Theory of Evolution Strategies: A New Perspective

Anne Auger; Nikolaus Hansen; Tao Team

Collaboration


Dive into Nikolaus Hansen's collaboration.

Top Co-Authors

Michèle Sebag
Centre national de la recherche scientifique

Asma Atamna
University of Paris-Sud

Steffen Finck
Vorarlberg University of Applied Sciences

Olivier Teytaud
National University of Tainan

Julien Perez
University of Paris-Sud