Proceedings of the Genetic and Evolutionary Computation Conference | 2019

Hyper-parameter tuning for the (1 + (λ, λ)) GA


Abstract


It is known that the (1 + (λ, λ)) Genetic Algorithm (GA) with self-adjusting parameter choices achieves a linear expected optimization time on OneMax if its hyper-parameters are suitably chosen. However, it is not well understood how the hyper-parameter settings influence the overall performance of the (1 + (λ, λ)) GA. Analyzing such multi-dimensional dependencies precisely is at the edge of what running time analysis can offer. To take a step forward on this question, we present an in-depth empirical study of the self-adjusting (1 + (λ, λ)) GA and its hyper-parameters. We show, among many other results, that a 15% reduction of the average running time is possible with a slightly different setup, which allows non-identical offspring population sizes in the mutation and crossover phases and more flexibility in the choice of mutation rate and crossover bias, a generalization which may be of independent interest. We also show indications that the parametrization of mutation rate and crossover bias derived by theoretical means for the static variant of the (1 + (λ, λ)) GA extends to the non-static case.
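The abstract refers to the standard coupling of the hyper-parameters to the offspring population size λ (mutation rate p = λ/n, crossover bias c = 1/λ) and to the self-adjusting choice of λ via a one-fifth success rule. For orientation, the following is a minimal Python sketch of this standard self-adjusting variant on OneMax with identical offspring population sizes in both phases; the function names, the update strength F = 1.5, and details such as tie-breaking are illustrative assumptions, not the exact setup tuned in the paper.

```python
import random


def one_max(x):
    """Fitness: number of one-bits."""
    return sum(x)


def self_adjusting_ollga(n, F=1.5, seed=None):
    """Sketch of the self-adjusting (1 + (lambda, lambda)) GA on OneMax.

    Standard parametrization: mutation rate p = lam/n, crossover bias
    c = 1/lam, identical offspring population sizes in both phases, and a
    one-fifth success rule to adapt lam. Returns the number of evaluations.
    """
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    fx = one_max(x)
    lam = 1.0
    evals = 1

    while fx < n:
        k = max(1, round(lam))           # offspring population size
        p, c = lam / n, 1.0 / lam        # mutation rate and crossover bias

        # Mutation phase: sample ell ~ Bin(n, p), flip exactly ell bits
        # in each of the k offspring, keep the best mutant.
        ell = sum(rng.random() < p for _ in range(n))
        best_mut, best_mut_f = x, fx
        for _ in range(k):
            child = x[:]
            for i in rng.sample(range(n), ell):
                child[i] ^= 1
            f = one_max(child)
            evals += 1
            if f > best_mut_f:
                best_mut, best_mut_f = child, f

        # Crossover phase: take each bit from the best mutant with
        # probability c, otherwise from the parent; keep the best offspring.
        best_y, best_y_f = x, fx
        for _ in range(k):
            child = [best_mut[i] if rng.random() < c else x[i]
                     for i in range(n)]
            f = one_max(child)
            evals += 1
            if f > best_y_f:
                best_y, best_y_f = child, f

        # Elitist selection and one-fifth success rule for lam.
        if best_y_f > fx:
            x, fx = best_y, best_y_f
            lam = max(lam / F, 1.0)
        else:
            if best_y_f == fx:
                x = best_y
            lam = min(lam * F ** 0.25, n)

    return evals
```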

DOI 10.1145/3321707.3321725
Language English
Journal Proceedings of the Genetic and Evolutionary Computation Conference
