Martin S. Krejca
Hasso Plattner Institute
Publications
Featured research published by Martin S. Krejca.
International Symposium on Algorithms and Computation | 2015
Tobias Friedrich; Timo Kötzing; Martin S. Krejca; Andrew M. Sutton
Practical optimization problems frequently include uncertainty about the quality measure, for example, due to noisy evaluations. Thus, they do not allow for a straightforward application of traditional optimization techniques. In these settings, meta-heuristics are a popular choice for deriving good optimization algorithms, most notably evolutionary algorithms, which mimic evolution in nature. Empirical evidence suggests that genetic recombination is useful in uncertain environments because it can stabilize a noisy fitness signal. With this paper, we support this claim with mathematical rigor.
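For illustration, here is a minimal Python sketch (not taken from the paper) of the setting the abstract describes: a OneMax-style fitness observed only through additive Gaussian noise, together with the uniform crossover operator whose averaging effect is argued to stabilize the noisy signal. All names and parameter values are illustrative assumptions.

```python
import random

def noisy_onemax(x, sigma=1.0):
    """OneMax (number of 1-bits) observed through additive Gaussian noise."""
    return sum(x) + random.gauss(0.0, sigma)

def uniform_crossover(a, b):
    """Each bit of the offspring is taken from either parent with prob. 1/2."""
    return [random.choice(bits) for bits in zip(a, b)]

# Example: recombine two random parents and evaluate the noisy fitness.
n = 20
parent_a = [random.randint(0, 1) for _ in range(n)]
parent_b = [random.randint(0, 1) for _ in range(n)]
print(noisy_onemax(uniform_crossover(parent_a, parent_b)))
```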
Foundations of Genetic Algorithms | 2017
Martin S. Krejca; Carsten Witt
The Univariate Marginal Distribution Algorithm (UMDA), a popular estimation of distribution algorithm, is studied from a run time perspective. On the classical OneMax benchmark function, a lower bound of Ω(μ√n + n log n), where μ is the population size, on its expected run time is proved. This is the first direct lower bound on the run time of the UMDA. It is stronger than the bounds that follow from general black-box complexity theory and is matched by the run time of many evolutionary algorithms. The results are obtained through advanced analyses of the stochastic change of the frequencies of bit values maintained by the algorithm, including carefully designed potential functions. These techniques may prove useful in advancing the field of run time analysis for estimation of distribution algorithms in general.
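As a reference for the algorithm analyzed above, the following Python sketch shows the UMDA on OneMax. The border restriction of the frequencies to [1/n, 1 - 1/n] is the standard convention; the concrete parameter values are illustrative and not taken from the paper.

```python
import random

def onemax(x):
    return sum(x)

def umda(n, lam=100, mu=50, max_iters=5000):
    """UMDA: sample lam individuals from the frequency vector p, keep the mu
    fittest, and set each frequency to the fraction of 1s among the selected,
    clamped to [1/n, 1 - 1/n] (the usual border restriction)."""
    p = [0.5] * n
    for _ in range(max_iters):
        pop = [[int(random.random() < p[i]) for i in range(n)]
               for _ in range(lam)]
        pop.sort(key=onemax, reverse=True)
        selected = pop[:mu]
        for i in range(n):
            freq = sum(x[i] for x in selected) / mu
            p[i] = min(1 - 1/n, max(1/n, freq))
        if onemax(pop[0]) == n:
            return pop[0]  # optimum sampled
    return max(pop, key=onemax)
```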
IEEE Transactions on Evolutionary Computation | 2017
Tobias Friedrich; Timo Kötzing; Martin S. Krejca; Andrew M. Sutton
Practical optimization problems frequently include uncertainty about the quality measure, for example, due to noisy evaluations. Thus, they do not allow for a straightforward application of traditional optimization techniques. In these settings, randomized search heuristics such as evolutionary algorithms are a popular choice because they are often assumed to exhibit some kind of resistance to noise. Empirical evidence suggests that some algorithms, such as estimation of distribution algorithms (EDAs), are robust against a scaling of the noise intensity, even without resorting to explicit noise-handling techniques such as resampling. In this paper, we want to support such claims with mathematical rigor. We introduce the concept of graceful scaling, in which the run time of an algorithm scales polynomially with noise intensity. We study a monotone fitness function over binary strings with additive noise taken from a Gaussian distribution. We show that myopic heuristics cannot efficiently optimize the function under arbitrarily intense noise without any explicit noise-handling. Furthermore, we prove that using a population does not help. Finally, we show that a simple EDA called the compact genetic algorithm can overcome the shortsightedness of mutation-only heuristics to scale gracefully with noise. We conjecture that recombinative genetic algorithms also have this property.
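The compact genetic algorithm (cGA) mentioned above admits a very short description; the Python sketch below shows it on noisy OneMax as an illustration, not as the paper's exact setup. The graceful-scaling result requires the hypothetical population size K to grow with the noise variance; the defaults here are placeholders.

```python
import random

def noisy_onemax(x, sigma):
    return sum(x) + random.gauss(0.0, sigma)

def cga(n, K, sigma=1.0, max_iters=10**6):
    """cGA: sample two individuals from the frequency vector, rank them by
    noisy fitness, and shift each frequency where they differ by 1/K toward
    the winner's bit value, clamped to [1/n, 1 - 1/n]."""
    p = [0.5] * n
    for _ in range(max_iters):
        x = [int(random.random() < p[i]) for i in range(n)]
        y = [int(random.random() < p[i]) for i in range(n)]
        if noisy_onemax(x, sigma) < noisy_onemax(y, sigma):
            x, y = y, x  # x is now the (apparent) winner
        for i in range(n):
            if x[i] != y[i]:
                step = 1 / K if x[i] == 1 else -1 / K
                p[i] = min(1 - 1/n, max(1/n, p[i] + step))
        if all(q >= 1 - 1/n for q in p):
            break  # frequencies have converged to the upper border
    return p
```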
Genetic and Evolutionary Computation Conference | 2016
Duc-Cuong Dang; Tobias Friedrich; Timo Kötzing; Martin S. Krejca; Per Kristian Lehre; Pietro Simone Oliveto; Dirk Sudholt; Andrew M. Sutton
Population diversity is essential for the effective use of any crossover operator. We compare seven commonly used diversity mechanisms and prove rigorous run time bounds for the (μ+1) GA using uniform crossover on the fitness function Jump_k. All previous results in this context only hold for an unrealistically low crossover probability pc = O(k/n), while we give analyses for the setting of constant pc < 1 in all but one case. Our bounds show a dependence on the problem size n, the jump length k, the population size μ, and the crossover probability pc. For the typical case of constant k > 2 and constant pc, we can compare the resulting expected optimisation times for different diversity mechanisms, assuming an optimal choice of μ: O(n^(k-1)) for duplicate elimination/minimisation, O(n^2 log n) for maximising the convex hull, O(n log n) for deterministic crowding (assuming pc = k/n), O(n log n) for maximising the Hamming distance, O(n log n) for fitness sharing, and O(n log n) for the single-receiver island model. This proves a sizeable advantage of all variants of the (μ+1) GA compared to the (1+1) EA, which requires Θ(n^k). In a short empirical study, we confirm that the asymptotic differences can also be observed experimentally.
Genetic and Evolutionary Computation Conference | 2016
Tobias Friedrich; Timo Kötzing; Martin S. Krejca
Parallel Problem Solving from Nature | 2016
Duc-Cuong Dang; Tobias Friedrich; Timo Kötzing; Martin S. Krejca; Per Kristian Lehre; Pietro Simone Oliveto; Dirk Sudholt; Andrew M. Sutton
Evolutionary Computation | 2016
Tobias Friedrich; Timo Kötzing; Martin S. Krejca; Andrew M. Sutton
Estimation of Distribution Algorithms (EDAs) work by iteratively updating a distribution over the search space with the help of samples from each iteration. Up to now, theoretical analyses of EDAs have been scarce, presenting run time results only for specific EDAs. We propose a new framework for EDAs that captures the idea of several known optimizers, including PBIL, UMDA, λ-MMAS_IB, the cGA, and the (1, λ)-EA. Our focus is on analyzing two core features of EDAs: a balanced EDA is sensitive to signals in the fitness; a stable EDA remains uncommitted under a biasless fitness function. We prove that no EDA can be both balanced and stable. The LeadingOnes function is a prime example where, at the beginning of the optimization, the fitness function shows no bias for many bits. Since many well-known EDAs are balanced and thus not stable, they are not well suited to optimize LeadingOnes. We give a stable EDA that optimizes LeadingOnes within a time of O(n log n).
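To make the "biasless" part concrete: under LeadingOnes, every bit after the first 0-bit has no influence on the fitness, so a stable EDA should keep those frequencies near 1/2 instead of drifting. A minimal sketch of the function (the standard definition, not code from the paper):

```python
def leading_ones(x):
    """LeadingOnes: the length of the longest all-1s prefix. Bits after the
    first 0 never affect the value, so early in a run most positions carry
    no fitness signal; a balanced (hence unstable) EDA's frequencies at
    those positions perform a random walk toward the borders."""
    count = 0
    for bit in x:
        if bit == 0:
            break
        count += 1
    return count

assert leading_ones([1, 1, 0, 1, 1]) == 2
```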
IEEE Transactions on Evolutionary Computation | 2018
Duc-Cuong Dang; Tobias Friedrich; Timo Kötzing; Martin S. Krejca; Per Kristian Lehre; Pietro Simone Oliveto; Dirk Sudholt; Andrew M. Sutton
Population diversity is essential for avoiding premature convergence in Genetic Algorithms (GAs) and for the effective use of crossover. Yet the dynamics of how diversity emerges in populations are not well understood. We use rigorous runtime analysis to gain insight into population dynamics and GA performance for a standard (μ+1) GA and the Jump_k test function. By studying the stochastic process underlying the size of the largest collection of identical genotypes, we show that the interplay of crossover followed by mutation may serve as a catalyst leading to a sudden burst of diversity. This leads to improvements of the expected optimisation time of order Ω(n / log n) compared to mutation-only algorithms like the (1+1) EA.
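For reference, the Jump_k test function used above follows the standard definition of Droste, Jansen, and Wegener; the Python sketch below is illustrative and also includes the quantity whose dynamics the paper studies, the size of the largest collection of identical genotypes.

```python
from collections import Counter

def jump_k(x, k):
    """Jump_k: k + |x|_1 if the number of 1s is at most n - k or exactly n;
    on the 'gap' just below the optimum the fitness drops to n - |x|_1."""
    n, ones = len(x), sum(x)
    if ones <= n - k or ones == n:
        return k + ones
    return n - ones

def largest_identical_cluster(population):
    """Size of the largest collection of identical genotypes in a population."""
    return max(Counter(tuple(x) for x in population).values())
```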
Genetic and Evolutionary Computation Conference | 2015
Tobias Friedrich; Timo Kötzing; Martin S. Krejca; Andrew M. Sutton
Recently, ant colony optimization (ACO) algorithms have proven to be efficient in uncertain environments, such as noisy or dynamically changing fitness functions. Most of these analyses have focused on combinatorial problems such as path finding. We rigorously analyze an ACO algorithm optimizing linear pseudo-Boolean functions under additive posterior noise. We study noise distributions whose tails decay exponentially fast, including the classical case of additive Gaussian noise. Without noise, the classical (μ+1) EA outperforms any ACO algorithm, with smaller evaporation factors being better; however, in the case of large noise, the (μ+1) EA fails, even for high values of μ, which are known to help against small noise. In this article, we show that ACO is able to deal with arbitrarily large noise in a graceful manner; that is, as long as the evaporation factor is small enough, dependent on the variance of the noise and the dimension n of the search space, optimization will be successful. We also briefly consider the case of prior noise and prove that ACO can also efficiently optimize linear functions under this noise model.
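The core mechanism behind this robustness is the pheromone update: with a small evaporation factor ρ, each pheromone value moves only slightly toward the current best-so-far solution, averaging out the noise over many iterations. A minimal sketch in the style of MMAS on pseudo-Boolean functions, not the paper's exact algorithm:

```python
def pheromone_update(tau, best, rho, n):
    """Shift each pheromone value toward the best-so-far bit string at rate
    rho, clamped to [1/n, 1 - 1/n]. Smaller rho means slower learning but
    stronger averaging of the noisy fitness signal."""
    return [min(1 - 1/n, max(1/n, (1 - rho) * t + rho * b))
            for t, b in zip(tau, best)]

# Example: one update step toward an all-ones best-so-far solution.
n = 8
tau = [0.5] * n
print(pheromone_update(tau, [1] * n, rho=0.1, n=n))
```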
Genetic and Evolutionary Computation Conference | 2018
Benjamin Doerr; Martin S. Krejca