Network


Latest external collaborations at the country level.

Hotspot


Research topics where Christine Zarges is active.

Publication


Featured research published by Christine Zarges.


Foundations of Genetic Algorithms | 2009

On the utility of the population size for inversely fitness proportional mutation rates

Christine Zarges

Artificial Immune Systems (AIS) are an emerging field of research in Computational Intelligence used in many areas of application, e.g., optimization, anomaly detection and classification. For optimization tasks, hypermutation operators are usually used. In this paper, we show that the use of populations can be essential for the utility of such operators by analyzing the runtime of a simple population-based immune-inspired algorithm on a classical example problem. The runtime bounds we prove are tight for the problem at hand. Moreover, we derive some general characteristics of the considered mutation operator as well as properties of the population, which hold for a class of pseudo-Boolean functions.
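
A minimal, purely illustrative sketch (in Python) of an inversely fitness proportional hypermutation embedded in a simple population-based scheme on OneMax; the concrete rate function, population size and replacement rule are assumptions for illustration, not the algorithm analysed in the paper:

    import random

    n = 50           # bit-string length (assumed for illustration)
    mu = 10          # population size (assumed for illustration)

    def onemax(x):
        return sum(x)

    def mutation_rate(fitness):
        # inversely fitness proportional: poor search points get a high rate,
        # good ones a low rate (this concrete formula is an assumption)
        return 0.5 * (1.0 - fitness / n)

    def hypermutate(x):
        p = max(mutation_rate(onemax(x)), 1.0 / n)   # never drop below 1/n
        return [1 - b if random.random() < p else b for b in x]

    population = [[random.randint(0, 1) for _ in range(n)] for _ in range(mu)]
    for _ in range(20000):
        parent = random.choice(population)
        child = hypermutate(parent)
        worst = min(range(mu), key=lambda i: onemax(population[i]))
        if onemax(child) >= onemax(population[worst]):
            population[worst] = child        # replace the current worst individual
        if any(onemax(x) == n for x in population):
            break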


Theoretical Computer Science | 2011

Analyzing different variants of immune inspired somatic contiguous hypermutations

Thomas Jansen; Christine Zarges

Artificial immune systems can be applied to a variety of very different tasks, including function optimization. There are even artificial immune systems tailored specifically for this task. In spite of their successful application, there is little knowledge and hardly any theoretical investigation about how and why they perform well. Here, rigorous analyses are presented for a specific class of mutation operators introduced for function optimization, called somatic contiguous hypermutations. Different concrete instantiations of this operator are considered and shown to behave quite differently in general. While there are serious limitations to the performance of this type of operator even for simple optimization tasks, it is proven that for some types of optimization problems it performs much better than the standard bit mutations most often used in evolutionary algorithms.
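
As a rough illustration of the kind of operator analysed here, the following minimal sketch shows one possible instantiation of somatic contiguous hypermutation on bit strings: pick a random start position and a random block length and flip the whole contiguous (wrapping) block. The variants considered in the paper differ in exactly such design choices, so this is only one of several possibilities:

    import random

    def contiguous_hypermutation(x):
        # flip a contiguous, possibly wrapping block of bits of random length
        n = len(x)
        start = random.randrange(n)       # random start position
        length = random.randint(0, n)     # random block length (0..n)
        y = list(x)
        for i in range(length):
            pos = (start + i) % n         # wrap around the end of the bit string
            y[pos] = 1 - y[pos]
        return y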


Parallel Problem Solving from Nature | 2008

Rigorous Runtime Analysis of Inversely Fitness Proportional Mutation Rates

Christine Zarges

Artificial Immune Systems (AIS) are an emerging field of research in Computational Intelligence applied in many areas, e.g., optimization, anomaly detection and classification. For optimization tasks, the use of hypermutation operators constitutes a common concept in AIS. So far, only little theoretical work has been done in this field. In this paper, we present a detailed theoretical runtime analysis that gives an insight into the dynamics of fitness-based hypermutation processes. Two specific mutation rates are considered using a simple immune-inspired algorithm. Our main focus is on the influence of parameters embedded in popular immune-inspired hypermutation operators from the literature. Our theoretical findings are accompanied by some empirical results.


International Conference on Artificial Immune Systems | 2011

On the analysis of the immune-inspired B-cell algorithm for the vertex cover problem

Thomas Jansen; Pietro Simone Oliveto; Christine Zarges

The runtime of the immune-inspired B-Cell Algorithm (BCA) for the NP-hard vertex cover problem is analysed. It is the first theoretical analysis of a nature-inspired heuristic as used in practical applications for a realistic problem. Since the performance of the BCA in combinatorial optimisation strongly depends on the representation, an encoding heuristic is used. The BCA outperforms mutation-based evolutionary algorithms (EAs) on instance classes that are known to be hard for randomised search heuristics (RSHs). With respect to average runtime, it even outperforms a crossover-based EA on an instance class previously used to show good performance of crossover. These results are achieved by the BCA without needing a population. This establishes contiguous somatic hypermutation as an alternative to crossover that does not require controlling population size and diversity. However, it is also proved that populations are necessary for the BCA to avoid arbitrarily bad worst-case approximation ratios.
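
The encoding heuristic used in the paper is not reproduced here; as a hypothetical illustration of why representation matters for vertex cover, the following sketch decodes a bit string into a feasible cover by greedily repairing uncovered edges. The repair rule is an assumption for illustration only, not the paper's heuristic:

    import random

    def decode_cover(bits, edges):
        # bits selects vertices; any uncovered edge is repaired greedily so that
        # every genotype decodes to a feasible vertex cover (assumed repair rule)
        cover = {v for v, b in enumerate(bits) if b == 1}
        for u, v in edges:
            if u not in cover and v not in cover:
                cover.add(random.choice((u, v)))
        return cover

    def cover_size(bits, edges):
        return len(decode_cover(bits, edges))   # smaller is better

    # usage on a small path graph 0-1-2-3
    edges = [(0, 1), (1, 2), (2, 3)]
    print(decode_cover([0, 1, 0, 0], edges))    # vertex 1 covers (0,1) and (1,2); (2,3) is repaired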


Evolutionary Computation | 2013

Mutation rate matters even when optimizing monotonic functions

Benjamin Doerr; Thomas Jansen; Dirk Sudholt; Carola Winzen; Christine Zarges

Extending previous analyses on function classes like linear functions, we analyze how the simple (1+1) evolutionary algorithm optimizes pseudo-Boolean functions that are strictly monotonic. These functions have the property that whenever only 0-bits are changed to 1, the objective value strictly increases. Contrary to what one would expect, not all of these functions are easy to optimize. The choice of the constant c in the mutation probability p(n) = c/n can make a decisive difference. We show that if c < 1, then the (1+1) EA finds the optimum of every such function in Θ(n log n) iterations. For c = 1, we can still prove an upper bound of O(n^{3/2}). However, for sufficiently large constant c, we present a strictly monotonic function such that the (1+1) EA with overwhelming probability needs 2^{Ω(n)} iterations to find the optimum. This is the first time that we observe that a constant factor change of the mutation probability changes the runtime by more than a constant factor.
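
For reference, a minimal sketch of the (1+1) EA with mutation probability p(n) = c/n, run here on OneMax purely for illustration; the hard strictly monotonic functions constructed in the paper are considerably more involved:

    import random

    def one_plus_one_ea(f, n, c, budget):
        x = [random.randint(0, 1) for _ in range(n)]
        for _ in range(budget):
            # standard bit mutation: flip each bit independently with probability c/n
            y = [1 - b if random.random() < c / n else b for b in x]
            if f(y) >= f(x):        # accept the offspring if it is not worse
                x = y
        return x

    onemax = sum
    best = one_plus_one_ea(onemax, n=100, c=0.9, budget=20000)   # c < 1: the provably easy regime
    print(onemax(best))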


Genetic and Evolutionary Computation Conference | 2012

Fixed budget computations: a different perspective on run time analysis

Thomas Jansen; Christine Zarges

Randomised search heuristics are used in practice to solve difficult problems where no good problem-specific algorithm is known. They deliver a solution of acceptable quality in reasonable time in many cases. When theoretically analysing the performance of randomised search heuristics, one usually considers the average time needed to find an optimal solution or one of a pre-specified approximation quality. This is very different from practice, where the algorithm is usually stopped after some time. For a theoretical analysis, this corresponds to investigating the quality of the solution obtained after a pre-specified number of function evaluations, called the budget. Such a perspective is taken here and two simple randomised search heuristics, random local search and the (1+1) evolutionary algorithm, are analysed on simple and well-known example functions. If the budget is significantly smaller than the expected time needed for optimisation, the behaviour of the algorithms can be very different depending on the problem at hand. Precise analytical results are proven. They demonstrate novel and interesting challenges in the analysis of randomised search heuristics. The potential of this different perspective to provide a more practically useful theory is shown.
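
A minimal sketch of the fixed budget perspective, shown here for random local search (RLS) on OneMax: the algorithm is stopped after exactly a pre-specified number of function evaluations and the quality reached is reported. The concrete setup is an assumption for illustration; the (1+1) EA would differ only in its mutation step:

    import random

    def rls_fixed_budget(f, n, budget):
        x = [random.randint(0, 1) for _ in range(n)]
        best = f(x)                            # the initial evaluation counts towards the budget
        for _ in range(budget - 1):
            y = list(x)
            i = random.randrange(n)
            y[i] = 1 - y[i]                    # RLS: flip exactly one uniformly chosen bit
            if f(y) >= best:
                x, best = y, f(y)
        return best                            # solution quality after exactly `budget` evaluations

    print(rls_fixed_budget(sum, n=100, budget=500))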


Foundations of Genetic Algorithms | 2011

Analysis of evolutionary algorithms: from computational complexity analysis to algorithm engineering

Thomas Jansen; Christine Zarges

Analyzing the computational complexity of evolutionary algorithms has become an accepted and important branch of evolutionary computation theory. This is usually done by analyzing the (expected) optimization time, measured by means of the number of function evaluations, and describing its growth as a function of a measure for the size of the search space. Most often, asymptotic results describing only the order of growth are derived. This corresponds to classical analysis of (randomized) algorithms in algorithmics. Recently, the emerging field of algorithm engineering has demonstrated that for practical purposes this analysis can be too coarse and that more details of the algorithm and its implementation have to be taken into account in order to obtain results that are valid in practice. Using a very recent analysis of a simple evolutionary algorithm as a starting point, it is shown that the same holds for evolutionary algorithms. Considering this example, it is demonstrated that counting function evaluations more precisely can lead to results contradicting actual run times. Motivated by these limitations of computational complexity analysis, an algorithm engineering-like approach is presented.


International Symposium on Algorithms and Computation | 2010

Analysis of an Iterated Local Search Algorithm for Vertex Coloring

Dirk Sudholt; Christine Zarges

Hybridizations of evolutionary algorithms and local search are among the best-performing algorithms for vertex coloring. However, the theoretical knowledge about these algorithms is very limited and it is agreed that a solid theoretical foundation is needed. We consider an iterated local search algorithm that iteratively tries to improve a coloring by applying mutation followed by local search. We investigate the capabilities and the limitations of this approach using bounds on the expected number of iterations until an optimal or near-optimal coloring is found. This is done for two different mutation operators and for different graph classes: bipartite graphs, sparse random graphs, and planar graphs.
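
A minimal sketch of an iterated local search for vertex coloring with a fixed number of colors k, minimising the number of monochromatic edges. The mutation (recolor one random vertex) and the greedy local search used here are placeholders; the operators analysed in the paper may differ:

    import random

    def conflicts(coloring, edges):
        # number of monochromatic (conflicting) edges
        return sum(1 for u, v in edges if coloring[u] == coloring[v])

    def local_search(coloring, edges, k):
        # greedily recolor single vertices as long as this reduces conflicts
        coloring = list(coloring)
        improved = True
        while improved:
            improved = False
            for v in range(len(coloring)):
                current = conflicts(coloring, edges)
                for c in range(k):
                    candidate = coloring[:v] + [c] + coloring[v + 1:]
                    if conflicts(candidate, edges) < current:
                        coloring, current = candidate, conflicts(candidate, edges)
                        improved = True
        return coloring

    def ils_coloring(edges, n, k, iterations):
        coloring = local_search([random.randrange(k) for _ in range(n)], edges, k)
        for _ in range(iterations):
            mutated = list(coloring)
            mutated[random.randrange(n)] = random.randrange(k)   # mutation: recolor one random vertex
            mutated = local_search(mutated, edges, k)            # followed by local search
            if conflicts(mutated, edges) <= conflicts(coloring, edges):
                coloring = mutated
        return coloring

    # usage on a small bipartite graph (a 4-cycle), which is 2-colorable
    print(ils_coloring([(0, 1), (1, 2), (2, 3), (3, 0)], n=4, k=2, iterations=20))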


Genetic and Evolutionary Computation Conference | 2009

Maximal age in randomized search heuristics with aging

Christian Horoba; Thomas Jansen; Christine Zarges

The concept of aging has been introduced and applied in many different variants of randomized search heuristics. The most important parameter is the maximal age of search points. Considering static pure aging, known from artificial immune systems, in the context of simple evolutionary algorithms, it is demonstrated that the choice of this parameter is both crucial for performance and difficult to set appropriately. The results are derived in a rigorous fashion and given as theorems with formal proofs. An additional contribution is the presentation of a general method to combine fitness functions into a function with stronger properties than its components. By application of this method, we combine a function where the maximal age needs to be sufficiently large with a function where the maximal age needs to be sufficiently small. This yields a function where an appropriate maximal age lies within a very narrow range.
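
A minimal sketch of static pure aging embedded in a simple population-based heuristic: every search point carries an age, offspring inherit their parent's age unless they strictly improve on it, and points older than the maximal age tau are removed and replaced. The selection and replacement details below are assumptions for illustration, not the exact algorithm analysed in the paper:

    import random

    def onemax(x):
        return sum(x)

    def evolve_with_aging(f, n, mu, tau, generations):
        # population of (search point, age) pairs
        population = [([random.randint(0, 1) for _ in range(n)], 0) for _ in range(mu)]
        for _ in range(generations):
            population = [(x, age + 1) for x, age in population]        # everybody ages by one
            parent, parent_age = random.choice(population)
            child = [1 - b if random.random() < 1.0 / n else b for b in parent]
            child_age = 0 if f(child) > f(parent) else parent_age       # pure aging: reset age only on strict improvement
            population.append((child, child_age))
            population = [(x, a) for x, a in population if a <= tau]    # remove points exceeding the maximal age tau
            while len(population) < mu:                                 # refill with fresh random points
                population.append(([random.randint(0, 1) for _ in range(n)], 0))
            population.sort(key=lambda xa: f(xa[0]), reverse=True)      # keep the mu best (assumed replacement rule)
            population = population[:mu]
        return max(population, key=lambda xa: f(xa[0]))[0]

    best = evolve_with_aging(onemax, n=50, mu=5, tau=200, generations=5000)
    print(onemax(best))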


Parallel Problem Solving from Nature | 2010

Optimizing monotone functions can be difficult

Benjamin Doerr; Thomas Jansen; Dirk Sudholt; Carola Winzen; Christine Zarges

Extending previous analyses on function classes like linear functions, we analyze how the simple (1+1) evolutionary algorithm optimizes pseudo-Boolean functions that are strictly monotone. Contrary to what one would expect, not all of these functions are easy to optimize. The choice of the constant c in the mutation probability p(n) = c/n can make a decisive difference. We show that if c < 1, then the (1+1) EA finds the optimum of every such function in Θ(n log n) iterations. For c = 1, we can still prove an upper bound of O(n^{3/2}). However, for c > 33, we present a strictly monotone function such that the (1+1) EA with overwhelming probability does not find the optimum within 2^{Ω(n)} iterations. This is the first time that we observe that a constant factor change of the mutation probability changes the runtime by more than a constant factor.

Collaboration


Dive into Christine Zarges's collaborations.

Top Co-Authors

Dirk Sudholt, University of Sheffield
Christian Horoba, Technical University of Dortmund
Matthias Hebbel, Technical University of Dortmund
Melanie Schmidt, Technical University of Dortmund
Thorsten Kerkhof, Technical University of Dortmund
Walter Nistico, Technical University of Dortmund