The Distributed Genetic Algorithm Revisited
Abstract
This paper extends previous work by Tanese on the distributed genetic algorithm (DGA). Tanese found that the DGA outperformed the canonical serial genetic algorithm (CGA) on a class of difficult, randomly generated Walsh polynomials. This left open the question of whether the DGA would have similar success on functions that were more amenable to optimization by the CGA. In this work, experiments were done to compare the DGA's performance on the Royal Road class of fitness functions to that of the CGA. Besides achieving superlinear speedup on KSR parallel computers, the DGA again outperformed the CGA on the functions R3 and R4 with respect to best fitness, average fitness, and the number of times the optimum was reached. Its performance on R1 and R2 was comparable to that of the CGA. The effect of varying the DGA's migration parameters was also investigated. The results of the experiments are presented and discussed, and suggestions for future research are made.