R. Lyndon While
University of Western Australia
Publications
Featured research published by R. Lyndon While.
IEEE Transactions on Evolutionary Computation | 2006
Simon Huband; Philip Hingston; Luigi Barone; R. Lyndon While
When attempting to better understand the strengths and weaknesses of an algorithm, it is important to have a strong understanding of the problem at hand. This is as true for the field of multiobjective evolutionary algorithms (EAs) as it is for any other. Many of the multiobjective test problems employed in the EA literature have not been rigorously analyzed, which makes it difficult to draw accurate conclusions about the strengths and weaknesses of the algorithms tested on them. In this paper, we systematically review and analyze many problems from the EA literature, each belonging to the important class of real-valued, unconstrained, multiobjective test problems. To support this, we first introduce a set of test problem criteria, which are in turn supported by a set of definitions. Our analysis of test problems highlights a number of areas requiring attention. Not only are many test problems poorly constructed, but the important class of nonseparable problems, particularly nonseparable multimodal problems, is also poorly represented. Motivated by these findings, we present a flexible toolkit for constructing well-designed test problems. We also present empirical results demonstrating how the toolkit can be used to test an optimizer in ways that existing test suites do not.
IEEE Transactions on Evolutionary Computation | 2006
R. Lyndon While; Philip Hingston; Luigi Barone; Simon Huband
We present an algorithm for calculating hypervolume exactly, the Hypervolume by Slicing Objectives (HSO) algorithm, that is faster than any that has previously been published. HSO processes objectives instead of points, an idea that has been considered before but that has never been properly evaluated in the literature. We show that both previously studied exact hypervolume algorithms are exponential in at least the number of objectives and that although HSO is also exponential in the number of objectives in the worst case, it runs in significantly less time, i.e., two to three orders of magnitude less for randomly generated and benchmark data in three to eight objectives. Thus, HSO increases the utility of hypervolume, both as a metric for general optimization algorithms and as a diversity mechanism for evolutionary algorithms.
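For illustration only, here is a minimal Python sketch of the objective-slicing idea described above. The naming and conventions are ours, not the paper's: minimisation is assumed, the reference point must be weakly dominated by every solution, and none of HSO's optimisations (such as discarding dominated points in each slice) are included. The front is cut into slabs along one objective, and each slab contributes its thickness times a lower-dimensional hypervolume.

    def hso_hypervolume(points, ref):
        """Exact hypervolume by slicing objectives (minimisation, illustrative sketch).
        `points`: list of equal-length objective vectors, each <= ref componentwise.
        e.g. hso_hypervolume([[1, 2], [2, 1]], [3, 3]) == 3.0"""
        if not points:
            return 0.0
        pts = sorted(points, key=lambda p: p[0])      # slice along the first objective
        if len(pts[0]) == 1:                          # base case: one objective left
            return ref[0] - pts[0][0]
        total = 0.0
        for i, p in enumerate(pts):
            upper = pts[i + 1][0] if i + 1 < len(pts) else ref[0]
            width = upper - p[0]                      # thickness of this slab
            if width > 0:
                # points with first objective <= p[0] cover the slab; recurse on
                # their projections onto the remaining objectives
                slab = [q[1:] for q in pts[: i + 1]]
                total += width * hso_hypervolume(slab, ref[1:])
        return total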
International Conference on Evolutionary Multi-Criterion Optimization | 2005
Simon Huband; Luigi Barone; R. Lyndon While; Philip Hingston
This paper presents a new toolkit for creating scalable multi-objective test problems. The WFG Toolkit is flexible, allowing characteristics such as bias, multi-modality, and non-separability to be incorporated and combined as desired. A wide variety of Pareto optimal geometries are also supported, including convex, concave, mixed convex/concave, linear, degenerate, and disconnected geometries. All problems created by the WFG Toolkit are well defined, are scalable with respect to both the number of objectives and the number of parameters, and have known Pareto optimal sets. Nine benchmark multi-objective problems are suggested, including one that is both multi-modal and non-separable, an important combination of characteristics that is lacking among existing (scalable) multi-objective problems.
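The toolkit builds problems by composing transformation functions (which introduce bias, multi-modality, non-separability, and so on) with shape functions that fix the Pareto-front geometry. The Python sketch below illustrates only that composition idea; it is a DTLZ2-style construction of our own, using a plain distance function and a concave (spherical) shape function rather than the actual WFG transformations, and the function names are assumptions.

    import numpy as np

    def concave_shape(theta):
        """Map M-1 position parameters in [0, 1] onto the unit sphere (concave front)."""
        angles = np.asarray(theta, dtype=float) * np.pi / 2
        M = len(angles) + 1
        f = np.empty(M)
        for m in range(M):
            f[m] = np.prod(np.cos(angles[: M - 1 - m]))   # empty product == 1
            if m > 0:
                f[m] *= np.sin(angles[M - 1 - m])
        return f

    def toy_scalable_problem(x, M=3):
        """M objectives, any number of variables >= M: the first M-1 variables set the
        position on the front, the remainder set the distance from it.
        e.g. toy_scalable_problem([0.5, 0.5, 0.5, 0.5], M=3) lies on the Pareto front."""
        x = np.asarray(x, dtype=float)
        position, distance = x[: M - 1], x[M - 1 :]
        g = np.sum((distance - 0.5) ** 2)              # distance function, zero on the front
        return (1.0 + g) * concave_shape(position)     # Pareto front is the unit sphere

The construction is scalable in both the number of objectives (via M) and the number of parameters (extra distance variables), which is the property the abstract emphasises; the real WFG problems add further transformations to control bias, modality, and separability.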
IEEE Transactions on Evolutionary Computation | 2012
R. Lyndon While; Lucas Bradstreet; Luigi Barone
We describe a new algorithm WFG for calculating hypervolume exactly. WFG is based on the recently-described observation that the exclusive hypervolume of a point p relative to a set S is equal to the difference between the inclusive hypervolume of p and the hypervolume of S with each point limited by the objective values in p. WFG applies this technique iteratively over a set to calculate its hypervolume. Experiments show that WFG is substantially faster (in five or more objectives) than all previously-described algorithms that calculate hypervolume exactly.
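The recurrence stated in the abstract translates almost directly into code. Below is a minimal Python sketch of that recurrence (our own naming, assuming maximisation relative to a reference point that every solution dominates); it is exponential in the worst case and omits the dominated-point filtering and other optimisations that make the real WFG implementation fast.

    def inclusive_hv(p, ref):
        """Volume of the box spanned by point p and the reference point (maximisation)."""
        v = 1.0
        for pi, ri in zip(p, ref):
            v *= pi - ri
        return v

    def limit(s, p):
        """Limit every point in s by the objective values of p."""
        return [tuple(min(qi, pi) for qi, pi in zip(q, p)) for q in s]

    def exclusive_hv(p, s, ref):
        """Hypervolume dominated by p but by no point of s."""
        return inclusive_hv(p, ref) - wfg_hv(limit(s, p), ref)

    def wfg_hv(s, ref):
        """Hypervolume of the set s, peeling off one point at a time.
        e.g. wfg_hv([(2, 1), (1, 2)], (0, 0)) == 3.0"""
        return sum(exclusive_hv(s[i], s[i + 1:], ref) for i in range(len(s)))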
IEEE Transactions on Evolutionary Computation | 2008
Lucas Bradstreet; R. Lyndon While; Luigi Barone
When hypervolume is used as part of the selection or archiving process in a multiobjective evolutionary algorithm, it is necessary to determine which solutions contribute the least hypervolume to a front. Little focus has been placed on algorithms that quickly determine these solutions, and there are no fast algorithms designed specifically for this purpose. We describe an algorithm, IHSO, that quickly determines a solution's contribution. Furthermore, we describe and analyze heuristics that reorder objectives to minimize the work required for IHSO to calculate a solution's contribution. Lastly, we describe and analyze search techniques that reduce the amount of work required for solutions other than the least contributing one. Combined, these techniques allow multiobjective evolutionary algorithms to calculate hypervolume inline on increasingly large and complex fronts in many objectives.
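As a baseline for what IHSO and these heuristics are accelerating, the least-contributing solution can be found naively by computing each point's exclusive hypervolume with any exact routine. A hedged Python sketch under that assumption (`hv(points, ref)` could be, for example, the WFG-style function sketched above; the function name is ours):

    def least_contributor(front, ref, hv):
        """Index of the point whose removal loses the least hypervolume.
        `hv(points, ref)` is any exact hypervolume routine."""
        front = list(front)
        total = hv(front, ref)
        # exclusive contribution of each point: HV(front) - HV(front without it)
        contributions = [total - hv(front[:i] + front[i + 1:], ref)
                         for i in range(len(front))]
        return contributions.index(min(contributions))

This baseline recomputes a full hypervolume once per point; the objective-reordering and best-first search techniques in the paper exist precisely to avoid most of that work.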
Congress on Evolutionary Computation | 2005
R. Lyndon While; Lucas Bradstreet; Luigi Barone; Philip Hingston
The fastest known algorithm for calculating the hypervolume of a set of solutions to a multi-objective optimization problem is the HSO algorithm (Hypervolume by Slicing Objectives). However, the performance of HSO for a given front varies considerably with the order in which it processes the objectives in that front. We present and evaluate two alternative heuristics that each attempt to identify a good order for processing the objectives of a given front. We show that both heuristics make a substantial difference to the performance of HSO for randomly generated and benchmark data in 5-9 objectives, and that they both enable HSO to reliably avoid the worst-case performance for those fronts. The enhanced HSO enables the use of hypervolume with larger populations in more objectives.
International Conference on Evolutionary Multi-Criterion Optimization | 2005
R. Lyndon While
We present a new analysis of the LebMeasure algorithm for calculating hypervolume. We prove that although it is polynomial in the number of points, LebMeasure is exponential in the number of objectives in the worst case, not polynomial as has been claimed previously. This result has important implications for anyone planning to use hypervolume, either as a metric to compare optimisation algorithms, or as part of a diversity mechanism in an evolutionary algorithm.
IEEE International Conference on Evolutionary Computation | 2006
Lucas Bradstreet; Luigi Barone; R. Lyndon While
When hypervolume is used as part of the selection or archiving process in a multi-objective evolutionary algorithm, the basic requirement is to choose a subset of the solutions in a non-dominated front such that the hypervolume of the subset is maximised. We describe and evaluate two algorithms to approximate this process: a greedy algorithm that assesses and eliminates solutions individually, and a local search algorithm that assesses entire subsets. We present empirical data which suggests that a hybrid approach is needed to get the best tradeoff between good results and computational cost.
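A hedged sketch of the greedy variant, using the same assumed `hv(points, ref)` interface as earlier: repeatedly discard the point whose removal sacrifices the least hypervolume until the subset reaches the target size. (A local-search variant would instead evaluate whole candidate subsets of the target size and accept swaps that increase hypervolume.)

    def greedy_reduce(front, k, ref, hv):
        """Greedily shrink `front` to k points, each time discarding the point
        whose removal loses the least hypervolume. `hv` is any exact routine."""
        front = list(front)
        while len(front) > k:
            total = hv(front, ref)
            losses = [total - hv(front[:i] + front[i + 1:], ref)
                      for i in range(len(front))]
            front.pop(losses.index(min(losses)))
        return front

Greedy elimination is cheap per step but myopic, while subset-level local search explores better combinations at a much higher cost, which is the trade-off motivating the hybrid approach suggested in the abstract.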
Congress on Evolutionary Computation | 2012
R. Lyndon While; Lucas Bradstreet
Hypervolume is increasingly being used inline in multi-objective evolutionary algorithms, either to promote diversity, in an archiving mechanism, or in the selection process. The usual requirement is to determine which point in a set contributes least to the hypervolume of the set, so that that point can be discarded. We describe a new exact algorithm, IWFG, for performing this calculation that combines two important features from other recent algorithms: the bounding trick from WFG, which accelerates calculations by generating many dominated points, and the best-first queuing mechanism from IHSO, which eliminates much of the calculation for most of the points in a set. Empirical results show that IWFG is significantly faster than IHSO on a wide range of experimental data in five or more objectives.
Congress on Evolutionary Computation | 2007
Lucas Bradstreet; R. Lyndon While; Luigi Barone
Several multi-objective evolutionary algorithms compare the hypervolumes of different sets of points during their operation, usually for selection or archiving purposes. The basic requirement is to choose a subset of a front such that the hypervolume of that subset is maximised. We describe and evaluate three new algorithms based on incremental calculations of hypervolume using the new Incremental Hypervolume by Slicing Objectives (IHSO) algorithm: two greedy algorithms that respectively add or remove one point at a time from a front, and a local search that assesses entire subsets. Empirical evidence shows that, using IHSO, the greedy algorithms are generally able to outperform the local search and perform substantially better than previously published algorithms.