David J. J. Toal
University of Southampton
Publications
Featured research published by David J. J. Toal.
AIAA Journal | 2008
David J. J. Toal; Neil W. Bressloff; Andy J. Keane
Response surfaces have been extensively used as a method of building effective surrogate models of high-fidelity computational simulations. Of the numerous types of response surface models, kriging is perhaps one of the most effective, due to its ability to model complicated responses through interpolation or regression of known data while providing an estimate of the error in its prediction. There is, however, little information indicating the extent to which the hyperparameters of a kriging model need to be tuned for the resulting surrogate model to be effective. The following paper addresses this issue by investigating how often and how well it is necessary to tune the hyperparameters of a kriging model as it is updated during an optimization process. To this end, an optimization benchmarking procedure is introduced and used to assess the performance of five different tuning strategies over a range of problem sizes. The results of this benchmark demonstrate the performance gains that can be associated with reducing the complexity of the hyperparameter tuning process for complicated design problems. The strategy of tuning hyperparameters only once after the initial design of experiments is shown to perform poorly.
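The tuning problem this abstract studies can be made concrete with a toy sketch of the concentrated log-likelihood of an ordinary kriging model. Everything below (the squared-exponential correlation, synthetic data, and a crude random-search tuner) is an illustrative assumption, not the paper's benchmark procedure:

```python
import numpy as np

def neg_log_likelihood(log_theta, X, y):
    """Concentrated negative log-likelihood of an ordinary kriging model
    with a squared-exponential correlation (a common textbook choice)."""
    theta = np.exp(log_theta)
    n = len(y)
    d2 = (X[:, None, :] - X[None, :, :]) ** 2
    R = np.exp(-np.einsum('ijk,k->ij', d2, theta)) + 1e-8 * np.eye(n)
    L = np.linalg.cholesky(R)                  # the O(n^3) factorization
    a_y = np.linalg.solve(L, y)
    a_1 = np.linalg.solve(L, np.ones(n))
    mu = (a_1 @ a_y) / (a_1 @ a_1)             # generalised least-squares mean
    r = np.linalg.solve(L, y - mu)
    sigma2 = (r @ r) / n                       # concentrated process variance
    log_det_R = 2.0 * np.sum(np.log(np.diag(L)))
    return 0.5 * (n * np.log(sigma2) + log_det_R)

rng = np.random.default_rng(0)
X = rng.uniform(size=(20, 2))                  # design of experiments
y = np.sin(6 * X[:, 0]) + np.cos(4 * X[:, 1])  # sampled response

# One crude "tuning strategy": random search over the log hyperparameters.
best_nll, best_theta = np.inf, None
for log_theta in rng.uniform(-4, 4, size=(200, 2)):
    nll = neg_log_likelihood(log_theta, X, y)
    if nll < best_nll:
        best_nll, best_theta = nll, np.exp(log_theta)
```

Each likelihood evaluation costs a fresh Cholesky factorization, which is why the paper's question of how often, and how hard, to tune matters as the sample count grows.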
AIAA Journal | 2010
David J. J. Toal; Neil W. Bressloff; Andy J. Keane; Carren Holden
When carrying out design searches, traditional variable screening techniques can find it extremely difficult to distinguish between important and unimportant variables. This is particularly true when only a small number of simulations is combined with a parameterization which results in a large number of variables of seemingly equal importance. Here the authors present a variable reduction technique which employs proper orthogonal decomposition to filter out undesirable or badly performing geometries from an optimization process. Unlike traditional screening techniques, the presented method operates at the geometric level instead of the variable level. The filtering process uses the designs which result from a geometry parameterization instead of the variables which control the parameterization. The method is shown to perform well in the optimization of a two-dimensional airfoil for the minimization of drag-to-lift ratio, producing designs better than those resulting from traditional kriging-based surrogate model optimization and with a significant reduction in surrogate tuning cost.
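A hedged sketch of the geometric-filtering idea: build a POD basis from a set of candidate geometries, then project a new design onto the leading modes, discarding components that the retained modes cannot represent. The airfoil-like shapes, noise levels, and mode count below are invented for illustration and do not come from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical snapshot matrix: each column is one candidate geometry,
# e.g. the y-coordinates of an airfoil surface at fixed x-stations.
n_points, n_designs = 60, 15
x = np.linspace(0.0, 1.0, n_points)
snapshots = np.column_stack([
    0.1 * np.sqrt(x) * (1 - x) * (1 + 0.3 * rng.standard_normal())
    + 0.002 * rng.standard_normal(n_points)   # small shape perturbations
    for _ in range(n_designs)
])

mean_shape = snapshots.mean(axis=1, keepdims=True)
U, s, _ = np.linalg.svd(snapshots - mean_shape, full_matrices=False)
basis = U[:, :3]                    # retain the leading POD modes only

# Filter a new geometry: project onto the retained modes, which
# suppresses features the well-behaved snapshots cannot represent.
new_geom = snapshots[:, 0] + 0.01 * rng.standard_normal(n_points)
coeffs = basis.T @ (new_geom - mean_shape[:, 0])
filtered = mean_shape[:, 0] + basis @ coeffs
```

In the paper the basis is deliberately built to exclude badly performing geometries; here the projection simply illustrates the mechanics of filtering at the geometric rather than the variable level.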
Journal of Aircraft | 2011
David J. J. Toal; Andy J. Keane
Multipoint objective functions are often employed within aerodynamic optimizations to prevent a reduction in off-design performance. However, this typically results in the need for a significant number of simulations at a variety of design conditions to calculate the objective function. The following paper attempts to address this problem through the application of a multilevel cokriging model within the optimization process. A large number of single-point design simulations are augmented by a smaller number of multipoint simulations. The technique is shown to result in surrogate models as effective as those produced using a traditional multipoint process when optimizing a transonic airfoil, but with a reduction in the total number of simulations.
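The multilevel idea can be caricatured with the simplest possible correction model, y_hi(x) ≈ rho * y_lo(x) + delta(x): cheap single-point evaluations supply the trend, and a handful of expensive multipoint evaluations estimate the scaling and discrepancy. Cokriging replaces the least-squares fit below with correlated Gaussian processes; the functions and sample counts here are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(4)

def f_lo(x):
    """Hypothetical cheap single-point analysis."""
    return np.sin(8 * x)

def f_hi(x):
    """Hypothetical expensive multipoint analysis."""
    return 1.2 * np.sin(8 * x) + 0.3 * x

x_hi = np.sort(rng.uniform(0, 1, 6))      # few expensive runs

# Estimate the scaling rho and a linear discrepancy delta(x) = c0 + c1*x
# by least squares at the high-fidelity sites.
A = np.column_stack([f_lo(x_hi), np.ones_like(x_hi), x_hi])
(rho, c0, c1), *_ = np.linalg.lstsq(A, f_hi(x_hi), rcond=None)

x_test = np.linspace(0, 1, 50)
y_pred = rho * f_lo(x_test) + c0 + c1 * x_test
```

The point of the sketch is only the division of labour: the low-fidelity model carries the shape of the response, while the few high-fidelity points correct it.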
Engineering Optimization | 2011
David J. J. Toal; Neil W. Bressloff; Andy J. Keane; Carren Holden
Optimizations involving high-fidelity simulations can become prohibitively expensive when an exhaustive search is employed. To remove this expense a surrogate model is often constructed. One of the most popular techniques for the construction of such a surrogate model is that of kriging. However, the construction of a kriging model requires the optimization of a multi-modal likelihood function, the cost of which can approach that of the high-fidelity simulations upon which the model is based. The article describes the development of a hybridized particle swarm algorithm which aims to reduce the cost of this likelihood optimization by drawing on an efficient adjoint of the likelihood. This hybridized tuning strategy is compared to a number of other strategies with respect to the inverse design of an airfoil as well as the optimization of an airfoil for minimum drag at a fixed lift.
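To make the swarm half of the strategy concrete, here is a minimal global particle swarm minimising a stand-in objective. The hybridisation described in the paper adds adjoint-gradient local moves, which this sketch deliberately omits; the coefficients and test function are illustrative choices:

```python
import numpy as np

def sphere(x):
    """Stand-in for the multi-modal likelihood; any objective works here."""
    return np.sum(x ** 2, axis=-1)

def pso(f, n_particles=20, n_dims=2, iters=100, seed=0):
    """Minimal global-best particle swarm with inertia weight 0.7 and
    cognitive/social coefficients of 1.5 (common textbook settings)."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, n_dims))
    v = np.zeros_like(x)
    pbest, pval = x.copy(), f(x)
    gbest = pbest[np.argmin(pval)]
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, n_dims))
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
        x = x + v
        fx = f(x)
        better = fx < pval                 # update personal bests
        pbest[better], pval[better] = x[better], fx[better]
        gbest = pbest[np.argmin(pval)]     # update global best
    return gbest, pval.min()

best_x, best_f = pso(sphere)
```

In the hybrid of the paper, an adjoint-driven local search would periodically refine promising particles, combining the swarm's global exploration with cheap gradient information.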
Proceedings of the Royal Society of London A: Mathematical, Physical and Engineering Sciences | 2009
David J. J. Toal; Alexander I. J. Forrester; Neil W. Bressloff; Andy J. Keane; Carren Holden
The process of likelihood maximization can be found in many different areas of computational modelling. However, the construction of such models via likelihood maximization requires the solution of a difficult multi-modal optimization problem involving an expensive O(n³) factorization. The optimization techniques used to solve this problem may require many such factorizations and can result in a significant bottleneck. This article derives an adjoint formulation of the likelihood employed in the construction of a kriging model via reverse algorithmic differentiation. This adjoint is found to calculate the likelihood and all of its derivatives more efficiently than the standard analytical method and can therefore be used within a simple local search or within a hybrid global optimization to accelerate convergence and therefore reduce the cost of the likelihood optimization.
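The flavour of the derivative calculation can be shown on the log-determinant term of the likelihood, using the identity d log|R| / d theta_k = tr(R⁻¹ dR/dtheta_k); a reverse-mode (adjoint) formulation obtains all components for roughly the cost of one extra factorization. The correlation function and data below are illustrative, and the check is against finite differences rather than the paper's full adjoint:

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.uniform(size=(12, 2))
d2 = (X[:, None, :] - X[None, :, :]) ** 2   # pairwise squared distances
n = len(X)

def corr(theta):
    """Squared-exponential correlation matrix (no nugget)."""
    return np.exp(-np.einsum('ijk,k->ij', d2, theta))

def log_det(theta):
    return np.linalg.slogdet(corr(theta) + 1e-8 * np.eye(n))[1]

theta = np.array([1.0, 2.0])
R0 = corr(theta)
Rinv = np.linalg.inv(R0 + 1e-8 * np.eye(n))

# dR/dtheta_k = -d2[:, :, k] * R0 (elementwise), and for symmetric A, B
# we have tr(A @ B) = sum(A * B): one O(n^3) inverse serves every component.
grad = np.array([-np.sum(Rinv * (d2[:, :, k] * R0)) for k in range(2)])

# Central finite differences as a sanity check.
eps = 1e-6
fd = np.array([(log_det(theta + eps * np.eye(2)[k]) -
                log_det(theta - eps * np.eye(2)[k])) / (2 * eps)
               for k in range(2)])
```

The full likelihood adjoint in the paper handles every term of the concentrated likelihood this way, rather than only the log-determinant shown here.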
Engineering Optimization | 2012
David J. J. Toal; Andy J. Keane
Traditional surrogate modelling techniques, such as kriging, have been employed quite effectively within design optimizations. However, such models can fail to reproduce non-stationary responses accurately. This article explores the application of non-stationary kriging to design optimization and attempts to determine its applicability with regard to the optimization of both stationary and non-stationary objective functions. A series of analytical test problems and an engineering design problem are used to compare the performance of non-stationary and adaptive partial non-stationary kriging to traditional stationary kriging.
26th AIAA Applied Aerodynamics Conference | 2008
David J. J. Toal; Neil W. Bressloff; Andy J. Keane
When carrying out design searches, traditional variable screening techniques can find it extremely difficult to distinguish between important and unimportant variables. This is particularly true when only a small number of simulations is combined with a parameterization which results in a large number of variables of seemingly equal importance. Here the authors present a variable reduction technique which employs proper orthogonal decomposition to filter out undesirable or badly performing geometries from an optimization process. Unlike traditional screening techniques, the presented method operates at the geometric level instead of the variable level. The filtering process uses the designs which result from a geometry parameterization instead of the variables which control the parameterization. The method is shown to perform well in the optimization of a two-dimensional airfoil for the minimization of drag-to-lift ratio, producing designs better than those resulting from traditional kriging-based surrogate model optimization and with a significant reduction in surrogate tuning cost.
Journal of Propulsion and Power | 2014
David J. J. Toal; Andy J. Keane; Diego Benito; Jeffery A. Dixon; Jingbin Yang; Matthew Price; Trevor T. Robinson; Alain Remouchamps; Norbert Kill
Traditionally, the optimization of a turbomachinery engine casing for tip clearance has involved either two-dimensional transient thermomechanical simulations or three-dimensional mechanical simulations. This paper illustrates that three-dimensional transient whole-engine thermomechanical simulations can be used within tip clearance optimizations and that the efficiency of such optimizations can be improved when a multifidelity surrogate modeling approach is employed. These simulations are employed in conjunction with a rotor suboptimization using surrogate models of rotor-dynamics performance, stress, mass and transient displacements, and an engine parameterization.
ASME Turbo Expo 2014: Turbine Technical Conference and Exposition | 2014
David J. J. Toal
Traditional multi-fidelity surrogate models require that the output of the low-fidelity model be reasonably well correlated with that of the high-fidelity model, and they predict only scalar responses. The following paper explores the potential of a novel multi-fidelity surrogate modelling scheme employing Gappy Proper Orthogonal Decomposition (G-POD), which is demonstrated to accurately predict the response of the entire computational domain, thus improving optimization and uncertainty quantification performance over both traditional single- and multi-fidelity surrogate modelling schemes.
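A hedged sketch of the Gappy POD reconstruction step: given a basis built from complete snapshots, the coefficients of a field known only at a few points are found by least squares on the known entries, and the whole field follows. The snapshot functions, target field, and sampling mask below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
x = np.linspace(0.0, 1.0, 100)
# Hypothetical complete snapshots of a full-field response (standing in
# for cheap low-fidelity solutions), used to build the POD basis.
snapshots = np.column_stack([np.sin((k + 1) * np.pi * x) for k in range(4)])
U, _, _ = np.linalg.svd(snapshots, full_matrices=False)
basis = U[:, :4]

# A "gappy" field: the true response is sampled only at a few locations
# (standing in for sparse, expensive high-fidelity data).
true_field = 0.7 * np.sin(np.pi * x) - 0.2 * np.sin(3 * np.pi * x)
mask = rng.choice(100, size=20, replace=False)

# Least-squares fit of the POD coefficients using only the known entries,
# then reconstruction of the entire computational field.
coeffs, *_ = np.linalg.lstsq(basis[mask], true_field[mask], rcond=None)
reconstructed = basis @ coeffs
```

Because the target field here lies exactly in the span of the snapshot modes, the reconstruction is essentially exact; with real simulation data the basis truncation and sample placement govern the accuracy.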
Engineering With Computers | 2016
David J. J. Toal
The surrogate modelling technique known as Kriging, and its various derivatives, requires an optimization process to effectively determine the model’s defining parameters. This optimization typically involves the maximisation of a likelihood function which requires the construction and inversion of a correlation matrix dependent on the selected modelling parameters. The construction of such models in high dimensions and with a large number of sample points can, therefore, be considerably expensive. Similarly, once such a model has been constructed, the evaluation of the predictor, error and other related design and model improvement criteria can also be costly. The following paper investigates the potential for graphics processing units to be used to accelerate the evaluation of the Kriging likelihood, predictor and error functions. Five different Kriging formulations are considered, including ordinary, universal, non-stationary, gradient-enhanced and multi-fidelity Kriging. Other key contributions include the derivation of the adjoint of the likelihood function for a fully and partially gradient-enhanced Kriging model, as well as the presentation of novel schemes to accelerate the likelihood optimization via a mixture of single and double precision calculations and by automatically selecting the best hardware on which to perform the evaluations.