Appl. Soft Comput. | 2019

Handling expensive multi-objective optimization problems with a cluster-based neighborhood regression model

 
 
 

Abstract


This paper addresses multi-objective optimization in scenarios where objective function evaluation is expensive, that is, expensive multi-objective optimization. We first propose a cluster-based neighborhood regression model, which incorporates linear regression to predict the descent direction and generate new potential offspring. Combining this model with the classical decomposition-based multi-objective optimization framework, we propose an efficient and effective algorithm for tackling computationally expensive multi-objective optimization problems. In contrast to the conventional approach of replacing the original time-consuming objective functions with approximations obtained from a surrogate model, the proposed algorithm uses the regression model as an operator that produces higher-quality offspring, so that fewer iterations are needed to reach a given solution quality. The proposed algorithm is compared with several state-of-the-art surrogate-assisted algorithms on a variety of well-known benchmark problems. Empirical results demonstrate that the proposed algorithm outperforms or is competitive with the peer algorithms, and maintains a good trade-off between solution quality and running time within a fairly small number of function evaluations. In particular, the proposed algorithm shows a clear advantage in the computational time spent on its algorithmic components, and can obtain acceptable solutions for expensive problems with high efficiency.
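The abstract does not spell out the operator in detail, but its core idea, clustering the current population and using a per-cluster linear fit of a scalarized objective to estimate a descent direction along which offspring are generated, can be sketched as below. This is a minimal illustrative sketch, assuming k-means clustering, a least-squares linear fit, and a fixed step size; the function and parameter names (regression_offspring, step, n_clusters) are hypothetical and not taken from the paper. In a decomposition-based framework, f would typically hold each solution's aggregated (e.g., Tchebycheff) value under the current subproblem's weight vector.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

def regression_offspring(X, f, n_clusters=5, step=0.1, rng=None):
    """Illustrative cluster-based regression operator (not the authors' exact method).

    X   : (n, d) array of decision vectors
    f   : (n,) array of scalarized objective values (smaller is better)
    """
    rng = np.random.default_rng(rng)
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(X)
    offspring = []
    for k in range(n_clusters):
        members = X[labels == k]
        values = f[labels == k]
        if len(members) < 2:                 # too few points to fit a linear model
            continue
        model = LinearRegression().fit(members, values)
        direction = -model.coef_             # negative gradient of the linear fit
        norm = np.linalg.norm(direction)
        if norm == 0:
            continue
        direction /= norm
        # Step each cluster member along the predicted descent direction,
        # with small Gaussian noise to keep the offspring diverse.
        offspring.append(members + step * direction
                         + 1e-3 * rng.standard_normal(members.shape))
    return np.vstack(offspring) if offspring else X.copy()
```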

Volume 80
Pages 211-225
DOI 10.1016/j.asoc.2019.03.049
Language English
Journal Appl. Soft Comput.
