Rommel G. Regis
Saint Joseph's University
Publications
Featured research published by Rommel G. Regis.
Journal of Global Optimization | 2005
Rommel G. Regis; Christine A. Shoemaker
We present a new strategy for the constrained global optimization of expensive black box functions using response surface models. A response surface model is simply a multivariate approximation of a continuous black box function which is used as a surrogate model for optimization in situations where function evaluations are computationally expensive. Prior global optimization methods that utilize response surface models were limited to box-constrained problems, but the new method can easily incorporate general nonlinear constraints. In the proposed method, which we refer to as the Constrained Optimization using Response Surfaces (CORS) Method, the next point for costly function evaluation is chosen to be the one that minimizes the current response surface model subject to the given constraints and to additional constraints requiring the point to be a certain distance away from previously evaluated points. The distance requirement is allowed to cycle, starting from a high value (global search) and ending with a low value (local search). The purpose of the distance constraint is to drive the method towards unexplored regions of the domain and to prevent premature convergence to a point that may not even be a local minimizer of the black box function. The new method can be shown to converge to the global minimizer of any continuous function on a compact set regardless of the response surface model that is used. Finally, we considered two particular implementations of the CORS method that utilize a radial basis function model (CORS-RBF) and applied them to the box-constrained Dixon–Szegö test functions and to a simple nonlinearly constrained test function. The results indicate that the CORS-RBF algorithms are competitive with existing global optimization algorithms for costly functions on the box-constrained test problems.
The results also show that the CORS-RBF algorithms are better than other algorithms for constrained global optimization on the nonlinearly constrained test problem.
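The CORS selection step admits a compact sketch: minimize the surrogate over points that keep a cycling minimum distance from all previously evaluated points. The version below is a hedged, hypothetical illustration, not the paper's implementation: it replaces the constrained surrogate subproblem with a random-candidate search, and the function name `cors_next_point` and all parameters are assumptions.

```python
import numpy as np

def cors_next_point(surrogate, evaluated_pts, lb, ub, beta, n_cand=2000, rng=None):
    """Pick the candidate that minimizes the surrogate model subject to being
    at least a fraction beta of the box diagonal away from all previously
    evaluated points. beta would cycle from high (global search) to low
    (local search) across iterations. All names/parameters are illustrative."""
    rng = np.random.default_rng(rng)
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    d = len(lb)
    dist_req = beta * np.linalg.norm(ub - lb)      # cycling distance requirement
    cand = rng.uniform(lb, ub, size=(n_cand, d))
    # Distance from each candidate to its nearest evaluated point.
    dists = np.min(np.linalg.norm(cand[:, None, :] - evaluated_pts[None, :, :], axis=2), axis=1)
    feasible = cand[dists >= dist_req]
    if feasible.size == 0:        # requirement too strict for this random sample
        feasible = cand
    vals = surrogate(feasible)
    return feasible[np.argmin(vals)]
```

With beta near zero the rule reduces to pure surrogate minimization (local search); with beta large it forces exploration far from evaluated points.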
Informs Journal on Computing | 2007
Rommel G. Regis; Christine A. Shoemaker
We introduce a new framework for the global optimization of computationally expensive multimodal functions when derivatives are unavailable. The proposed Stochastic Response Surface (SRS) Method iteratively utilizes a response surface model to approximate the expensive function and identifies a promising point for function evaluation from a set of randomly generated points, called candidate points. Assuming some mild technical conditions, SRS converges to the global minimum in a probabilistic sense. We also propose Metric SRS (MSRS), which is a special case of SRS where the function evaluation point in each iteration is chosen to be the best candidate point according to two criteria: the estimated function value obtained from the response surface model, and the minimum distance from previously evaluated points. We develop a global optimization version and a multistart local optimization version of MSRS. In the numerical experiments, we used a radial basis function (RBF) model for MSRS and the resulting algorithms, Global MSRBF and Multistart Local MSRBF, were compared to 6 alternative global optimization methods, including a multistart derivative-based local optimization method. Multiple trials of all algorithms were compared on 17 multimodal test problems and on a 12-dimensional groundwater bioremediation application involving partial differential equations. The results indicate that Multistart Local MSRBF is the best on most of the higher dimensional problems, including the groundwater problem. It is also at least as good as the other algorithms on most of the lower dimensional problems. Global MSRBF is competitive with the other alternatives on most of the lower dimensional test problems and also on the groundwater problem. These results suggest that MSRBF is a promising approach for the global optimization of expensive functions.
SIAM Journal on Scientific Computing | 2008
Stefan M. Wild; Rommel G. Regis; Christine A. Shoemaker
We present a new derivative-free algorithm, ORBIT, for unconstrained local optimization of computationally expensive functions. A trust-region framework using interpolating Radial Basis Function (RBF) models is employed. The RBF models considered often allow ORBIT to interpolate nonlinear functions using fewer function evaluations than the polynomial models considered by present techniques. Approximation guarantees are obtained by ensuring that a subset of the interpolation points is sufficiently poised for linear interpolation. The RBF property of conditional positive definiteness yields a natural method for adding additional points. We present numerical results on test problems to motivate the use of ORBIT when only a relatively small number of expensive function evaluations are available. Results on two very different application problems, calibration of a watershed model and optimization of a PDE-based bioremediation plan, are also encouraging and support ORBIT's effectiveness on black-box functions for which no special mathematical structure is known or available.
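The kind of interpolating model ORBIT relies on can be sketched as a cubic RBF with a linear polynomial tail. The augmented linear system below is the standard construction for conditionally positive definite kernels; the function `fit_cubic_rbf` and its details are an illustrative assumption, not ORBIT's code.

```python
import numpy as np

def fit_cubic_rbf(X, f):
    """Fit an interpolant s(x) = sum_i lam_i ||x - x_i||^3 + c^T x + b.
    By conditional positive definiteness of the cubic kernel, the augmented
    system is nonsingular whenever the points include d + 1 affinely
    independent points (i.e., are poised for linear interpolation)."""
    n, d = X.shape
    r = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    Phi = r ** 3                                  # cubic RBF block
    P = np.hstack([X, np.ones((n, 1))])           # linear tail basis [x, 1]
    A = np.block([[Phi, P], [P.T, np.zeros((d + 1, d + 1))]])
    rhs = np.concatenate([f, np.zeros(d + 1)])
    coef = np.linalg.solve(A, rhs)
    lam, tail = coef[:n], coef[n:]

    def s(x):
        x = np.asarray(x, dtype=float)
        return lam @ np.linalg.norm(X - x, axis=1) ** 3 + tail[:d] @ x + tail[d]
    return s
```

The model interpolates the data exactly, which is what allows a trust-region method to treat it as a locally trustworthy stand-in for the expensive function.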
IEEE Transactions on Evolutionary Computation | 2004
Rommel G. Regis; Christine A. Shoemaker
We develop an approach for the optimization of continuous costly functions that uses a space-filling experimental design and local function approximation to reduce the number of function evaluations in an evolutionary algorithm. Our approach is to estimate the objective function value of an offspring by fitting a function approximation model over the k nearest previously evaluated points, where k=(d+1)(d+2)/2 and d is the dimension of the problem. The estimated function values are used to screen offspring to identify the most promising ones for function evaluation. To fit function approximation models, a symmetric Latin hypercube design (SLHD) is used to determine initial points for function evaluation. We compared the performance of an evolution strategy (ES) with local quadratic approximation, an ES with local cubic radial basis function (RBF) interpolation, an ES whose initial parent population comes from an SLHD, and a conventional ES. These algorithms were applied to a twelve-dimensional (12-D) groundwater bioremediation problem involving a complex nonlinear finite-element simulation model. The performances of these algorithms were also compared on the Dixon–Szegö test functions and on the ten-dimensional (10-D) Rastrigin and Ackley test functions. All comparisons involve analysis of variance (ANOVA) and the computation of simultaneous confidence intervals. The results indicate that ES algorithms with local approximation were significantly better than conventional ES algorithms and ES algorithms initialized by SLHDs on all Dixon–Szegö test functions except for Goldstein-Price. However, for the more difficult 10-D and 12-D functions, only the cubic RBF approach was successful in improving the performance of an ES. Moreover, the results also suggest that the cubic RBF approach is superior to the quadratic approximation approach on all test functions and the difference in performance is statistically significant for all test functions with dimension d ≥ 4.
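The screening step, fitting a local model over the k = (d+1)(d+2)/2 nearest evaluated points and keeping only the most promising offspring for true evaluation, can be sketched as follows. This is a hypothetical illustration using a least-squares full quadratic model; the function name and details are assumptions, not the paper's code.

```python
import numpy as np

def screen_offspring(offspring, X_eval, f_eval, n_keep):
    """Estimate each offspring's objective with a local quadratic model fit to
    its k = (d+1)(d+2)/2 nearest evaluated points, then keep the n_keep
    offspring with the lowest estimates for true (expensive) evaluation."""
    d = X_eval.shape[1]
    k = (d + 1) * (d + 2) // 2   # coefficients in a full quadratic in d dims

    def quad_features(P):
        cols = [np.ones(len(P))]
        cols += [P[:, i] for i in range(d)]
        cols += [P[:, i] * P[:, j] for i in range(d) for j in range(i, d)]
        return np.column_stack(cols)

    est = []
    for x in offspring:
        idx = np.argsort(np.linalg.norm(X_eval - x, axis=1))[:k]
        Z = quad_features(X_eval[idx])
        beta, *_ = np.linalg.lstsq(Z, f_eval[idx], rcond=None)
        est.append(quad_features(x[None, :]) @ beta)
    order = np.argsort(np.ravel(est))
    return offspring[order[:n_keep]]
```

Swapping the quadratic fit for a cubic RBF interpolant gives the RBF variant compared in the article.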
Journal of Global Optimization | 2007
Rommel G. Regis; Christine A. Shoemaker
We propose some strategies that can be shown to improve the performance of the radial basis function (RBF) method by Gutmann [J. Global Optim. 19(3), 201–227 (2001a)] (Gutmann-RBF) and the RBF method by Regis and Shoemaker [J. Global Optim. 31, 153–171 (2005)] (CORS–RBF) on some test problems when they are initialized by symmetric Latin hypercube designs (SLHDs). Both methods are designed for the global optimization of computationally expensive functions with multiple local optima. We demonstrate how the original implementation of Gutmann-RBF can sometimes converge slowly to the global minimum on some test problems because of its failure to do local search. We then propose Controlled Gutmann-RBF (CG-RBF), which is a modification of Gutmann-RBF where the function evaluation point in each iteration is restricted to a subregion of the domain centered around a global minimizer of the current RBF model. By varying the size of this subregion in different iterations, we ensure a better balance between local and global search. Moreover, we propose a complete restart strategy for CG-RBF and CORS-RBF whenever the algorithm fails to make any substantial progress after some threshold number of consecutive iterations. Computational experiments on the seven Dixon and Szegö [Towards Global Optimization, pp. 1–13. North-Holland, Amsterdam (1978)] test problems and on nine Schoen [J. Global Optim. 3, 133–137 (1993)] test problems indicate that the proposed strategies yield significantly better performance on some problems. The results also indicate that, for some fixed setting of the restart parameters, the two modified RBF algorithms, namely CG-RBF-Restart and CORS-RBF-Restart, are comparable on the test problems considered. Finally, we examine the sensitivity of CG-RBF-Restart and CORS-RBF-Restart to the restart parameters.
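The two modifications described above, restricting search to a subregion around the RBF model minimizer and restarting on stagnation, can each be sketched in a few lines. These helpers are illustrative assumptions (the paper's actual subregion schedule and progress test differ in detail).

```python
import numpy as np

def cg_rbf_subregion(x_model_min, lb, ub, beta):
    """Restrict the next search to a box of half-width beta * (ub - lb),
    centered at the current RBF model minimizer and clipped to the domain.
    Varying beta across iterations balances global vs. local search."""
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    half = beta * (ub - lb)
    return np.maximum(lb, x_model_min - half), np.minimum(ub, x_model_min + half)

def should_restart(best_history, window, min_rel_improve):
    """Trigger a complete restart when the best value has not improved by a
    relative threshold over the last `window` iterations (a hypothetical
    stand-in for the paper's 'no substantial progress' test)."""
    if len(best_history) <= window:
        return False
    old, new = best_history[-window - 1], best_history[-1]
    return (old - new) <= min_rel_improve * max(1.0, abs(old))
```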
Computers & Operations Research | 2011
Rommel G. Regis
This paper presents a new algorithm for derivative-free optimization of expensive black-box objective functions subject to expensive black-box inequality constraints. The proposed algorithm, called ConstrLMSRBF, uses radial basis function (RBF) surrogate models and is an extension of the Local Metric Stochastic RBF (LMSRBF) algorithm by Regis and Shoemaker (2007a) [1] that can handle black-box inequality constraints. Previous algorithms for the optimization of expensive functions using surrogate models have mostly dealt with bound constrained problems where only the objective function is expensive, and so, the surrogate models are used to approximate the objective function only. In contrast, ConstrLMSRBF builds RBF surrogate models for the objective function and also for all the constraint functions in each iteration, and uses these RBF models to guide the selection of the next point where the objective and constraint functions will be evaluated. Computational results indicate that ConstrLMSRBF is better than alternative methods on 9 out of 14 test problems and on the MOPTA08 problem from the automotive industry (Jones, 2008 [2]). The MOPTA08 problem has 124 decision variables and 68 inequality constraints and is considered a large-scale problem in the area of expensive black-box optimization. The alternative methods include a Mesh Adaptive Direct Search (MADS) algorithm (Abramson and Audet, 2006 [3]; Audet and Dennis, 2006 [4]) that uses a kriging-based surrogate model, the Multistart LMSRBF algorithm by Regis and Shoemaker (2007a) [1] modified to handle black-box constraints via a penalty approach, a genetic algorithm, a pattern search algorithm, a sequential quadratic programming algorithm, and COBYLA (Powell, 1994 [5]), which is a derivative-free trust-region algorithm. 
Based on the results of this study, the results in Jones (2008) [2], and other approaches presented at the ISMP 2009 conference, ConstrLMSRBF appears to be among the best known algorithms, if not the best, for the MOPTA08 problem in the sense of providing the most improvement from an initial feasible solution within a very limited number of objective and constraint function evaluations.
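The distinctive step, using constraint surrogates as well as an objective surrogate to pick the next evaluation point, can be sketched as a candidate-filtering rule. This is a hedged illustration under assumptions (name, scaling, and fallback rule are all hypothetical, not the published algorithm).

```python
import numpy as np

def constr_select(obj_surr, con_surrs, evaluated_pts, candidates, w):
    """Rank random candidates by a weighted value/distance score, but only
    among candidates that every constraint surrogate predicts feasible
    (g_j(x) <= 0); if none are predicted feasible, fall back to minimizing
    the total predicted constraint violation."""
    G = np.column_stack([g(candidates) for g in con_surrs])
    pred_feas = np.all(G <= 0, axis=1)
    if not np.any(pred_feas):
        viol = np.sum(np.maximum(G, 0.0), axis=1)
        return candidates[np.argmin(viol)]
    cand = candidates[pred_feas]
    s = np.asarray(obj_surr(cand), float)
    dmin = np.min(np.linalg.norm(cand[:, None, :] - evaluated_pts[None, :, :], axis=2), axis=1)

    def scale01(v):
        span = v.max() - v.min()
        return (v - v.min()) / span if span > 0 else np.zeros_like(v)

    score = w * scale01(s) + (1.0 - w) * (1.0 - scale01(dmin))
    return cand[np.argmin(score)]
```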
Journal of Computational and Graphical Statistics | 2008
Nikolay Bliznyuk; David Ruppert; Christine A. Shoemaker; Rommel G. Regis; Stefan M. Wild; Pradeep Mugunthan
We present a Bayesian approach to model calibration when evaluation of the model is computationally expensive. Here, calibration is a nonlinear regression problem: given a data vector Y corresponding to the regression model f(β), find plausible values of β. As an intermediate step, Y and f are embedded into a statistical model allowing transformation and dependence. Typically, this problem is solved by sampling from the posterior distribution of β given Y using MCMC. To reduce computational cost, we limit evaluation of f to a small number of points chosen on a high posterior density region found by optimization. Then, we approximate the logarithm of the posterior density using radial basis functions and use the resulting cheap-to-evaluate surface in MCMC. We illustrate our approach on simulated data for a pollutant diffusion problem and study the frequentist coverage properties of credible intervals. Our experiments indicate that our method can produce results similar to those when the true “expensive” posterior density is sampled by MCMC while reducing computational costs by well over an order of magnitude.
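Once the log-posterior has been replaced by a cheap surrogate surface, standard MCMC applies unchanged. The sketch below is a generic random-walk Metropolis sampler, not the paper's sampler; in the approach above, the RBF approximation of the log-posterior would be passed in as `logpost`.

```python
import numpy as np

def metropolis(logpost, x0, n_steps, step, rng=None):
    """Random-walk Metropolis on a cheap-to-evaluate log-posterior surface.
    `logpost` maps a parameter vector to an (unnormalized) log density."""
    rng = np.random.default_rng(rng)
    x = np.asarray(x0, float)
    lp = logpost(x)
    chain = [x.copy()]
    for _ in range(n_steps):
        prop = x + step * rng.standard_normal(x.shape)
        lp_prop = logpost(prop)
        if np.log(rng.uniform()) < lp_prop - lp:   # accept w.p. min(1, ratio)
            x, lp = prop, lp_prop
        chain.append(x.copy())
    return np.array(chain)
```

Because each `logpost` call hits the surrogate rather than the expensive simulation, tens of thousands of MCMC steps become affordable.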
Engineering Optimization | 2013
Rommel G. Regis; Christine A. Shoemaker
This article presents the DYCORS (DYnamic COordinate search using Response Surface models) framework for surrogate-based optimization of HEB (High-dimensional, Expensive, and Black-box) functions that incorporates an idea from the DDS (Dynamically Dimensioned Search) algorithm. The iterate is selected from random trial solutions obtained by perturbing only a subset of the coordinates of the current best solution. Moreover, the probability of perturbing a coordinate decreases as the algorithm reaches the computational budget. Two DYCORS algorithms that use RBF (Radial Basis Function) surrogates are developed: DYCORS-LMSRBF is a modification of the LMSRBF algorithm while DYCORS-DDSRBF is an RBF-assisted DDS. Numerical results on a 14-D watershed calibration problem and on eleven 30-D and 200-D test problems show that DYCORS algorithms are generally better than EGO, DDS, LMSRBF, MADS with kriging, SQP, an RBF-assisted evolution strategy, and a genetic algorithm. Hence, DYCORS is a promising approach for watershed calibration and for HEB optimization.
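The core DYCORS mechanism, perturbing only a random subset of coordinates with a probability that decays as the budget is consumed, can be sketched as follows. The decay schedule shown is one published choice of form; treat the function name, schedule, and parameters as illustrative assumptions rather than the article's exact implementation.

```python
import numpy as np

def dycors_trials(x_best, lb, ub, n_eval, max_eval, n0, n_trials, sigma, rng=None):
    """Generate trial points by perturbing a random subset of coordinates of
    the current best solution. Each coordinate is perturbed with probability
    p = min(20/d, 1) * (1 - ln(n_eval - n0 + 1) / ln(max_eval - n0)),
    which decays toward 0 as n_eval approaches the budget max_eval."""
    rng = np.random.default_rng(rng)
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    d = len(x_best)
    p = min(20.0 / d, 1.0) * (1.0 - np.log(n_eval - n0 + 1) / np.log(max_eval - n0))
    trials = np.tile(x_best, (n_trials, 1))
    for t in range(n_trials):
        mask = rng.uniform(size=d) < p
        if not mask.any():                 # always perturb at least one coordinate
            mask[rng.integers(d)] = True
        trials[t, mask] += sigma * (ub[mask] - lb[mask]) * rng.standard_normal(mask.sum())
    return np.clip(trials, lb, ub)
```

In high dimensions, perturbing only a few coordinates per trial keeps the search local enough for the RBF surrogate to rank trials reliably.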
Engineering Optimization | 2014
Rommel G. Regis
This article develops two new algorithms for constrained expensive black-box optimization that use radial basis function surrogates for the objective and constraint functions. These algorithms are called COBRA and Extended ConstrLMSRBF and, unlike previous surrogate-based approaches, they can be used for high-dimensional problems where all initial points are infeasible. They both follow a two-phase approach where the first phase finds a feasible point while the second phase improves this feasible point. COBRA and Extended ConstrLMSRBF are compared with alternative methods on 20 test problems and on the MOPTA08 benchmark automotive problem (D.R. Jones, Presented at MOPTA 2008), which has 124 decision variables and 68 black-box inequality constraints. The alternatives include a sequential penalty derivative-free algorithm, a direct search method with kriging surrogates, and two multistart methods. Numerical results show that COBRA algorithms are competitive with Extended ConstrLMSRBF and they generally outperform the alternatives on the MOPTA08 problem and most of the test problems.
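The two-phase structure, first driving toward feasibility, then improving the objective while the constraint surrogates predict feasibility, can be sketched as a single selection rule over candidate points. This is a simplified illustration under assumptions: COBRA actually solves surrogate subproblems with a local optimizer and adapts its feasibility margin, while the helper below just filters candidates.

```python
import numpy as np

def two_phase_next(obj_surr, con_surrs, candidates, have_feasible, margin=0.0):
    """Phase 1 (no feasible point yet): pick the candidate minimizing total
    predicted constraint violation. Phase 2: minimize the objective surrogate
    among candidates predicted feasible with a safety margin (g_j(x) <= -margin)."""
    G = np.column_stack([g(candidates) for g in con_surrs])
    if not have_feasible:
        return candidates[np.argmin(np.sum(np.maximum(G, 0.0), axis=1))]
    feas = np.all(G <= -margin, axis=1)
    pool = candidates[feas] if feas.any() else candidates
    return pool[np.argmin(obj_surr(pool))]
```

The margin guards against surrogate error near the constraint boundary, a key concern when all constraints are black boxes.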
Journal of Global Optimization | 2013
Rommel G. Regis; Christine A. Shoemaker
We present the AQUARS (A QUAsi-multistart Response Surface) framework for finding the global minimum of a computationally expensive black-box function subject to bound constraints. In a traditional multistart approach, the local search method is blind to the trajectories of the previous local searches. Hence, the algorithm might find the same local minima even if the searches are initiated from points that are far apart. In contrast, AQUARS is a novel approach that locates the promising local minima of the objective function by performing local searches near the local minima of a response surface (RS) model of the objective function. It ignores neighborhoods of fully explored local minima of the RS model and it bounces between the best partially explored local minimum and the least explored local minimum of the RS model. We implement two AQUARS algorithms that use a radial basis function model and compare them with alternative global optimization methods on an 8-dimensional watershed model calibration problem and on 18 test problems. The alternatives include EGO, GLOBALm, MLMSRBF (Regis and Shoemaker in INFORMS J Comput 19(4):497–509, 2007), CGRBF-Restart (Regis and Shoemaker in J Global Optim 37(1):113–135, 2007), and multilevel single linkage (MLSL) coupled with two types of local solvers: SQP and Mesh Adaptive Direct Search (MADS) combined with kriging. The results show that the AQUARS methods generally use fewer function evaluations to identify the global minimum or to reach a target value compared to the alternatives. In particular, they are much better than EGO and MLSL coupled to MADS with kriging on the watershed calibration problem and on 15 of the test problems.
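The "bouncing" rule described above, skipping fully explored RS-model minima and alternating between the best partially explored minimum and the least explored one, can be sketched as a small selection helper. Names, the exploration counter, and the threshold are all hypothetical stand-ins for the paper's bookkeeping.

```python
import numpy as np

def aquars_pick_center(rs_minima, n_evals_near, best_vals, full_threshold, toggle):
    """Choose the next local-search center among local minima of the response
    surface model: skip minima whose neighborhoods are fully explored
    (n_evals_near >= full_threshold), then alternate (toggle) between the
    best partially explored minimum and the least explored one."""
    active = [i for i in range(len(rs_minima)) if n_evals_near[i] < full_threshold]
    if not active:
        return None        # every RS-model minimum is fully explored
    if toggle:             # exploit: best partially explored minimum
        i = min(active, key=lambda j: best_vals[j])
    else:                  # explore: least explored minimum
        i = min(active, key=lambda j: n_evals_near[j])
    return rs_minima[i]
```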