Pooriya Beyhaghi
University of California, San Diego
Publication
Featured research published by Pooriya Beyhaghi.
Journal of Global Optimization | 2016
Pooriya Beyhaghi; Daniele Cavaglieri; Thomas R. Bewley
A new derivative-free optimization algorithm is introduced for nonconvex functions within a feasible domain bounded by linear constraints. Global convergence is guaranteed for twice differentiable functions with bounded Hessian, and the algorithm is found to be remarkably efficient even for many functions which are not differentiable. Like other Response Surface Methods, at each optimization step, the algorithm minimizes a metric combining an interpolation of existing function evaluations and a model of the uncertainty of this interpolation. By adjusting the respective weighting of these two terms, the algorithm incorporates a tunable balance between global exploration and local refinement; a rule to adjust this balance automatically is also presented. Unlike other methods, any well-behaved interpolation strategy may be used. The uncertainty model is built upon the framework of a Delaunay triangulation of existing datapoints in parameter space. A quadratic function which goes to zero at each datapoint is formed within each simplex of this triangulation; the union of these quadratics forms the desired uncertainty model. Care is taken to ensure that function evaluations are performed at points that are well situated in parameter space; that is, such that the simplices of the resulting triangulation have circumradii with a known bound. This facilitates well-behaved local refinement as additional function evaluations are performed.
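The piecewise-quadratic uncertainty model described in this abstract can be sketched concretely. Within each Delaunay simplex, a quadratic that vanishes at every vertex is given by e(x) = r² − ‖x − z‖², where z and r are the circumcenter and circumradius of that simplex (every vertex lies on the circumsphere, so e is zero there and positive inside). This is a minimal illustration consistent with the abstract's description, not the paper's full algorithm; the function names are ours.

```python
import numpy as np

def circumsphere(vertices):
    """Circumcenter and circumradius of a d-simplex.

    vertices: (d+1, d) array of simplex vertices.
    Solves the linear system 2 (v_i - v_0) . z = |v_i|^2 - |v_0|^2,
    which states that z is equidistant from all vertices.
    """
    v0 = vertices[0]
    A = 2.0 * (vertices[1:] - v0)
    b = np.sum(vertices[1:] ** 2, axis=1) - np.sum(v0 ** 2)
    center = np.linalg.solve(A, b)
    radius = np.linalg.norm(center - v0)
    return center, radius

def uncertainty(x, vertices):
    """Quadratic uncertainty e(x) = r^2 - ||x - z||^2 within one simplex.

    Zero at each vertex (datapoint), positive in the interior, and
    bounded by r^2 -- which is why the algorithm keeps the circumradii
    of the triangulation under a known bound.
    """
    z, r = circumsphere(vertices)
    return r ** 2 - np.linalg.norm(np.asarray(x) - z) ** 2
```

The global uncertainty model is then the union of these per-simplex quadratics over the whole triangulation.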
58th AIAA/ASCE/AHS/ASC Structures, Structural Dynamics, and Materials Conference | 2017
Shahrouz Ryan Alimo; Pooriya Beyhaghi; Gianluca Meneghello; Thomas R. Bewley
Delaunay-based derivative-free optimization leveraging global surrogates (∆-DOGS) is a recently developed optimization algorithm designed for nonsmooth functions in a handful of adjustable parameters. The first implementation of the original ∆-DOGS algorithm used polyharmonic splines to develop an inexpensive interpolating “surrogate” of the (expensive) function of interest. The behavior of this surrogate was found to be irregular in cases for which the function of interest turned out to be much more strongly dependent on some of the adjustable parameters than others. This irregularity of the surrogate led to the optimization algorithm requiring many more function evaluations than might have otherwise been necessary. In the present work, a modified interpolation strategy, dubbed multivariate adaptive polyharmonic splines (MAPS), is proposed to mitigate this irregular behavior, thereby accelerating the convergence of ∆-DOGS. The MAPS approach modifies the natural polyharmonic spline (NPS) approach by rescaling the parameters according to their significance in the optimization problem based on the data available at each iteration. This regularization of the NPS approach ultimately reduces the number of function evaluations required by ∆-DOGS to achieve a specified level of convergence in optimization problems characterized by parameters of varying degrees of significance. The importance of this rescaling of the parameters during the interpolation step is problem specific. To quantify its beneficial impact on a practical problem, we compare ∆-DOGS with MAPS to ∆-DOGS with NPS on an application related to hydrofoil shape optimization in seven parameters; results indicate a notable acceleration of convergence leveraging the MAPS approach.
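The rescaling idea can be sketched as a cubic polyharmonic-spline interpolant fitted in scaled coordinates. This is a minimal illustration under stated assumptions: the `scales` argument is a stand-in for the per-parameter significance weights (the adaptive choice of those weights is the substance of MAPS and is not reproduced here); passing all-ones scales recovers the natural polyharmonic spline (NPS).

```python
import numpy as np

def fit_polyharmonic(X, y, scales):
    """Cubic polyharmonic spline interpolant in rescaled coordinates.

    X: (n, d) datapoints, y: (n,) function values,
    scales: (d,) per-parameter weights (MAPS would adapt these to each
    parameter's significance; ones gives the natural spline, NPS).
    Returns a callable interpolant f(x).
    """
    Xs = X * scales
    n, d = Xs.shape
    # Radial part: phi(r) = r^3, plus a linear polynomial tail.
    r = np.linalg.norm(Xs[:, None, :] - Xs[None, :, :], axis=-1)
    Phi = r ** 3
    P = np.hstack([np.ones((n, 1)), Xs])
    A = np.block([[Phi, P], [P.T, np.zeros((d + 1, d + 1))]])
    rhs = np.concatenate([y, np.zeros(d + 1)])
    coef = np.linalg.solve(A, rhs)
    w, v = coef[:n], coef[n:]

    def interp(x):
        xs = np.asarray(x) * scales
        rr = np.linalg.norm(Xs - xs, axis=1)
        return rr ** 3 @ w + v[0] + xs @ v[1:]

    return interp
```

Stretching a coordinate in which the function varies strongly (scale > 1) makes the isotropic radial kernel better matched to the data, which is the regularization effect the abstract describes.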
Journal of Global Optimization | 2017
Pooriya Beyhaghi; Thomas R. Bewley
This paper introduces a modification of our original Delaunay-based optimization algorithm (developed in JOGO DOI:10.1007/s10898-015-0384-2) that reduces the number of function evaluations on the boundary of feasibility as compared with the original algorithm. A weakness we have identified with the original algorithm is the sometimes faulty behavior of the generated uncertainty function near the boundary of feasibility, which leads to more function evaluations along the boundary of feasibility than might otherwise be necessary. To address this issue, a second search function is introduced which has improved behavior near the boundary of the search domain. Additionally, the datapoints are quantized onto a Cartesian grid, which is successively refined, over the search domain. These two modifications lead to a significant reduction of datapoints accumulating on the boundary of feasibility, and faster overall convergence.
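The grid quantization mentioned in the abstract can be sketched in a few lines: candidate points are snapped to the nearest node of a Cartesian grid over the search domain, and incrementing the refinement level halves the grid spacing. A minimal sketch under our own naming; the paper's rules for when to refine are not reproduced here.

```python
import numpy as np

def quantize(x, lower, upper, level):
    """Snap x to the nearest node of a Cartesian grid over [lower, upper].

    The grid has 2**level cells per dimension, so successive levels are
    nested: every node of level k is also a node of level k+1.
    Works elementwise on scalars or numpy arrays.
    """
    n = 2 ** level
    t = np.round((x - lower) / (upper - lower) * n)
    return lower + t / n * (upper - lower)
```

Because the grids nest, previously evaluated datapoints remain grid points after each refinement, so no evaluations are wasted by the quantization.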
21st AIAA Computational Fluid Dynamics Conference | 2013
Daniele Cavaglieri; Pooriya Beyhaghi; Thomas R. Bewley
Implicit/Explicit (IMEX) Runge-Kutta (RK) schemes are effective for time-marching ODE systems with both stiff and nonstiff terms on the RHS; such schemes implement an (often A-stable or better) implicit RK scheme for the stiff part of the ODE, which is often linear, and, simultaneously, a (more convenient) explicit RK scheme for the nonstiff part of the ODE, which is often nonlinear. Low-storage RK schemes are especially effective for time-marching high-dimensional ODE discretizations of PDE systems on modern (cache-based) computational hardware, in which memory management is often the most significant computational bottleneck. In this paper, we develop one second-order, three third-order, and one fourth-order IMEX-RK schemes of the low-storage variety, all of which have the same or comparable low storage requirements, better stability properties, and either fewer or slightly more floating-point operations per timestep than the venerable (and, up to now, unique) second-order two-register Crank-Nicolson/Runge-Kutta-Wray (CN/RKW3) IMEX-RK algorithm that has dominated the DNS/LES literature for the last two decades.
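The CN/RKW3 baseline named in this abstract can be sketched for the scalar model problem du/dt = L·u + N(u): Crank-Nicolson-type implicit treatment of the stiff linear term L, three-stage third-order RKW3 treatment of the nonstiff term N, with only the state and the previous N evaluation held between stages. The coefficients below are the classical Spalart-Moser-Rogers (1991) values; the new low-storage IMEX-RK schemes the paper develops are not reproduced here.

```python
import numpy as np

# CN/RKW3 stage coefficients (Spalart, Moser & Rogers, 1991):
# implicit Crank-Nicolson-type weights (ALPHA, BETA) for the stiff
# linear term, explicit RKW3 weights (GAMMA, ZETA) for the nonstiff term.
ALPHA = [29 / 96, -3 / 40, 1 / 6]
BETA  = [37 / 160, 5 / 24, 1 / 6]
GAMMA = [8 / 15,   5 / 12, 3 / 4]
ZETA  = [0.0,    -17 / 60, -5 / 12]

def cn_rkw3_step(u, dt, L, N):
    """One CN/RKW3 step of du/dt = L*u + N(u) for scalar L.

    Low-storage character: besides u, only the previous stage's N
    evaluation (N_old) is carried between stages -- two registers.
    """
    N_old = 0.0
    for a, b, g, z in zip(ALPHA, BETA, GAMMA, ZETA):
        N_new = N(u)
        # Implicit in the L term (solve for the new u), explicit in N.
        u = (u + dt * (a * L * u + g * N_new + z * N_old)) / (1.0 - dt * b * L)
        N_old = N_new
    return u
```

For a PDE discretization, L is a (large, often constant) linear operator and the division becomes a linear solve; the scalar form above is only meant to show the stage structure.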
arXiv: Statistics Theory | 2018
Pooriya Beyhaghi; Shahrouz Alimohammadi; Thomas R. Bewley
F1000Research | 2018
Ryan Alimo; Pooriya Beyhaghi; Muhan Zhao; Thomas R. Bewley
Conference on Decision and Control | 2017
Shahrouz Ryan Alimo; Pooriya Beyhaghi; Thomas R. Bewley
Bulletin of the American Physical Society | 2016
Pooriya Beyhaghi
Bulletin of the American Physical Society | 2016
Shahrouz Alimohammadi; Daniele Cavaglieri; Pooriya Beyhaghi; Thomas R. Bewley
Bulletin of the American Physical Society | 2016
Gianluca Meneghello; Pooriya Beyhaghi; Thomas R. Bewley