Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Yanlai Chen is active.

Publication


Featured research published by Yanlai Chen.


Journal of Scientific Computing | 2016

A Reduced Radial Basis Function Method for Partial Differential Equations on Irregular Domains

Yanlai Chen; Sigal Gottlieb; Alfa R. H. Heryudono; Akil Narayan

We propose and test the first Reduced Radial Basis Function Method for solving parametric partial differential equations on irregular domains. The two major ingredients are a stable Radial Basis Function (RBF) solver that has an optimized set of centers chosen through a reduced-basis-type greedy algorithm, and a collocation-based model reduction approach that systematically generates a reduced-order approximation whose dimension is orders of magnitude smaller than the total number of RBF centers. The resulting algorithm is efficient and accurate as demonstrated through two- and three-dimensional test problems.
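The reduced-basis-type greedy selection of RBF centers can be sketched in a minimal, hypothetical form: seed with one center, then repeatedly add the point where the current fit's residual is largest and refit. This is an illustrative 1D interpolation analogue under assumed function names and parameters, not the paper's PDE solver:

```python
import numpy as np

def gaussian_rbf(r, eps=2.0):
    """Gaussian radial basis function with shape parameter eps."""
    return np.exp(-(eps * r) ** 2)

def greedy_rbf_interpolate(x, f, n_centers=8):
    """Greedily add the point where the current fit's residual is
    largest as a new RBF center, then refit by least squares.
    Illustrative 1D analogue of reduced-basis-type center selection."""
    centers = [x[np.argmax(np.abs(f))]]  # seed: point with largest |f|
    for _ in range(n_centers - 1):
        A = gaussian_rbf(np.abs(x[:, None] - np.array(centers)[None, :]))
        coef, *_ = np.linalg.lstsq(A, f, rcond=None)
        residual = f - A @ coef
        centers.append(x[np.argmax(np.abs(residual))])
    A = gaussian_rbf(np.abs(x[:, None] - np.array(centers)[None, :]))
    coef, *_ = np.linalg.lstsq(A, f, rcond=None)
    return np.array(centers), coef
```

The same residual-maximizing loop, driven by a PDE error estimate instead of an interpolation residual, is the mechanism the abstract refers to.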


Journal of Scientific Computing | 2013

Reduced Collocation Methods: Reduced Basis Methods in the Collocation Framework

Yanlai Chen; Sigal Gottlieb

In this paper, we present the first reduced basis method well-suited for the collocation framework. Two fundamentally different algorithms are presented: the so-called Least Squares Reduced Collocation Method (LSRCM) and Empirical Reduced Collocation Method (ERCM). This work provides a reduced basis strategy to practitioners who prefer a collocation, rather than Galerkin, approach. Furthermore, the empirical reduced collocation method eliminates a potentially costly online procedure that is needed for non-affine problems with Galerkin approach. Numerical results demonstrate the high efficiency and accuracy of the reduced collocation methods, which match or exceed that of the traditional reduced basis method in the Galerkin framework.
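The least-squares flavor of reduced collocation can be illustrated on a toy parametric problem; the equation, discretization, and training parameters below are assumptions chosen for a self-contained sketch, not taken from the paper:

```python
import numpy as np

# Toy parametric problem (an assumption for illustration): solve
# -u'' + mu * u = 1 on (0, 1) with zero boundary values, discretized
# by second-order finite differences on n interior points.
def full_solve(mu, n=40):
    h = 1.0 / (n + 1)
    D2 = (np.diag(-2.0 * np.ones(n))
          + np.diag(np.ones(n - 1), 1)
          + np.diag(np.ones(n - 1), -1)) / h**2
    A = -D2 + mu * np.eye(n)
    f = np.ones(n)
    return np.linalg.solve(A, f), A, f

# Offline stage: full solves at a few training parameters; an
# orthonormal basis V of the snapshots spans the reduced space.
snapshots = np.array([full_solve(mu)[0] for mu in (0.1, 1.0, 10.0)]).T
V, _ = np.linalg.qr(snapshots)

# Online stage (least-squares flavor): impose the full set of
# collocation equations on the reduced space in the least-squares sense.
def reduced_solve(mu):
    _, A, f = full_solve(mu)
    c, *_ = np.linalg.lstsq(A @ V, f, rcond=None)
    return V @ c
```

For a new parameter, `reduced_solve` works with a 3-dimensional system in place of the full 40-dimensional one, which is the source of the online speedup.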


Journal of Scientific Computing | 2017

Offline-Enhanced Reduced Basis Method Through Adaptive Construction of the Surrogate Training Set

Jiahua Jiang; Yanlai Chen; Akil Narayan

The reduced basis method (RBM) is a popular certified model reduction approach for solving parametrized partial differential equations. One critical stage of the offline portion of the algorithm is a greedy algorithm, requiring maximization of an error estimate over parameter space. In practice this maximization is usually performed by replacing the parameter domain continuum with a discrete “training” set. When the dimension of parameter space is large, it is necessary to significantly increase the size of this training set in order to effectively search parameter space. Large training sets diminish the attractiveness of RBM algorithms since this proportionally increases the cost of the offline phase. In this work we propose novel strategies for offline RBM algorithms that mitigate the computational difficulty of maximizing error estimates over a training set. The main idea is to identify a subset of the training set, a “surrogate training set” (STS), on which to perform greedy algorithms. The STS we construct is much smaller in size than the full training set, yet our examples suggest that it is accurate enough to induce the solution manifold of interest at the current offline RBM iteration. We propose two algorithms to construct the STS: our first algorithm, the successive maximization method, is inspired by inverse transform sampling for non-standard univariate probability distributions. The second constructs an STS by identifying pivots in the Cholesky decomposition of an approximate error correlation matrix. We demonstrate the algorithm through numerical experiments, showing that it is capable of accelerating offline RBM procedures without degrading accuracy, assuming that the solution manifold has rapidly decaying Kolmogorov width.
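The second STS construction can be sketched as a partial pivoted Cholesky factorization: at each step, pick the index with the largest remaining diagonal entry, i.e. the candidate whose error component is least explained by the pivots already chosen. This is a generic sketch; the matrix `G` here is a stand-in for the paper's approximate error correlation matrix:

```python
import numpy as np

def pivoted_cholesky_pivots(G, k):
    """First k pivots of a partial pivoted Cholesky factorization of a
    symmetric positive semidefinite matrix G: greedily pick the index
    with the largest remaining diagonal, i.e. the candidate least
    explained by the pivots already chosen."""
    n = G.shape[0]
    d = np.diag(G).astype(float).copy()   # remaining diagonal
    L = np.zeros((n, k))
    pivots = []
    for j in range(k):
        p = int(np.argmax(d))
        pivots.append(p)
        L[:, j] = (G[:, p] - L @ L[p, :]) / np.sqrt(d[p])
        d -= L[:, j] ** 2
        d[p] = 0.0                        # exact zero at the pivot
    return pivots
```

The returned pivot indices select the surrogate training set, which is then searched by the usual greedy algorithm in place of the full training set.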


SIAM/ASA Journal on Uncertainty Quantification | 2016

A Goal-Oriented Reduced Basis Methods-Accelerated Generalized Polynomial Chaos Algorithm

Jiahua Jiang; Yanlai Chen; Akil Narayan

The nonintrusive generalized polynomial chaos (gPC) method is a popular computational approach for solving partial differential equations with random inputs. The main hurdle preventing its efficient direct application for high-dimensional input parameters is that the size of many parametric sampling meshes grows exponentially in the number of inputs (the “curse of dimensionality”). In this paper, we design a weighted version of the reduced basis method (RBM) for use in the nonintrusive gPC framework. We construct an RBM surrogate that can rigorously achieve a user-prescribed error tolerance and ultimately is used to more efficiently compute a gPC approximation nonintrusively. The algorithm is capable of speeding up traditional nonintrusive gPC methods by orders of magnitude without degrading accuracy, assuming that the solution manifold has low Kolmogorov width. Numerical experiments on our test problems show that the relative efficiency improves as the parametric dimension increases.
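The nonintrusive projection step can be sketched in one random dimension: the gPC coefficients of a model output u(ξ), with ξ uniform on [-1, 1], are Legendre projections computed by quadrature. The univariate setting and function names are illustrative assumptions; the paper's contribution is replacing the expensive model evaluations at the quadrature nodes with a certified RBM surrogate:

```python
import numpy as np

def gpc_coefficients(model, degree=4, n_quad=8):
    """Nonintrusive gPC in one uniform random dimension: project the
    model output u(xi), xi ~ U(-1, 1), onto Legendre polynomials via
    Gauss-Legendre quadrature. The model(nodes) call is where an RBM
    surrogate would replace the expensive full solver."""
    nodes, weights = np.polynomial.legendre.leggauss(n_quad)
    u = model(nodes)                      # full (or surrogate) solves
    coeffs = []
    for k in range(degree + 1):
        Pk = np.polynomial.legendre.legval(nodes, [0.0] * k + [1.0])
        norm = 2.0 / (2 * k + 1)          # integral of P_k^2 over [-1, 1]
        coeffs.append(np.sum(weights * u * Pk) / norm)
    return np.array(coeffs)
```

For instance, a model u(ξ) = 1 + ξ yields coefficients (1, 1, 0, …), since it is exactly P_0 + P_1. In higher dimensions the quadrature grid grows exponentially, which is why cheapening each model evaluation matters.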


Journal of Scientific Computing | 2018

A Foreword to the Special Issue in Honor of Professor Bernardo Cockburn on His 60th Birthday: A Life Time of Discontinuous Schemings

Yanlai Chen; Bo Dong; Chi-Wang Shu

We present this special issue of the Journal of Scientific Computing to celebrate Bernardo Cockburn’s sixtieth birthday. The theme of this issue is discontinuous Galerkin methods, a hallmark of Bernardo’s distinguished professional career. This foreword provides an informal but rigorous account of what enabled Bernardo’s achievements, based on the concluding presentation he gave at the IMA workshop “Recent Advances and Challenges in Discontinuous Galerkin Methods and Related Approaches” on July 1, 2017, which was widely regarded as the best lecture of his career so far.


Archive | 2017

Robust residual-based and residual-free greedy algorithms for reduced basis methods

Yanlai Chen; Jiahua Jiang; Akil Narayan


arXiv: Numerical Analysis | 2018

Certified reduced basis methods for fractional Laplace equations via extension

Harbir Antil; Yanlai Chen; Akil Narayan


arXiv: Numerical Analysis | 2018

A robust error estimator and a residual-free error indicator for reduced basis methods

Yanlai Chen; Jiahua Jiang; Akil Narayan


Mathematical Modelling and Numerical Analysis | 2018

Optimally convergent hybridizable discontinuous Galerkin method for fifth-order Korteweg-de Vries type equations

Bo Dong; Yanlai Chen; Jiahua Jiang


arXiv: Numerical Analysis | 2016

A goal-oriented RBM-Accelerated generalized polynomial chaos algorithm

Jiahua Jiang; Yanlai Chen; Akil Narayan

Collaboration


Dive into Yanlai Chen's collaborations.

Top Co-Authors


Jiahua Jiang

University of Massachusetts Dartmouth


Sigal Gottlieb

University of Massachusetts Dartmouth


Alfa R. H. Heryudono

University of Massachusetts Dartmouth


Bo Dong

University of Massachusetts Dartmouth


Harbir Antil

George Mason University
