
Publication


Featured research published by Michael S. Eldred.


Archive | 2011

DAKOTA: A Multilevel Parallel Object-Oriented Framework for Design Optimization, Parameter Estimation, Uncertainty Quantification, and Sensitivity Analysis

Michael S. Eldred; Dena M. Vigil; Keith R. Dalbey; William J. Bohnhoff; Brian M. Adams; Laura Painton Swiler; Sophia Lefantzi; Patricia Diane Hough; John P. Eddy

The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for …


AIAA Journal | 2008

Efficient Global Reliability Analysis for Nonlinear Implicit Performance Functions

Barron J. Bichon; Michael S. Eldred; Laura Painton Swiler; Sankaran Mahadevan; John McFarland

Many engineering applications are characterized by implicit response functions that are expensive to evaluate and sometimes nonlinear in their behavior, making reliability analysis difficult. This paper develops an efficient reliability analysis method that accurately characterizes the limit state throughout the random variable space. The method begins with a Gaussian process model built from a very small number of samples, and then adaptively chooses where to generate subsequent samples to ensure that the model is accurate in the vicinity of the limit state. The resulting Gaussian process model is then sampled using multimodal adaptive importance sampling to calculate the probability of exceeding (or failing to exceed) the response level of interest. By locating multiple points on or near the limit state, more complex and nonlinear limit states can be modeled, leading to more accurate probability integration. By concentrating the samples in the area where accuracy is important (i.e., in the vicinity of the limit state), only a small number of true function evaluations are required to build a quality surrogate model. The resulting method is both accurate for any arbitrarily shaped limit state and computationally efficient even for expensive response functions. This new method is applied to a collection of example problems including one that analyzes the reliability of a microelectromechanical system device that current available methods have difficulty solving either accurately or efficiently.


41st Aerospace Sciences Meeting and Exhibit | 2003

OVERVIEW OF MODERN DESIGN OF EXPERIMENTS METHODS FOR COMPUTATIONAL SIMULATIONS

Anthony A. Giunta; Steven F. Wojtkiewicz; Michael S. Eldred

The intent of this paper is to provide an overview of modern design of experiments (DOE) techniques that can be applied in computational engineering design studies. The term modern refers to DOE techniques specifically designed for use with deterministic computer simulations. In addition, this term is used to contrast classical DOE techniques that were developed for laboratory and field experiments that possess random error sources. Several types of modern DOE methods are described including pseudo-Monte Carlo sampling, quasi-Monte Carlo sampling, Latin hypercube sampling, orthogonal array sampling, and Hammersley sequence sampling.
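Of the methods listed, Latin hypercube sampling is the simplest to illustrate. A minimal sketch (not the paper's implementation): split each dimension into n equal-probability strata, place one jittered point per stratum, and shuffle the pairing across dimensions.

```python
import numpy as np

def latin_hypercube(n, d, seed=None):
    """LHS on the unit hypercube: each dimension is split into n
    equal-probability strata, one sample lands in each stratum, and the
    strata are paired across dimensions by independent random shuffles."""
    rng = np.random.default_rng(seed)
    u = (np.arange(n)[:, None] + rng.random((n, d))) / n   # one point per stratum
    for j in range(d):
        rng.shuffle(u[:, j])                               # decouple the dimensions
    return u

samples = latin_hypercube(10, 3, seed=0)
```

Each of the three columns of `samples` contains exactly one value per decile, which is the stratification property that distinguishes LHS from pseudo-Monte Carlo sampling.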


50th AIAA/ASME/ASCE/AHS/ASC Structures, Structural Dynamics, and Materials Conference | 2009

Recent Advances in Non-Intrusive Polynomial Chaos and Stochastic Collocation Methods for Uncertainty Analysis and Design

Michael S. Eldred

Non-intrusive polynomial chaos expansion (PCE) and stochastic collocation (SC) methods are attractive techniques for uncertainty quantification (UQ) due to their strong mathematical basis and ability to produce functional representations of stochastic variability. PCE estimates coefficients for known orthogonal polynomial basis functions based on a set of response function evaluations, using sampling, linear regression, tensor-product quadrature, or Smolyak sparse grid approaches. SC, on the other hand, forms interpolation functions for known coefficients, and requires the use of structured collocation point sets derived from tensor product or sparse grids. When tailoring the basis functions or interpolation grids to match the forms of the input uncertainties, exponential convergence rates can be achieved with both techniques for a range of probabilistic analysis problems. In addition, analytic features of the expansions can be exploited for moment estimation and stochastic sensitivity analysis. In this paper, the latest ideas for tailoring these expansion methods to numerical integration approaches will be explored, in which expansion formulations are modified to best synchronize with tensor-product quadrature and Smolyak sparse grids using linear and nonlinear growth rules. The most promising stochastic expansion approaches are then carried forward for use in new approaches for mixed aleatory-epistemic UQ, employing second-order probability approaches, and design under uncertainty, employing bilevel, sequential, and multifidelity approaches.
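The linear-regression route to PCE coefficients can be sketched for a single standard-normal input. The response function here is hypothetical, and the exact moment recovery below relies on the response lying in the span of the chosen Hermite basis.

```python
import math
import numpy as np
from numpy.polynomial.hermite_e import hermevander

rng = np.random.default_rng(1)
f = lambda x: x**2 + x        # hypothetical response of one standard-normal input

# Evaluate the response at sampled input points
x = rng.normal(size=200)
y = f(x)

# Least-squares fit of coefficients for He_0..He_3 (probabilists' Hermite basis)
V = hermevander(x, 3)                       # column k holds He_k(x)
c, *_ = np.linalg.lstsq(V, y, rcond=None)

# Analytic moments from orthogonality: E[He_j He_k] = k! when j == k, else 0
mean = c[0]
variance = sum(c[k]**2 * math.factorial(k) for k in range(1, 4))
```

Since x^2 + x = He_2(x) + He_1(x) + He_0(x), the fit recovers mean 1 and variance 1!*1 + 2!*1 = 3, matching the exact moments; this is the "analytic features of the expansions" point in the abstract.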


10th AIAA/ISSMO Multidisciplinary Analysis and Optimization Conference | 2004

Second-Order Corrections for Surrogate-Based Optimization with Model Hierarchies

Michael S. Eldred; Anthony A. Giunta; S. Collis

Surrogate-based optimization methods have become established as effective techniques for engineering design problems through their ability to tame nonsmoothness and reduce computational expense. In recent years, supporting mathematical theory has been developed to provide the foundation of provable convergence for these methods. One of the requirements of this provable convergence theory involves consistency between the surrogate model and the underlying truth model that it approximates. This consistency can be enforced through a variety of correction approaches, and is particularly essential in the case of surrogate-based optimization with model hierarchies. First-order additive and multiplicative corrections currently exist which satisfy consistency in values and gradients between the truth and surrogate models at a single point. This paper demonstrates that first-order consistency can be insufficient to achieve acceptable convergence rates in practice and presents new second-order additive, multiplicative, and combined corrections which can significantly accelerate convergence. These second-order corrections may enforce consistency with either the actual truth model Hessian or its finite difference, quasi-Newton, or Gauss-Newton approximation.
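A one-variable sketch of a second-order additive correction, using finite differences for the derivatives (one of the Hessian options the abstract names). Both models are hypothetical stand-ins for a truth/surrogate pair.

```python
import numpy as np

f_hi = lambda x: np.exp(x)     # hypothetical expensive truth model
f_lo = lambda x: 1.0 + x       # cheap low-fidelity model (truncated series)

def second_order_additive(x0, h=1e-5):
    """Additive correction alpha(x) such that f_lo + alpha matches f_hi in
    value, gradient, and (finite-difference) Hessian at the point x0."""
    d = lambda x: f_hi(x) - f_lo(x)
    d0 = d(x0)
    g = (d(x0 + h) - d(x0 - h)) / (2 * h)            # central first difference
    H = (d(x0 + h) - 2 * d0 + d(x0 - h)) / h**2      # central second difference
    return lambda x: f_lo(x) + d0 + g * (x - x0) + 0.5 * H * (x - x0) ** 2

corrected = second_order_additive(0.0)
```

At the correction point the surrogate now agrees with the truth model through second order, so its error grows only cubically with the step, which is what buys the faster trust-region convergence.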


49th AIAA/ASME/ASCE/AHS/ASC Structures, Structural Dynamics, and Materials Conference | 2008

Evaluation of Non-Intrusive Approaches for Wiener-Askey Generalized Polynomial Chaos

Michael S. Eldred; Clayton G. Webster; Paul G. Constantine

Polynomial chaos expansions (PCE) are an attractive technique for uncertainty quantification (UQ) due to their strong mathematical basis and ability to produce functional representations of stochastic variability. When tailoring the orthogonal polynomial bases to match the forms of the input uncertainties in a Wiener-Askey scheme, excellent convergence properties can be achieved for general probabilistic analysis problems. Non-intrusive PCE methods allow the use of simulations as black boxes within UQ studies, and involve the calculation of chaos expansion coefficients based on a set of response function evaluations. These methods may be characterized as being either Galerkin projection methods, using sampling or numerical integration, or regression approaches (also known as point collocation or stochastic response surfaces), using linear least squares. Numerical integration methods may be further categorized as either tensor product quadrature or sparse grid Smolyak cubature and as either isotropic or anisotropic. Experience with these approaches is presented for algebraic and PDE-based benchmark test problems, demonstrating the need for accurate, efficient coefficient estimation approaches that scale for problems with significant numbers of random variables.
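The Galerkin-projection route with tensor quadrature reduces, in one dimension, to a Gauss rule matched to the input density. A sketch for a standard-normal input (the cubic response is hypothetical):

```python
import math
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval

f = lambda x: x**3            # hypothetical response of one standard-normal input

# 10-point Gauss rule for the weight exp(-x^2/2); rescale so weights sum to 1
x, w = hermegauss(10)
w = w / math.sqrt(2 * math.pi)

# Spectral projection: c_k = E[f(X) He_k(X)] / E[He_k(X)^2], with E[He_k^2] = k!
coeffs = []
for k in range(5):
    e_k = np.zeros(k + 1); e_k[k] = 1.0     # coefficient vector selecting He_k
    coeffs.append(np.sum(w * f(x) * hermeval(x, e_k)) / math.factorial(k))
```

Because x^3 = He_3(x) + 3 He_1(x) and the 10-point rule integrates these low-degree products exactly, the projection recovers the coefficients [0, 3, 0, 1, 0]; in many dimensions this single rule becomes a tensor product or Smolyak sparse grid.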


8th Symposium on Multidisciplinary Analysis and Optimization | 2000

IMPLEMENTATION OF A TRUST REGION MODEL MANAGEMENT STRATEGY IN THE DAKOTA OPTIMIZATION TOOLKIT

Anthony A. Giunta; Michael S. Eldred

A trust region-based optimization method has been incorporated into the DAKOTA optimization software toolkit. This trust region approach is designed to manage surrogate models of the objective and constraint functions during the optimization process. In this method, the surrogate functions are employed in a sequence of optimization steps, where the original expensive objective and constraint functions are used to update the surrogates during the optimization process. This sequential approximate optimization (SAO) strategy is demonstrated on two test cases, with comparisons to optimization results obtained with a quasi-Newton method. For both test cases the SAO strategy exhibits desirable convergence trends. In the first test case involving a smooth function, the SAO strategy converges to a slightly better minimum than the quasi-Newton method, although it uses twice as many function evaluations. In the second test case involving a function with many local minima, the SAO strategy generally finds better local minima than does the quasi-Newton method. The performance of the SAO strategy on this second test case demonstrates the utility of using this optimization method on engineering optimization problems, many of which contain multiple local optima.
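The surrogate-management loop can be caricatured in one dimension: build a local surrogate inside the trust region, step to its minimizer, and accept the step only if the true improvement is a reasonable fraction of the predicted one. This sketch uses a quadratic interpolant and fixed accept/shrink rules, not DAKOTA's actual logic; the objective is a hypothetical stand-in for an expensive simulation.

```python
import numpy as np

f = lambda x: (x - 2.0) ** 2 + 0.1 * np.cos(5 * x)   # hypothetical "expensive" objective

def sao_minimize(x0, radius=1.0, iters=20):
    """Sequential approximate optimization with a basic trust-region test."""
    x = x0
    for _ in range(iters):
        pts = np.array([x - radius, x, x + radius])
        a, b, c = np.polyfit(pts, f(pts), 2)          # local quadratic surrogate
        cand = -b / (2 * a) if a > 0 else x - radius * np.sign(b)
        cand = np.clip(cand, x - radius, x + radius)  # stay inside the region
        predicted = f(x) - (a * cand**2 + b * cand + c)
        actual = f(x) - f(cand)
        if predicted > 0 and actual / predicted > 0.25:
            x = cand                                   # accept; allow region to grow
            radius = min(radius * 2.0, 1.0)            # cap at the initial radius
        else:
            radius *= 0.5                              # reject; shrink and refit
    return x

x_star = sao_minimize(0.0)
```

Each accepted step strictly reduces the true objective, which is the convergence safeguard that the trust-region ratio test provides over unmanaged surrogate optimization.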


Reliability Engineering & System Safety | 2011

Mixed Aleatory-Epistemic Uncertainty Quantification with Stochastic Expansions and Optimization-Based Interval Estimation

Michael S. Eldred; Laura Painton Swiler; Gary Tang

Uncertainty quantification (UQ) is the process of determining the effect of input uncertainties on response metrics of interest. These input uncertainties may be characterized as either aleatory uncertainties, which are irreducible variabilities inherent in nature, or epistemic uncertainties, which are reducible uncertainties resulting from a lack of knowledge. When both aleatory and epistemic uncertainties are mixed, it is desirable to maintain a segregation between aleatory and epistemic sources such that it is easy to separate and identify their contributions to the total uncertainty. Current production analyses for mixed UQ employ the use of nested sampling, where each sample taken from epistemic distributions at the outer loop results in an inner loop sampling over the aleatory probability distributions. This paper demonstrates new algorithmic capabilities for mixed UQ in which the analysis procedures are more closely tailored to the requirements of aleatory and epistemic propagation. Through the combination of stochastic expansions for computing statistics and interval optimization for computing bounds, interval-valued probability, second-order probability, and Dempster–Shafer evidence theory approaches to mixed UQ are shown to be more accurate and efficient than previously achievable.
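The nested-sampling baseline that the paper improves on can be sketched directly: an outer sweep over an epistemic parameter and an inner aleatory Monte Carlo loop, yielding an interval on the failure probability rather than a single value. The model and the epistemic interval are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)

def p_fail(mu, n_inner=100_000):
    """Inner aleatory loop: Monte Carlo estimate of P[load > 3] with the
    load X ~ N(mu, 1) and mu fixed by the outer epistemic loop."""
    x = rng.normal(mu, 1.0, size=n_inner)
    return np.mean(x > 3.0)

# Outer epistemic loop: sweep the interval-valued mean mu in [0, 1] and
# report the induced interval on the failure probability
mus = np.linspace(0.0, 1.0, 11)
probs = [p_fail(m) for m in mus]
p_low, p_high = min(probs), max(probs)
```

The paper's contribution is to replace the brute-force inner sampling with stochastic expansions and the outer sweep with interval optimization, so the bounds [p_low, p_high] are obtained far more cheaply and rigorously.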


9th AIAA/ISSMO Symposium on Multidisciplinary Analysis and Optimization | 2002

Formulations for Surrogate-Based Optimization Under Uncertainty

Michael S. Eldred; Anthony A. Giunta; Steven F. Wojtkiewicz; Timothy Trucano

In this paper, several formulations for optimization under uncertainty are presented. In addition to the direct nesting of uncertainty quantification within optimization, formulations are presented for surrogate-based optimization under uncertainty in which the surrogate model appears at the optimization level, at the uncertainty quantification level, or at both levels. These surrogate models encompass both data fit and hierarchical surrogates. The DAKOTA software framework is used to provide the foundation for prototyping and initial benchmarking of these formulations. A critical component is the extension of algorithmic techniques for deterministic surrogate-based optimization to these surrogate-based optimization under uncertainty formulations. This involves the use of sequential trust region-based approaches to manage the extent of the approximations and verify the approximate optima. Two analytic test problems and one engineering problem are solved using the different methodologies in order to compare their relative merits. Results show that the surrogate-based optimization under uncertainty formulations are promising both in reducing the number of function evaluations required and in mitigating the effects of nonsmooth response variations.


11th AIAA/ISSMO Multidisciplinary Analysis and Optimization Conference | 2006

Formulations for Surrogate-Based Optimization with Data Fit, Multifidelity, and Reduced-Order Models

Michael S. Eldred; Daniel M. Dunlavy

Surrogate-based optimization (SBO) methods have become established as effective techniques for engineering design problems through their ability to tame nonsmoothness and reduce computational expense. Possible surrogate modeling techniques include data fits (local, multipoint, or global), multifidelity model hierarchies, and reduced-order models, and each of these types has unique features when employed within SBO. This paper explores a number of SBO algorithmic variations and their effect for different surrogate modeling cases. First, general facilities for constraint management are explored through approximate subproblem formulations (e.g., direct surrogate), constraint relaxation techniques (e.g., homotopy), merit function selections (e.g., augmented Lagrangian), and iterate acceptance logic selections (e.g., filter methods). Second, techniques specialized to particular surrogate types are described. Computational results are presented for sets of algebraic test problems and an engineering design application solved using the DAKOTA software.

Collaboration


Dive into Michael S. Eldred's collaboration.

Top Co-Authors

Laura Painton Swiler (Sandia National Laboratories)
Brian M. Adams (Sandia National Laboratories)
John Davis Jakeman (Sandia National Laboratories)
Anthony A. Giunta (Sandia National Laboratories)
Cosmin Safta (Sandia National Laboratories)
Khachik Sargsyan (Sandia National Laboratories)
Barron J. Bichon (Southwest Research Institute)
Andrew G. Salinger (Sandia National Laboratories)
Bert J. Debusschere (Sandia National Laboratories)