Patricia Diane Hough
Office of Scientific and Technical Information
Publications
Featured research published by Patricia Diane Hough.
SIAM Journal on Scientific Computing | 2001
Patricia Diane Hough; Tamara G. Kolda; Virginia Torczon
We introduce a new asynchronous parallel pattern search (APPS). Parallel pattern search can be quite useful for engineering optimization problems characterized by a small number of variables (say, fifty or fewer) and by objective functions that are expensive to evaluate, such as those defined by complex simulations that can take anywhere from a few seconds to many hours to run. The target platforms for APPS are the loosely coupled parallel systems now widely available. We exploit the algorithmic characteristics of pattern search to design variants that dynamically initiate actions solely in response to messages, rather than routinely cycling through a fixed set of steps. This gives a versatile concurrent strategy that allows us to effectively balance the computational load across all available processors. Further, it allows us to incorporate a high degree of fault tolerance with almost no additional overhead. We demonstrate the effectiveness of a preliminary implementation of APPS on both standard test problems and some engineering optimization problems.
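To make the message-driven idea concrete, here is a minimal Python sketch of an asynchronous pattern search: each poll point is acted on as soon as its evaluation returns, with no batch barrier. The objective function, the thread-pool workers, and the step-control bookkeeping are simplified illustrations only, not the APPS implementation described in the paper.

```python
# Minimal asynchronous pattern-search sketch (illustrative only, not the APPS code).
import concurrent.futures as cf
import itertools
import numpy as np

def objective(x):
    # Hypothetical stand-in for an expensive simulation (Rosenbrock function).
    return float((1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2)

def async_pattern_search(x0, step=0.5, tol=1e-3, max_evals=500, workers=4):
    best_x = np.asarray(x0, dtype=float)
    best_f = objective(best_x)
    evals, misses = 1, 0
    # Poll in the +/- coordinate directions (2n directions for n variables).
    dirs = [s * e for e, s in itertools.product(np.eye(len(best_x)), (1.0, -1.0))]

    with cf.ThreadPoolExecutor(max_workers=workers) as pool:
        def poll(center, s):
            # Submit all poll points around `center`; remember where each came from.
            return {pool.submit(objective, center + s * d): (center.copy(), s, d) for d in dirs}

        pending = poll(best_x, step)
        while pending and step > tol and evals < max_evals:
            # React to whichever evaluation finishes first -- no batch barrier.
            done, _ = cf.wait(pending, return_when=cf.FIRST_COMPLETED)
            for fut in done:
                center, s, d = pending.pop(fut)
                f = fut.result()
                evals += 1
                if f < best_f:
                    # Success: move to the better point and poll around it immediately.
                    best_x, best_f, misses = center + s * d, f, 0
                    pending.update(poll(best_x, step))
                elif np.array_equal(center, best_x) and s == step:
                    # Failure around the current best point; stale results are ignored.
                    misses += 1
                    if misses >= len(dirs):
                        step *= 0.5          # all directions failed: contract and re-poll
                        misses = 0
                        pending.update(poll(best_x, step))
    return best_x, best_f

if __name__ == "__main__":
    x, f = async_pattern_search([-1.2, 1.0])
    print("best point:", x, "objective:", f)
```

With a process pool or distributed workers standing in for separate machines, the same structure lets fast processors keep working while slow or lost evaluations are simply ignored, which is the load-balancing and fault-tolerance point the abstract makes.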
Archive | 2014
Brian M. Adams; Mohamed S. Ebeida; Michael S. Eldred; John Davis Jakeman; Laura Painton Swiler; John Adam Stephens; Dena M. Vigil; Timothy Michael Wildey; William J. Bohnhoff; John P. Eddy; Kenneth T. Hu; Keith R. Dalbey; Lara E Bauman; Patricia Diane Hough
The Dakota (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a theoretical manual for selected algorithms implemented within the Dakota software. It is not intended as a comprehensive theoretical treatment, since a number of existing texts cover general optimization theory, statistical analysis, and other introductory topics. Rather, this manual is intended to summarize a set of Dakota-related research publications in the areas of surrogate-based optimization, uncertainty quantification, and optimization under uncertainty that provide the foundation for many of Dakota’s iterative analysis capabilities. Dakota Version 6.1 Theory Manual generated on November 7, 2014
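The abstract describes Dakota as a bridge between black-box simulation codes and iterative analysis methods. The sketch below, in plain Python rather than Dakota's input syntax or API, illustrates that coupling pattern with the simplest such method: Monte Carlo sampling for forward uncertainty quantification over a hypothetical stand-in simulation.

```python
# Illustration of the black-box coupling pattern described above (not Dakota's API):
# an iterative analysis method that treats the simulation code as a mapping from
# input parameters to response functions.
import numpy as np

def simulation(params):
    # Hypothetical stand-in for an expensive simulation code; in practice this
    # would launch a solver and parse its output.
    x1, x2 = params
    return x1 * np.sin(x2) + 0.1 * x1 * x1

def monte_carlo_uq(model, dists, n_samples=2000, seed=0):
    """Propagate input uncertainty through `model` and summarize the response."""
    rng = np.random.default_rng(seed)
    samples = np.column_stack([d(rng, n_samples) for d in dists])
    responses = np.array([model(s) for s in samples])
    return responses.mean(), responses.std(ddof=1)

if __name__ == "__main__":
    # Assumed input uncertainties: x1 ~ Normal(1, 0.1), x2 ~ Uniform(0, pi).
    dists = [lambda rng, n: rng.normal(1.0, 0.1, n),
             lambda rng, n: rng.uniform(0.0, np.pi, n)]
    mean, std = monte_carlo_uq(simulation, dists)
    print(f"response mean = {mean:.4f}, std = {std:.4f}")
```

In Dakota itself such a study is configured declaratively through an input file rather than Python code, with the toolkit managing the simulation launches and the choice of sampling, reliability, or stochastic expansion method.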
2000
Patricia Diane Hough; Tamara G. Kolda; Virginia Torczon
Parallel pattern search (PPS) can be quite useful for engineering optimization problems characterized by a small number of variables (say 10--50) and by expensive objective function evaluations, such as complex simulations that take from minutes to hours to run. However, PPS, which was originally designed for execution on homogeneous and tightly coupled parallel machines, is not well suited to the more heterogeneous, loosely coupled, and even fault-prone parallel systems available today. Specifically, PPS is hindered by synchronization penalties and cannot recover in the event of a failure. The authors introduce a new asynchronous and fault-tolerant parallel pattern search (APPS) method and demonstrate its effectiveness on both simple test problems and some engineering optimization problems.
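The fault-tolerance claim can be pictured with a small helper (hypothetical, not the authors' code) that plugs into an asynchronous search like the sketch shown earlier: a crashed or unreachable evaluation is treated as a poll point that did not improve, so the search never blocks on it.

```python
# Sketch of the fault-tolerance idea: a failed or timed-out worker looks like
# an unsuccessful poll direction instead of stalling or aborting the search.
import concurrent.futures as cf
import math

def safe_result(future, timeout=600.0):
    """Objective value from a worker, or +inf if the evaluation failed or timed out."""
    try:
        return future.result(timeout=timeout)
    except cf.TimeoutError:
        return math.inf   # worker too slow or unreachable: no improvement here
    except Exception:
        return math.inf   # simulation crashed: likewise just a failed poll point
```

A fuller treatment would also resubmit the lost evaluation to another processor, which is roughly the recovery behavior the abstract contrasts with synchronous PPS.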
Archive | 2006
Joshua D. Griffin; Michael S. Eldred; Monica L. Martinez-Canales; Jean-Paul Watson; Tamara G. Kolda; Anthony A. Giunta; Brian M. Adams; Laura Painton Swiler; Pamela J. Williams; Patricia Diane Hough; Daniel M. Dunlavy; John P. Eddy; William Eugene Hart; Shannon L. Brown
The Dakota toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a user’s manual for the Dakota software and provides capability overviews and procedures for software execution, as well as a variety of example studies. Dakota Version 6.11 User’s Manual generated on November 7, 2019.
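For the software-execution side the manual covers, the usual coupling is a small "analysis driver" script: the toolkit writes a parameters file, the driver runs the simulation, and the driver writes a results file back. The sketch below is a hypothetical driver using a simplified "value tag" file layout for illustration; the real Dakota parameters and results file formats carry additional fields and are documented in the manual itself.

```python
#!/usr/bin/env python3
# Hypothetical external analysis driver: read parameters, run the simulation,
# write results. The two-column "value tag" layout is an assumed simplification.
import sys

def read_parameters(path):
    """Parse 'value tag' pairs into a {tag: value} dict (simplified layout)."""
    params = {}
    with open(path) as f:
        for line in f:
            fields = line.split()
            if len(fields) == 2:
                try:
                    params[fields[1]] = float(fields[0])
                except ValueError:
                    pass  # skip any non-numeric metadata lines
    return params

def run_simulation(params):
    # Stand-in for launching the real simulation code.
    x1, x2 = params["x1"], params["x2"]
    return (1 - x1) ** 2 + 100 * (x2 - x1 ** 2) ** 2

def write_results(path, value):
    with open(path, "w") as f:
        f.write(f"{value:.12e} obj_fn\n")

if __name__ == "__main__":
    params_file, results_file = sys.argv[1], sys.argv[2]
    write_results(results_file, run_simulation(read_parameters(params_file)))
```

The toolkit would invoke such a driver once per requested evaluation; the driver never needs to know which method (optimization, sampling, or a parameter study) is asking for the run.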
Archive | 2000
Tamara G. Kolda; Patricia Diane Hough
Archive | 2016
Brian M. Adams; Patricia Diane Hough; Lara E Bauman
Archive | 2016
Brian M. Adams; Patricia Diane Hough; John Adam Stephens
Archive | 2015
Brian M. Adams; Patricia Diane Hough; Laura Painton Swiler
Archive | 2013
Genetha Anne Gray; Patricia Diane Hough; Cesar Augusto Silva Monroy; Jean-Paul Watson; Robert B. Gramacy
Archive | 2011
Patricia Diane Hough; Laura Painton Swiler; Herbie Lee; Curtis B. Storlie