Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Jon C. Helton is active.

Publication


Featured research published by Jon C. Helton.


Reliability Engineering & System Safety | 2006

Survey of sampling-based methods for uncertainty and sensitivity analysis

Jon C. Helton; Jay D. Johnson; Cédric J. Sallaberry; Curtis B. Storlie

Sampling-based methods for uncertainty and sensitivity analysis are reviewed. The following topics are considered: (1) Definition of probability distributions to characterize epistemic uncertainty in analysis inputs, (2) Generation of samples from uncertain analysis inputs, (3) Propagation of sampled inputs through an analysis, (4) Presentation of uncertainty analysis results, and (5) Determination of sensitivity analysis results. Special attention is given to the determination of sensitivity analysis results, with brief descriptions and illustrations given for the following procedures/techniques: examination of scatterplots, correlation analysis, regression analysis, partial correlation analysis, rank transformations, statistical tests for patterns based on gridding, entropy tests for patterns based on gridding, nonparametric regression analysis, squared rank differences/rank correlation coefficient test, two dimensional Kolmogorov-Smirnov test, tests for patterns based on distance measures, top down coefficient of concordance, and variance decomposition.
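The sample-generation step (2) is commonly carried out with Latin hypercube sampling. The sketch below (with made-up input ranges and a stand-in model, not ones from the paper) generates a Latin hypercube sample and walks through steps (1)-(3):

```python
import numpy as np

rng = np.random.default_rng(0)

def latin_hypercube(n, d, rng):
    """One point per equal-probability stratum in each of d dimensions,
    with the strata randomly paired across dimensions."""
    strata = rng.permuted(np.tile(np.arange(n), (d, 1)), axis=1).T
    return (strata + rng.uniform(size=(n, d))) / n

n = 100
u = latin_hypercube(n, 2, rng)

# Step (1): illustrative uniform ranges for two uncertain inputs
x1 = 1.0 + 4.0 * u[:, 0]           # X1 ~ U(1, 5)   (made-up range)
x2 = 0.5 * u[:, 1]                 # X2 ~ U(0, 0.5) (made-up range)

# Step (3): propagate the sample through a stand-in model
y = x1 * np.exp(-x2)
```

Each input dimension receives exactly one point per equal-probability stratum, which is the structural property that gives Latin hypercube sampling its stability advantage over simple random sampling.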


Reliability Engineering & System Safety | 1993

Uncertainty and sensitivity analysis techniques for use in performance assessment for radioactive waste disposal

Jon C. Helton

Abstract Uncertainty and sensitivity analysis techniques for use in performance assessments for radioactive waste disposal are reviewed. Summaries are given for the following techniques: differential analysis, Monte Carlo analysis, response surface methodology, and Fourier amplitude sensitivity test. Of these techniques, Monte Carlo analysis is felt to be the most widely applicable for use in performance assessment. Monte Carlo analysis involves five steps: (1) selection of a range and distribution for each input variable; (2) generation of a sample from the input variables; (3) propagation of the sample through the model under consideration; (4) performance of uncertainty analysis; and (5) performance of sensitivity analysis. These steps are discussed and illustrated with an analysis performed as part of a preliminary performance assessment for the Waste Isolation Pilot Plant (WIPP).
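The five Monte Carlo steps can be sketched end to end; the two-input model below is an invented stand-in, not anything from the WIPP assessment:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000

# Steps 1-2: ranges/distributions for the inputs (illustrative) and a sample
a = rng.uniform(0.5, 2.0, n)           # hypothetical dominant input
b = rng.uniform(0.0, 1.0, n)           # hypothetical minor input

# Step 3: propagate the sample through the model under consideration
y = a ** 2 + 0.1 * b                   # stand-in for the real model

# Step 4: uncertainty analysis -- summarize the output distribution
q05, q95 = np.quantile(y, [0.05, 0.95])

# Step 5: sensitivity analysis -- rank (Spearman) correlations with y
def ranks(v):
    return np.argsort(np.argsort(v))

rcc_a = np.corrcoef(ranks(a), ranks(y))[0, 1]
rcc_b = np.corrcoef(ranks(b), ranks(y))[0, 1]
```

By construction the output is dominated by `a`, and the rank correlations recover exactly that ordering of input importance.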


Reliability Engineering & System Safety | 2004

Challenge problems: uncertainty in system response given uncertain parameters

William L. Oberkampf; Jon C. Helton; Cliff Joslyn; Steven F. Wojtkiewicz; Scott Ferson

Abstract The risk assessment community has begun to make a clear distinction between aleatory and epistemic uncertainty in theory and in practice. Aleatory uncertainty is also referred to in the literature as variability, irreducible uncertainty, inherent uncertainty, and stochastic uncertainty. Epistemic uncertainty is also termed reducible uncertainty, subjective uncertainty, and state-of-knowledge uncertainty. Methods to efficiently represent, aggregate, and propagate different types of uncertainty through computational models are clearly of vital importance. The most widely known and developed methods are available within the mathematics of probability theory, whether frequentist or subjectivist. Newer mathematical approaches, which extend or otherwise depart from probability theory, are also available, and are sometimes referred to as generalized information theory (GIT). For example, possibility theory, fuzzy set theory, and evidence theory are three components of GIT. To try to develop a better understanding of the relative advantages and disadvantages of traditional and newer methods and encourage a dialog between the risk assessment, reliability engineering, and GIT communities, a workshop was held. To focus discussion and debate at the workshop, a set of prototype problems, generally referred to as challenge problems, was constructed. The challenge problems concentrate on the representation, aggregation, and propagation of epistemic uncertainty and mixtures of epistemic and aleatory uncertainty through two simple model systems. This paper describes the challenge problems and gives numerical values for the different input parameters so that results from different investigators can be directly compared.


Journal of Statistical Computation and Simulation | 1997

Uncertainty and sensitivity analysis in the presence of stochastic and subjective uncertainty

Jon C. Helton

Uncertainty and sensitivity analyses for systems that involve both stochastic (i.e., aleatory) and subjective (i.e., epistemic) uncertainty are discussed. In such analyses, the dependent variable is usually a complementary cumulative distribution function (CCDF) that arises from stochastic uncertainty; uncertainty analysis involves the determination of a distribution of CCDFs that results from subjective uncertainty, and sensitivity analysis involves the determination of the effects of subjective uncertainty in individual variables on this distribution of CCDFs. Uncertainty analysis is presented as an integration problem involving probability spaces for stochastic and subjective uncertainty. Approximation procedures for the underlying integrals are described that provide an assessment of the effects of stochastic uncertainty, an assessment of the effects of subjective uncertainty, and a basis for performing sensitivity studies. Extensive use is made of Latin hypercube sampling, importance sampling and reg...
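The two probability spaces lead naturally to a nested (double-loop) sampling scheme: an outer loop over subjective uncertainty, an inner loop over stochastic uncertainty, each outer realization yielding one CCDF. A sketch, with a hypothetical exponentially distributed output and a subjectively uncertain rate parameter standing in for a real analysis:

```python
import numpy as np

rng = np.random.default_rng(2)
n_epi, n_ale = 50, 400     # outer (subjective) and inner (stochastic) sizes

# Subjective uncertainty: a rate parameter known only to an interval
lam = rng.uniform(0.5, 2.0, n_epi)     # illustrative interval

y0 = 1.0                               # point at which to read each CCDF
ccdf_at_y0 = np.empty(n_epi)
for i in range(n_epi):
    # Stochastic uncertainty for one fixed subjective realization
    y = rng.exponential(1.0 / lam[i], n_ale)
    ccdf_at_y0[i] = (y > y0).mean()    # one point on that realization's CCDF

# The n_epi values form the subjective distribution of CCDF values at y0
lo, hi = np.quantile(ccdf_at_y0, [0.05, 0.95])
```

The spread between `lo` and `hi` is a display of subjective uncertainty about a quantity that is itself a probability arising from stochastic uncertainty.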


Reliability Engineering & System Safety | 2004

An exploration of alternative approaches to the representation of uncertainty in model predictions

Jon C. Helton; Jay D. Johnson; William L. Oberkampf

Abstract Several simple test problems are used to explore the following approaches to the representation of the uncertainty in model predictions that derives from uncertainty in model inputs: probability theory, evidence theory, possibility theory, and interval analysis. Each of the test problems has rather diffuse characterizations of the uncertainty in model inputs obtained from one or more equally credible sources. These given uncertainty characterizations are translated into the mathematical structure associated with each of the indicated approaches to the representation of uncertainty and then propagated through the model with Monte Carlo techniques to obtain the corresponding representation of the uncertainty in one or more model predictions. The different approaches to the representation of uncertainty can lead to very different appearing representations of the uncertainty in model predictions even though the starting information is exactly the same for each approach. To avoid misunderstandings and, potentially, bad decisions, these representations must be interpreted in the context of the theory/procedure from which they derive.
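A toy contrast between two of the four approaches, interval analysis and probability theory, starting from identical information (a hypothetical interval specification and a stand-in model, not one of the paper's test problems):

```python
import numpy as np

rng = np.random.default_rng(3)

def model(a, b):
    return (a + b) ** 2        # stand-in model, monotonic in both inputs

# The only given information: a lies in [0.1, 1.0] and b in [0.0, 1.0]
A, B = (0.1, 1.0), (0.0, 1.0)

# Interval analysis: endpoint propagation bounds the response, no likelihoods
# (valid here because the model is monotonic on these ranges)
corners = [model(a, b) for a in A for b in B]
y_lo, y_hi = min(corners), max(corners)

# Probability theory: a uniform assumption imposed on the same intervals
a = rng.uniform(*A, 20000)
b = rng.uniform(*B, 20000)
p_exceed = (model(a, b) > 3.0).mean()   # probability the response exceeds 3
```

From the same starting information, interval analysis reports only that exceedance of 3 is possible, while the uniform assumption attaches a small probability to it; as the abstract notes, each answer must be interpreted within the theory that produced it.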


Risk Analysis | 2002

Illustration of Sampling-Based Methods for Uncertainty and Sensitivity Analysis

Jon C. Helton; F. J. Davis

A sequence of linear, monotonic, and nonmonotonic test problems is used to illustrate sampling-based uncertainty and sensitivity analysis procedures. Uncertainty results obtained with replicated random and Latin hypercube samples are compared, with the Latin hypercube samples tending to produce more stable results than the random samples. Sensitivity results obtained with the following procedures and/or measures are illustrated and compared: correlation coefficients (CCs), rank correlation coefficients (RCCs), common means (CMNs), common locations (CLs), common medians (CMDs), statistical independence (SI), standardized regression coefficients (SRCs), partial correlation coefficients (PCCs), standardized rank regression coefficients (SRRCs), partial rank correlation coefficients (PRCCs), stepwise regression analysis with raw and rank-transformed data, and examination of scatter plots. The effectiveness of a given procedure and/or measure depends on the characteristics of the individual test problems, with (1) linear measures (i.e., CCs, PCCs, SRCs) performing well on the linear test problems, (2) measures based on rank transforms (i.e., RCCs, PRCCs, SRRCs) performing well on the monotonic test problems, and (3) measures predicated on searches for nonrandom patterns (i.e., CMNs, CLs, CMDs, SI) performing well on the nonmonotonic test problems.
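The three regimes can be reproduced with a toy experiment (illustrative test functions, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.uniform(-1, 1, 500)

def cc(u, v):                      # correlation coefficient
    return np.corrcoef(u, v)[0, 1]

def rcc(u, v):                     # rank correlation coefficient
    r = lambda w: np.argsort(np.argsort(w))
    return cc(r(u), r(v))

y_linear = 2.0 * x                 # linear: CC and RCC both ~1
y_monotonic = np.exp(5.0 * x)      # monotonic: RCC = 1, CC degraded by skew
y_nonmonotonic = x ** 2            # nonmonotonic: both near 0; pattern-based
                                   # measures (e.g., CMNs, SI) are needed
```

The rank transform recovers the monotonic relationship perfectly, but no correlation-type measure detects the nonmonotonic one, which is exactly the gap the pattern-search procedures fill.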


Reliability Engineering & System Safety | 2009

Implementation and evaluation of nonparametric regression procedures for sensitivity analysis of computationally demanding models

Curtis B. Storlie; Laura Painton Swiler; Jon C. Helton; Cédric J. Sallaberry

The analysis of many physical and engineering problems involves running complex computational models (simulation models, computer codes). With problems of this type, it is important to understand the relationships between the input variables (whose values are often imprecisely known) and the output. The goal of sensitivity analysis (SA) is to study this relationship and identify the most significant factors or variables affecting the results of the model. In this presentation, an improvement on existing methods for SA of complex computer models is described for use when the model is too computationally expensive for a standard Monte-Carlo analysis. In these situations, a meta-model or surrogate model can be used to estimate the necessary sensitivity index for each input. A sensitivity index is a measure of the variance in the response that is due to the uncertainty in an input. Most existing approaches to this problem either do not work well with a large number of input variables and/or they ignore the error involved in estimating a sensitivity index. Here, a new approach to sensitivity index estimation using meta-models and bootstrap confidence intervals is described that provides solutions to these drawbacks. Further, an efficient yet effective approach to incorporate this methodology into an actual SA is presented. Several simulated and real examples illustrate the utility of this approach. This framework can be extended to uncertainty analysis as well.
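A compact sketch of the idea, using a cubic polynomial surrogate in place of the paper's nonparametric meta-models and an invented two-input test function:

```python
import numpy as np

rng = np.random.default_rng(5)

# Suppose the expensive code could only be run 60 times (illustrative)
n = 60
x = rng.uniform(-1, 1, (n, 2))
y = np.sin(x[:, 0]) + 0.3 * x[:, 1]      # stand-in for the expensive model

def sens_index(xi, y):
    """Share of output variance captured by a cubic surrogate in one input
    (a simple stand-in for the paper's nonparametric meta-models)."""
    fit = np.polynomial.Polynomial.fit(xi, y, 3)
    return 1.0 - (y - fit(xi)).var() / y.var()

# Bootstrap confidence interval for the sensitivity index of input 0
boot = []
for _ in range(500):
    i = rng.integers(0, n, n)
    boot.append(sens_index(x[i, 0], y[i]))
ci_lo, ci_hi = np.quantile(boot, [0.025, 0.975])
```

The bootstrap interval makes the estimation error in the sensitivity index explicit instead of reporting a bare point estimate.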


19th AIAA Applied Aerodynamics Conference | 2001

Mathematical representation of uncertainty

William L. Oberkampf; Jon C. Helton; Kari Sentz

As widely done in the risk assessment community, a distinction is made between aleatory (random) and epistemic (subjective) uncertainty in the modeling and simulation process. The nature of epistemic uncertainty is discussed, including (1) occurrence in parameters contained in mathematical models of a system and its environment, (2) limited knowledge or understanding of a physical process or interactions of processes in a system, and (3) limited knowledge for the estimation of the likelihood of event scenarios of a system. To clarify the options available for representation of epistemic uncertainty, an overview is presented of a hierarchy of theories of uncertainty. Modern theories of uncertainty can represent much weaker statements of knowledge and more diverse types of uncertainty than traditional probability theory. A promising new theory, evidence (Dempster-Shafer) theory, is discussed and applied to a simple system given by an algebraic equation with two uncertain parameters. Multiple sources of information are provided for each parameter, but each source only provides an interval value for each parameter. The uncertainty in the system response is estimated using probability theory and evidence theory. The resultant solutions are compared with regard to their assessment of the likelihood that the system response exceeds a specified failure level. In this example, a traditional application of probability theory results in a significantly lower estimate of risk of failure as compared to evidence theory. Strengths and weaknesses of evidence theory are discussed, and several important open issues are identified that must be addressed before evidence theory can be used successfully in engineering applications.
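In the same spirit, a minimal evidence-theory calculation (hypothetical intervals, masses, model, and failure level, not the paper's example):

```python
# Evidence-theory sketch: two equally credible sources each give only an
# interval for a parameter; the intervals become focal elements with equal
# basic probability assignment (all values here are hypothetical).
sources = [(0.1, 0.5), (0.3, 0.8)]
m = 1.0 / len(sources)                 # mass per focal element

def response(a):
    return 10.0 * a                    # stand-in monotonic algebraic model

fail_level = 4.0                       # hypothetical failure threshold

# Belief: mass of focal elements whose entire image exceeds the level;
# plausibility: mass of those whose image can exceed it (monotone model,
# so the interval endpoints suffice)
bel = sum(m for lo, hi in sources if response(lo) > fail_level)
pl = sum(m for lo, hi in sources if response(hi) > fail_level)
```

Evidence theory reports the pair [Bel, Pl] for the failure event, whereas a probabilistic treatment of the same intervals collapses it to a single number inside that bracket, which is how the two approaches come to rate the risk of failure so differently.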


Reliability Engineering & System Safety | 2005

A comparison of uncertainty and sensitivity analysis results obtained with random and Latin hypercube sampling

Jon C. Helton; Freddie J. Davis; Jay D. Johnson

Uncertainty and sensitivity analysis results obtained with random and Latin hypercube sampling are compared. The comparison uses results from a model for two-phase fluid flow obtained with three independent random samples of size 100 each and three independent Latin hypercube samples (LHSs) of size 100 each. Uncertainty and sensitivity analysis results with the two sampling procedures are similar and stable across the three replicated samples. Poor performance of regression-based sensitivity analysis procedures for some analysis outcomes results more from the inappropriateness of the procedure for the nonlinear relationships between model input and model results than from an inadequate sample size. Kendall's coefficient of concordance (KCC) and the top down coefficient of concordance (TDCC) are used to assess the stability of sensitivity analysis results across replicated samples, with the TDCC providing a more informative measure of analysis stability than KCC. A new sensitivity analysis procedure based on replicated samples and the TDCC is introduced.
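The stability difference between the two sampling schemes is easy to demonstrate in one dimension (a smooth stand-in integrand, not the two-phase flow model):

```python
import numpy as np

rng = np.random.default_rng(6)

def model(u):
    return np.exp(u)                   # smooth stand-in on [0, 1]

def lhs_1d(n, rng):                    # one point per equal-probability stratum
    return (rng.permutation(n) + rng.uniform(size=n)) / n

n, reps = 100, 200
means_rand = [model(rng.uniform(size=n)).mean() for _ in range(reps)]
means_lhs = [model(lhs_1d(n, rng)).mean() for _ in range(reps)]

# Stratification makes the LHS estimates far more stable across replicates
sd_rand, sd_lhs = np.std(means_rand), np.std(means_lhs)
```

Both estimators target E[model(U)] without bias; the replicated-sample spread is what distinguishes them, which is the same role the replicates play in the paper's TDCC-based stability assessment.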


Reliability Engineering & System Safety | 2008

Multiple predictor smoothing methods for sensitivity analysis: Description of techniques

Curtis B. Storlie; Jon C. Helton

The use of multiple predictor smoothing methods in sampling-based sensitivity analyses of complex models is investigated. Specifically, sensitivity analysis procedures based on smoothing methods employing the stepwise application of the following nonparametric regression techniques are described: (i) locally weighted regression (LOESS), (ii) additive models, (iii) projection pursuit regression, and (iv) recursive partitioning regression. Then, in the second and concluding part of this presentation, the indicated procedures are illustrated with both simple test problems and results from a performance assessment for a radioactive waste disposal facility (i.e., the Waste Isolation Pilot Plant). As shown by the example illustrations, the use of smoothing procedures based on nonparametric regression techniques can yield more informative sensitivity analysis results than can be obtained with more traditional sensitivity analysis procedures based on linear regression, rank regression or quadratic regression when nonlinear relationships between model inputs and model predictions are present.
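Technique (i) can be sketched compactly; the following is a toy locally weighted (LOESS-style) smoother with tricube weights, not the implementation used in the paper:

```python
import numpy as np

rng = np.random.default_rng(7)

def loess_point(x0, x, y, frac=0.3):
    """Locally weighted linear fit at x0 with LOESS's tricube weights
    (a compact sketch, not a full LOESS implementation)."""
    k = max(2, int(frac * len(x)))
    d = np.abs(x - x0)
    h = np.sort(d)[k - 1]              # span: distance to k-th neighbor
    w = np.clip(1.0 - (d / h) ** 3, 0.0, None) ** 3
    sw = np.sqrt(w)                    # weighted least squares via rescaling
    X = np.column_stack([np.ones_like(x), x - x0])
    beta, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)
    return beta[0]                     # intercept = fitted value at x0

x = np.sort(rng.uniform(0.0, np.pi, 200))
y = np.sin(x) + rng.normal(0.0, 0.1, 200)   # noisy stand-in relationship
smooth = np.array([loess_point(x0, x, y) for x0 in x])
```

Because the fit is local and linear, the smoother tracks nonlinear input-output relationships that a single global linear or rank regression would distort, which is the advantage the abstract describes.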

Collaboration


Dive into Jon C. Helton's collaborations.

Top Co-Authors

Jay D. Johnson
Science Applications International Corporation

Clifford W. Hansen
Sandia National Laboratories

William L. Oberkampf
Sandia National Laboratories

A.W. Shiver
Sandia National Laboratories

Melvin G. Marietta
Sandia National Laboratories

Peter N. Swift
Sandia National Laboratories

Curtis B. Storlie
Los Alamos National Laboratory

Ronald L. Iman
Sandia National Laboratories

Hong-Nian Jow
Sandia National Laboratories