Sharon M. DeLand
Sandia National Laboratories
Publication
Featured research published by Sharon M. DeLand.
Reliability Engineering & System Safety | 2002
William L. Oberkampf; Sharon M. DeLand; Brian Milne Rutherford; Kathleen V. Diegert; Kenneth F. Alvin
This article develops a general framework for identifying error and uncertainty in computational simulations that deal with the numerical solution of a set of partial differential equations (PDEs). A comprehensive, new view of the general phases of modeling and simulation is proposed, consisting of the following phases: conceptual modeling of the physical system, mathematical modeling of the conceptual model, discretization and algorithm selection for the mathematical model, computer programming of the discrete model, numerical solution of the computer program model, and representation of the numerical solution. Our view incorporates the modeling and simulation phases that are recognized in the systems engineering and operations research communities, but it adds phases that are specific to the numerical solution of PDEs. In each of these phases, general sources of uncertainty, both aleatory and epistemic, and error are identified. Our general framework is applicable to any numerical discretization procedure for solving ODEs or PDEs. To demonstrate this framework, we describe a system-level example: the flight of an unguided, rocket-boosted, aircraft-launched missile. This example is discussed in detail at each of the six phases of modeling and simulation. Two alternative models of the flight dynamics are considered, along with aleatory uncertainty in the initial mass of the missile and epistemic uncertainty in the thrust of the rocket motor. We also investigate the interaction of modeling uncertainties and numerical integration error in the solution of the ordinary differential equations for the flight dynamics.
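The distinction between aleatory and epistemic uncertainty in the missile example lends itself to a small numerical illustration. The Python sketch below is not taken from the article; the one-dimensional boost model, the normal distribution for initial mass, and the three candidate thrust values are assumptions chosen only to show how sampled (aleatory) uncertainty and a set of alternatives (epistemic) are propagated differently through a numerical solution.

```python
# Illustrative sketch (assumed model and parameters, not the article's code):
# aleatory uncertainty in initial mass is sampled; epistemic uncertainty in
# motor thrust is carried as a set of alternatives rather than averaged away.
import numpy as np

def boost_velocity(m0, thrust, burn_time=2.0, mdot=5.0, g=9.81, dt=1e-3):
    """Forward-Euler integration of dv/dt = thrust/m - g over the boost phase."""
    v = np.zeros_like(m0)
    m = np.array(m0, dtype=float)
    for _ in range(int(burn_time / dt)):
        v += (thrust / m - g) * dt
        m -= mdot * dt  # propellant burned at a constant rate
    return v

rng = np.random.default_rng(0)
initial_mass = rng.normal(loc=100.0, scale=2.0, size=2000)  # aleatory: sampled distribution
thrust_alternatives = [4800.0, 5000.0, 5200.0]              # epistemic: set of possible values

# Report one burnout-velocity distribution per thrust alternative instead of
# pooling all samples into a single distribution.
for thrust in thrust_alternatives:
    v = boost_velocity(initial_mass, thrust)
    print(f"thrust {thrust:.0f} N: burnout velocity {v.mean():.1f} +/- {v.std():.1f} m/s")
```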
40th Structures, Structural Dynamics, and Materials Conference and Exhibit | 1999
William L. Oberkampf; Sharon M. DeLand; Brian M. Rutherford; Kathleen V. Diegert; Kenneth F. Alvin
This paper develops a general methodology for estimating the total uncertainty in computational simulations that deal with the numerical solution of a system of partial differential equations. A comprehensive, new view of the general phases of modeling and simulation is proposed, consisting of the following phases: conceptual modeling of the physical system, mathematical modeling of the conceptual model, discretization and algorithm selection for the mathematical model, computer programming of the discrete model, numerical solution of the computer program model, and representation of the numerical solution. In each of these phases, general sources of variability, uncertainty, and error are identified. Our general methodology is applicable to any discretization procedure for solving ordinary or partial differential equations. To demonstrate this methodology, we describe a system-level analysis of an unguided, rocket-boosted, aircraft-launched missile. In the conceptual modeling phase, a wide variety of analysis options are considered, but only one branch of the analysis is computed: rigid-body flight dynamics. We choose two parameters as nondeterministic elements of the system: one has variability that is treated probabilistically and one has uncertainty that is represented as a set of possible alternatives. To illustrate mathematical modeling uncertainty, we pursue two models with differing levels of physics: a six-degree-of-freedom and a three-degree-of-freedom model. We also examine numerical solution error in the analysis, which is ubiquitous in computational simulations.

Historically, the primary method of evaluating the performance of a proposed system design has been to build the design and then test it in the use environment. This testing process is often iterative, as design flaws are sequentially discovered and corrected. The number of design-test iterations has been reduced with the advent of computer simulation through numerical solution of the mathematical equations describing the system behavior. Computational results can identify some flaws and avoid the difficulty or safety issues involved in conducting certain types of physical tests. Examples include the entry of a space probe into another planet's atmosphere, structural failure of a full-scale containment vessel of a nuclear power plant, failure of a bridge during an earthquake, and exposure of a nuclear weapon to certain types of accident environments. Modeling and simulation are valuable tools in assessing the survivability and vulnerability of complex systems to natural, abnormal, or hostile events. However, there still remains the need to assess the accuracy of simulations by comparing computational predictions with experimental test data through the process known as validation of computational simulations. Experimental validation, however, continues to increase in the cost and time required to conduct the tests. For these reasons, modeling and simulation must take increasing responsibility for the safety, performance, and reliability of many high-consequence systems.
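The numerical solution error examined in this analysis can be estimated without reference to an exact solution by comparing runs at two step sizes. The sketch below is an illustrative assumption, not the paper's six-degree-of-freedom code: a point-mass post-burnout trajectory integrated with forward Euler, where the difference between coarse and fine solutions approximates the discretization error of the fine run for a first-order method.

```python
# Illustrative sketch (assumed point-mass model): separating discretization
# error from parameter uncertainty by comparing two integration step sizes.
import math

def integrate_range(v0=80.0, angle_deg=45.0, g=9.81, dt=0.01):
    """Point-mass ballistic flight after burnout; returns downrange distance."""
    vx = v0 * math.cos(math.radians(angle_deg))
    vy = v0 * math.sin(math.radians(angle_deg))
    x = y = 0.0
    while y >= 0.0:
        x += vx * dt
        y += vy * dt
        vy -= g * dt
    return x

coarse = integrate_range(dt=0.01)
fine = integrate_range(dt=0.005)
# For a first-order method, halving dt roughly halves the error, so the
# difference between the two solutions approximates the error of the fine run.
error_estimate = abs(fine - coarse)
print(f"range: {fine:.2f} m, estimated discretization error: {error_estimate:.3f} m")
```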
ieee international conference on technologies for homeland security | 2007
Jeanne M. Fair; Rene J. LeClaire; Michael L. Wilson; Alan L. Turk; Sharon M. DeLand; Dennis R. Powell; Perry Klare; Mary Ewers; Lori R. Dauelsberg; David Izraelevitz
Decision makers, faced with highly complex alternatives for protecting our nation's critical infrastructures, must understand the consequences of policy options before they enact solutions to prevent and mitigate disasters. An effective way to examine these tradeoffs is to use a computer simulation that integrates high-level representations of each infrastructure, their interdependencies, and their reactions to a variety of potential disruptions. To address this need, the Critical Infrastructure Protection Decision Support System (CIPDSS) project, funded by the Department of Homeland Security Science and Technology Directorate (DHS S&T), has developed a decision support tool that provides insights to help decision makers make risk-informed decisions. With the addition of a disease progression simulation, the CIPDSS tool has a unique ability to provide a high-level, integrated analysis of a pandemic influenza outbreak while representing the impact on critical infrastructures. This simulation models the time-dependent evolution of the disease and can be calibrated to prior data or to other higher-fidelity models as appropriate. Mitigation options, such as the use of antivirals and vaccines for prophylaxis, treatment, or some combination of the two, as well as quarantine options, can be assessed. Special attention is given to impacts on the population through sickness, targeted quarantine, or fear-based self-isolation, and to the resulting impacts on critical infrastructure operations.
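For readers unfamiliar with this class of model, the following is a minimal SEIR-style sketch of a time-dependent disease progression with a single aggregate mitigation factor. The compartment structure, parameter values, and mitigation levels are assumptions made for illustration only and do not represent the CIPDSS implementation.

```python
# Minimal SEIR sketch (illustrative assumptions, not the CIPDSS model):
# `mitigation` lumps antivirals, vaccines, and quarantine into one factor
# that scales transmission down.
def seir(days=500, n=3.0e8, r0=2.0, incubation_days=2.0, infectious_days=4.0,
         mitigation=0.0, i0=1000.0, dt=0.1):
    beta = (1.0 - mitigation) * r0 / infectious_days
    s, e, i, r = n - i0, 0.0, i0, 0.0
    peak_infectious = 0.0
    for _ in range(int(days / dt)):
        new_exposed = beta * s * i / n * dt
        new_infectious = e / incubation_days * dt
        new_recovered = i / infectious_days * dt
        s -= new_exposed
        e += new_exposed - new_infectious
        i += new_infectious - new_recovered
        r += new_recovered
        peak_infectious = max(peak_infectious, i)
    return peak_infectious, r

for mitigation in (0.0, 0.2, 0.4):
    peak, total = seir(mitigation=mitigation)
    print(f"mitigation {mitigation:.0%}: peak infectious {peak:,.0f}, "
          f"total ever infected {total:,.0f}")
```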
International Journal of Risk Assessment and Management | 2012
Jeanne M. Fair; Dennis R. Powell; Rene J. LeClaire; Leslie M. Moore; Michael L. Wilson; Lori R. Dauelsberg; Michael E. Samsa; Sharon M. DeLand; Gary B. Hirsch; Brian Bush
It has become critical to assess the potential range of consequences of a pandemic influenza outbreak, given the uncertainty about its disease characteristics, while investigating the risks and mitigation strategies of vaccines, antivirals, and social distancing measures. Here, we use a simulation model and a rigorous experimental design with sensitivity analysis that incorporates uncertainty in pathogen behaviour and epidemic response to show the extreme variation in the consequences of a potential pandemic outbreak in the USA. Using sensitivity analysis, we found that the most important disease characteristics are the fraction of transmission that occurs prior to symptoms, the reproductive number, and the length of each disease stage. Using data from historical pandemics and accounting for potential viral evolution, we show that response planning may underestimate the pandemic consequences by a factor of two or more.
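A compact way to see why pre-symptomatic transmission and the reproductive number dominate the outcome is a one-at-a-time sensitivity sweep. The sketch below uses a simple final-size relation in which transmission occurring before symptoms cannot be intercepted by case isolation; the parameter ranges and the relation itself are illustrative assumptions, not the paper's simulation model or experimental design.

```python
# One-at-a-time sensitivity sketch (assumed ranges and model, not the paper's):
# attack rate from the final-size relation z = 1 - exp(-R_eff * z).
import math

def attack_rate(r0=2.0, presympt_frac=0.3, isolation_eff=0.5):
    """Pre-symptomatic transmission escapes isolation; the rest is reduced."""
    r_eff = r0 * (presympt_frac + (1.0 - presympt_frac) * (1.0 - isolation_eff))
    z = 0.5
    for _ in range(500):          # fixed-point iteration on the final-size relation
        z = 1.0 - math.exp(-r_eff * z)
    return z

baseline = attack_rate()
ranges = {"r0": (1.5, 2.5), "presympt_frac": (0.1, 0.5), "isolation_eff": (0.3, 0.7)}
for name, (lo, hi) in ranges.items():
    low, high = attack_rate(**{name: lo}), attack_rate(**{name: hi})
    print(f"{name:>14}: attack rate spans {low:.2f} to {high:.2f} "
          f"(baseline {baseline:.2f})")
```

The sweep shows threshold behaviour: small shifts in the reproductive number or the pre-symptomatic fraction can move the effective reproductive number across 1, swinging the attack rate from near zero to a large fraction of the population.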
Archive | 2000
James D. Smith; Sharon M. DeLand
The authors presented a high-level methodology for the design of unattended monitoring systems, focusing on a system to detect diversion of nuclear materials from a storage facility. The methodology is composed of seven interrelated analyses: Facility Analysis, Vulnerability Analysis, Threat Assessment, Scenario Assessment, Design Analysis, Conceptual Design, and Performance Assessment. The design of the monitoring system is iteratively improved until it meets a set of pre-established performance criteria. Because the methodology is based on other well-established system analysis methodologies, the authors believe it can be adapted to other verification or compliance applications. To make the approach more generic, however, more work is needed on techniques for establishing evaluation criteria and associated performance metrics. The authors found that defining general-purpose evaluation criteria for verifying compliance with international agreements was a significant undertaking in itself; they ultimately focused on diversion of nuclear material to simplify the problem enough to work out an overall approach for the design methodology. General guidelines for the development of evaluation criteria remain critical for a general-purpose methodology, since a poor choice of evaluation criteria could result in a monitoring system design that solves the wrong problem.
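The iterate-until-criteria structure of the methodology can be summarized in a short sketch. The performance criteria, metrics, and refinement rule below are placeholders invented for illustration; the report does not prescribe an implementation.

```python
# Sketch of the iterative design loop described above (criteria, metrics, and
# the refinement step are hypothetical placeholders).
from dataclasses import dataclass

@dataclass
class Design:
    n_sensors: int
    coverage: float          # fraction of diversion scenarios detected
    false_alarm_rate: float  # alarms per month

CRITERIA = {"coverage": 0.95, "false_alarm_rate": 1.0}

def performance_assessment(d: Design) -> bool:
    """Check the design against the pre-established performance criteria."""
    return (d.coverage >= CRITERIA["coverage"]
            and d.false_alarm_rate <= CRITERIA["false_alarm_rate"])

def refine(d: Design) -> Design:
    """Placeholder refinement: adding sensors is assumed to improve coverage
    while slightly worsening the false-alarm rate."""
    return Design(n_sensors=d.n_sensors + 1,
                  coverage=min(1.0, d.coverage + 0.03),
                  false_alarm_rate=d.false_alarm_rate * 1.02)

design = Design(n_sensors=10, coverage=0.80, false_alarm_rate=0.7)
while not performance_assessment(design):
    design = refine(design)  # conceptual design revised and re-assessed
print(design)
```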
Archive | 2000
William L. Oberkampf; Sharon M. DeLand; Brian Milne Rutherford; Kathleen V. Diegert; Kenneth F. Alvin
Archive | 2001
William R. Cook; John M. Brabson; Sharon M. DeLand
Archive | 2005
Michael E. Samsa; Rashad Raynor; Sharon M. DeLand; Hyeung-Sik Jason Min; Dennis R. Powell; Walter E. Beyeler; Gary B. Hirsch; R.G. Whitfield; Jeanne M. Fair; Lori R. Dauelsberg; Brian Bush; Rene J. LeClaire
Archive | 2008
Dennis R. Powell; Sharon M. DeLand; Michael E. Samsa
Archive | 2018
Elizabeth James Kistin Keller; Elizabeth Roll; Munaf Syed Aamir; Diana L Bull; Sharon M. DeLand; Chad Haddal; Howard David Passell; John T. Foley; Amber Suzanne Harwell; Monique Otis; George A. Backus; Wendell B. Jones; Michael Greet Shander Bawden; Richard L. Craft; David Kistin; Jeffrey B. Martin; Bradley Robert McNicol; Michael Geoffrey Vannoni; Lawrence C. Trost; Jeffrey Y. Tsao; Karla Weaver