Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Laura Painton Swiler is active.

Publications


Featured research published by Laura Painton Swiler.


New Security Paradigms Workshop | 1998

A graph-based system for network-vulnerability analysis

Cynthia A. Phillips; Laura Painton Swiler

This paper presents a graph-based approach to network vulnerability analysis. The method is flexible, allowing analysis of attacks from both outside and inside the network. It can analyze risks to a specific network asset, or examine the universe of possible consequences following a successful attack. The graph-based tool can identify the set of attack paths that have a high probability of success (or a low effort cost) for the attacker. The system could be used to test the effectiveness of making configuration changes, implementing an intrusion detection system, etc. The analysis system requires as input a database of common attacks, broken into atomic steps, specific network configuration and topology information, and an attacker profile. The attack information is matched with the network configuration information and an attacker profile to create a superset attack graph. Nodes identify a stage of attack, for example the class of machines the attacker has accessed and the user privilege level he or she has compromised. The arcs in the attack graph represent attacks or stages of attacks. By assigning probabilities of success on the arcs or costs representing level-of-effort for the attacker, various graph algorithms such as shortest-path algorithms can identify the attack paths with the highest probability of success.
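
A minimal sketch of the shortest-path idea described above, assuming networkx is available; the machine classes, attack steps, and success probabilities are hypothetical and not drawn from the paper:

```python
# Minimal sketch (not the authors' tool): encode attack stages as graph nodes,
# single attacker actions as weighted edges, and recover the highest-probability
# attack path with an ordinary shortest-path search.
import math
import networkx as nx

G = nx.DiGraph()

# Hypothetical atomic attack steps with assumed success probabilities.
steps = [
    ("outside", "dmz_user", 0.8),       # e.g. exploit a public web service
    ("dmz_user", "dmz_root", 0.5),      # local privilege escalation
    ("dmz_root", "internal_user", 0.6),
    ("outside", "internal_user", 0.1),  # direct but unlikely attack
    ("internal_user", "db_admin", 0.4),
]

# Maximizing the product of edge probabilities equals minimizing the sum of
# -log(p), so Dijkstra finds the most likely attack path.
for src, dst, p in steps:
    G.add_edge(src, dst, weight=-math.log(p), prob=p)

path = nx.dijkstra_path(G, "outside", "db_admin", weight="weight")
cost = nx.dijkstra_path_length(G, "outside", "db_admin", weight="weight")
print(path, f"success probability ~ {math.exp(-cost):.3f}")
```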


Archive | 2011

DAKOTA: a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis

Michael S. Eldred; Dena M. Vigil; Keith R. Dalbey; William J. Bohnhoff; Brian M. Adams; Laura Painton Swiler; Sophia Lefantzi; Patricia Diane Hough; John P. Eddy

The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis.
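
A minimal sketch of the black-box pattern the abstract describes, where an iterative method sees the simulation only through a parameters-in, responses-out interface; this is illustrative Python, not DAKOTA's actual interface or input syntax:

```python
# Illustrative only: a generic iterative method drives a "simulation" that it
# can only evaluate, mirroring the separation between analysis method and code.
import random

def simulation(x):
    # Stand-in for an expensive simulation code returning a scalar response.
    return (x[0] - 1.0) ** 2 + 2.0 * (x[1] + 0.5) ** 2

def random_search(objective, bounds, n_iter=200, seed=0):
    rng = random.Random(seed)
    best_x, best_f = None, float("inf")
    for _ in range(n_iter):
        x = [rng.uniform(lo, hi) for lo, hi in bounds]
        f = objective(x)            # one "function evaluation" of the code
        if f < best_f:
            best_x, best_f = x, f
    return best_x, best_f

print(random_search(simulation, [(-2, 2), (-2, 2)]))
```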


DARPA Information Survivability Conference and Exposition | 2001

Computer-attack graph generation tool

Laura Painton Swiler; Cynthia A. Phillips; David E. Ellis; Stefan Chakerian

This paper presents a tool for assessment of security attributes and vulnerabilities in computer networks. The tool generates attack graphs (Phillips and Swiler, 1998). Each node in the attack graph represents a possible attack state. Edges represent a change of state caused by a single action taken by the attacker or unwitting assistant, and are weighted by some metric (such as attacker effort or time to succeed). Generation of the attack graph requires algorithms that match information about attack requirements (specified in attack templates) to information about the network configuration and assumed attacker capabilities (attacker profile). The set of near-optimal shortest paths indicates the most exploitable components of the system configuration. This paper presents the status of the tool and discusses implementation issues, especially focusing on the data input needs and methods for eliminating redundant paths and nodes in the graph.
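
A minimal sketch of the graph-generation step, assuming a toy representation in which attack templates list preconditions and gained capabilities; the template names, network facts, and effort values are hypothetical:

```python
# Illustrative template matching: an edge is generated whenever a template's
# preconditions are satisfied by the current attack state plus network facts.
from dataclasses import dataclass

@dataclass(frozen=True)
class AttackTemplate:
    name: str
    preconditions: frozenset   # facts that must hold to apply the attack
    gained: frozenset          # facts added to the attacker's state
    effort: float              # edge weight, e.g. attacker effort

network_facts = frozenset({"web_server_reachable", "ssh_open_internal"})

templates = [
    AttackTemplate("exploit_web", frozenset({"web_server_reachable"}),
                   frozenset({"user_on_dmz"}), effort=2.0),
    AttackTemplate("pivot_ssh", frozenset({"user_on_dmz", "ssh_open_internal"}),
                   frozenset({"user_on_internal"}), effort=3.0),
]

def expand(state):
    """Yield (template, next_state) pairs reachable from an attack state."""
    known = state | network_facts
    for t in templates:
        if t.preconditions <= known and not (t.gained <= state):
            yield t, state | t.gained

# Work-list expansion from the attacker's initial (empty) state.
frontier, edges, seen = [frozenset()], [], {frozenset()}
while frontier:
    state = frontier.pop()
    for t, nxt in expand(state):
        edges.append((state, nxt, t.name, t.effort))
        if nxt not in seen:
            seen.add(nxt)
            frontier.append(nxt)

for e in edges:
    print(e)
```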


AIAA Journal | 2008

Efficient Global Reliability Analysis for Nonlinear Implicit Performance Functions

Barron J. Bichon; Michael S. Eldred; Laura Painton Swiler; Sankaran Mahadevan; John McFarland

Many engineering applications are characterized by implicit response functions that are expensive to evaluate and sometimes nonlinear in their behavior, making reliability analysis difficult. This paper develops an efficient reliability analysis method that accurately characterizes the limit state throughout the random variable space. The method begins with a Gaussian process model built from a very small number of samples, and then adaptively chooses where to generate subsequent samples to ensure that the model is accurate in the vicinity of the limit state. The resulting Gaussian process model is then sampled using multimodal adaptive importance sampling to calculate the probability of exceeding (or failing to exceed) the response level of interest. By locating multiple points on or near the limit state, more complex and nonlinear limit states can be modeled, leading to more accurate probability integration. By concentrating the samples in the area where accuracy is important (i.e., in the vicinity of the limit state), only a small number of true function evaluations are required to build a quality surrogate model. The resulting method is both accurate for any arbitrarily shaped limit state and computationally efficient even for expensive response functions. This new method is applied to a collection of example problems, including one that analyzes the reliability of a microelectromechanical system device that currently available methods have difficulty solving either accurately or efficiently.
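
A minimal sketch of the adaptive-surrogate loop, with two simplifications called out in the comments: a generic closeness/uncertainty criterion stands in for the paper's expected feasibility function, and plain Monte Carlo on the surrogate replaces multimodal adaptive importance sampling. The limit-state function is hypothetical; scikit-learn supplies the Gaussian process:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

def g(x):                        # hypothetical limit state; failure when g < 0
    return x[:, 0] ** 3 + x[:, 1] + 3.0

X = rng.normal(size=(6, 2))      # very small initial design
y = g(X)

candidates = rng.normal(size=(5000, 2))    # standard-normal random variables
for _ in range(20):                        # adaptive refinement loop
    gp = GaussianProcessRegressor(kernel=RBF(1.0), normalize_y=True).fit(X, y)
    mu, sigma = gp.predict(candidates, return_std=True)
    # Add the candidate whose sign is most uncertain (mean near g = 0 relative
    # to predictive std): a stand-in for the expected feasibility function.
    u = np.abs(mu) / np.maximum(sigma, 1e-12)
    best = candidates[np.argmin(u)]
    X = np.vstack([X, best])
    y = np.append(y, g(best[None, :]))

mu = gp.predict(candidates)
print("estimated P(failure) ~", np.mean(mu < 0.0))
```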


Reliability Engineering & System Safety | 2009

Implementation and evaluation of nonparametric regression procedures for sensitivity analysis of computationally demanding models

Curtis B. Storlie; Laura Painton Swiler; Jon C. Helton; Cédric J. Sallaberry

The analysis of many physical and engineering problems involves running complex computational models (simulation models, computer codes). With problems of this type, it is important to understand the relationships between the input variables (whose values are often imprecisely known) and the output. The goal of sensitivity analysis (SA) is to study this relationship and identify the most significant factors or variables affecting the results of the model. In this paper, an improvement on existing methods for SA of complex computer models is described for use when the model is too computationally expensive for a standard Monte Carlo analysis. In these situations, a meta-model or surrogate model can be used to estimate the necessary sensitivity index for each input. A sensitivity index is a measure of the variance in the response that is due to the uncertainty in an input. Most existing approaches to this problem either do not work well with a large number of input variables or ignore the error involved in estimating a sensitivity index. Here, a new approach to sensitivity index estimation using meta-models and bootstrap confidence intervals is described that provides solutions to these drawbacks. Further, an efficient yet effective approach to incorporate this methodology into an actual SA is presented. Several simulated and real examples illustrate the utility of this approach. This framework can be extended to uncertainty analysis as well.
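
A minimal sketch of the surrogate-plus-bootstrap idea, assuming a Gaussian process meta-model and a Saltelli-style pick-freeze estimator for first-order sensitivity indices; the test function, sample sizes, and bootstrap count are illustrative, not from the paper:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(1)
d = 3

def expensive_model(X):                        # stand-in for a costly simulation
    return np.sin(X[:, 0]) + 2.0 * X[:, 1] ** 2 + 0.1 * X[:, 2]

X_train = rng.uniform(-1, 1, size=(40, d))     # the only "real" model runs
y_train = expensive_model(X_train)

def first_order_indices(surrogate, n=4096):
    A = rng.uniform(-1, 1, size=(n, d))
    B = rng.uniform(-1, 1, size=(n, d))
    fA, fB = surrogate.predict(A), surrogate.predict(B)
    V = np.var(fA)
    S = np.empty(d)
    for i in range(d):                         # pick-freeze estimator for S_i
        ABi = A.copy()
        ABi[:, i] = B[:, i]
        S[i] = np.mean(fB * (surrogate.predict(ABi) - fA)) / V
    return S

estimates = []
for _ in range(30):                            # bootstrap over the training runs
    idx = rng.integers(0, len(X_train), len(X_train))
    gp = GaussianProcessRegressor(kernel=RBF(1.0), normalize_y=True, alpha=1e-6)
    gp.fit(X_train[idx], y_train[idx])
    estimates.append(first_order_indices(gp))

estimates = np.array(estimates)
for i in range(d):
    lo, hi = np.percentile(estimates[:, i], [2.5, 97.5])
    print(f"S_{i}: mean={estimates[:, i].mean():.2f}  95% CI=({lo:.2f}, {hi:.2f})")
```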


Reliability Engineering & System Safety | 2006

Calibration, validation, and sensitivity analysis: What's what

Timothy G. Trucano; Laura Painton Swiler; Takera Igusa; William L. Oberkampf; Martin Pilch

One very simple interpretation of calibration is to adjust a set of parameters associated with a computational science and engineering code so that the model agreement is maximized with respect to a set of experimental data. One very simple interpretation of validation is to quantify our belief in the predictive capability of a computational code through comparison with a set of experimental data. Uncertainty in both the data and the code is important and must be mathematically understood to correctly perform both calibration and validation. Sensitivity analysis, being an important methodology in uncertainty analysis, is thus important to both calibration and validation. In this paper, we intend to clarify the language just used and express some opinions on the associated issues. We will endeavor to identify some technical challenges that must be resolved for successful validation of a predictive modeling capability. One of these challenges is a formal description of a “model discrepancy” term. Another challenge revolves around the general adaptation of abstract learning theory as a formalism that potentially encompasses both calibration and validation in the face of model uncertainty.
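
One common way to make the "model discrepancy" term concrete, offered here as an illustration in the spirit of the Kennedy and O'Hagan formulation rather than as the authors' own proposal, is to write the observed data as the calibrated simulator plus a discrepancy function plus observation noise:

```latex
% observed data = simulator at calibrated parameters + model discrepancy + noise
y^{\mathrm{obs}}(x) \;=\; \eta\left(x, \theta^{*}\right) \;+\; \delta(x) \;+\; \varepsilon,
\qquad \varepsilon \sim \mathcal{N}\left(0, \sigma^{2}\right)
```

Here \eta is the simulation code run at calibrated parameters \theta^{*}, and \delta(x) is the discrepancy function that validation must characterize alongside the observation noise \varepsilon.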


Reliability Engineering & System Safety | 2011

Mixed Aleatory-Epistemic Uncertainty Quantification with Stochastic Expansions and Optimization-Based Interval Estimation

Michael S. Eldred; Laura Painton Swiler; Gary Tang

Uncertainty quantification (UQ) is the process of determining the effect of input uncertainties on response metrics of interest. These input uncertainties may be characterized as either aleatory uncertainties, which are irreducible variabilities inherent in nature, or epistemic uncertainties, which are reducible uncertainties resulting from a lack of knowledge. When both aleatory and epistemic uncertainties are mixed, it is desirable to maintain a segregation between aleatory and epistemic sources such that it is easy to separate and identify their contributions to the total uncertainty. Current production analyses for mixed UQ employ nested sampling, where each sample taken from epistemic distributions at the outer loop results in an inner loop sampling over the aleatory probability distributions. This paper demonstrates new algorithmic capabilities for mixed UQ in which the analysis procedures are more closely tailored to the requirements of aleatory and epistemic propagation. Through the combination of stochastic expansions for computing statistics and interval optimization for computing bounds, interval-valued probability, second-order probability, and Dempster–Shafer evidence theory approaches to mixed UQ are shown to be more accurate and efficient than previously achievable.
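
A minimal sketch of the nested second-order-probability structure the abstract uses as its baseline: epistemic parameters sampled in an outer loop, aleatory variables in an inner loop. The paper's actual contribution, stochastic expansions plus interval optimization, is not reproduced here, and the response function and interval bounds are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(2)

def response(x, theta):
    # Hypothetical response: x is aleatory, theta is an epistemic parameter.
    return theta * x ** 2

threshold = 2.0
exceedance_probs = []

for _ in range(50):                        # outer loop: epistemic interval [0.5, 1.5]
    theta = rng.uniform(0.5, 1.5)
    x = rng.normal(0.0, 1.0, size=2000)    # inner loop: aleatory distribution
    exceedance_probs.append(np.mean(response(x, theta) > threshold))

# Epistemic uncertainty shows up as an interval on the aleatory probability.
print("P(response > 2.0) lies roughly in",
      (min(exceedance_probs), max(exceedance_probs)))
```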


AIAA Journal | 2008

Calibration and Uncertainty Analysis for Computer Simulations with Multivariate Output

John McFarland; Sankaran Mahadevan; Vicente J. Romero; Laura Painton Swiler

Model calibration analysis is concerned with the estimation of unobservable modeling parameters using observations of system response. When the model being calibrated is an expensive computer simulation, special techniques such as surrogate modeling and Bayesian inference are often fruitful. In this paper, we show how the flexibility of the Bayesian calibration approach can be exploited to account for a wide variety of uncertainty sources in the calibration process. We propose a straightforward approach for simultaneously handling Gaussian and non-Gaussian errors, as well as a framework for studying the effects of prescribed uncertainty distributions for model inputs that are not treated as calibration parameters. Further, we discuss how Gaussian process surrogate models can be used effectively when simulator response may be a function of time and/or space (multivariate output). The proposed methods are illustrated through the calibration of a simulation of thermally decomposing foam.
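
A minimal sketch of the surrogate-plus-Bayes workflow, reduced to one scalar calibration parameter and Gaussian errors: a Gaussian process emulates a hypothetical simulator and a random-walk Metropolis sampler explores the posterior. The simulator, prior bounds, and noise level are assumptions for illustration, not the paper's foam application:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(3)

def simulator(theta):                  # stand-in for an expensive simulation
    return np.sin(3.0 * theta) + theta

# Build the surrogate from a handful of simulator runs.
theta_train = np.linspace(0.0, 2.0, 12)[:, None]
gp = GaussianProcessRegressor(kernel=RBF(0.5), normalize_y=True)
gp.fit(theta_train, simulator(theta_train.ravel()))

# Synthetic observations of the true system at theta = 1.2 with noise.
sigma = 0.05
y_obs = simulator(1.2) + rng.normal(0.0, sigma, size=5)

def log_post(theta):
    if not 0.0 <= theta <= 2.0:        # uniform prior on [0, 2]
        return -np.inf
    pred = gp.predict(np.array([[theta]]))[0]
    return -0.5 * np.sum((y_obs - pred) ** 2) / sigma ** 2

# Random-walk Metropolis on the surrogate-based posterior.
theta, samples = 1.0, []
for _ in range(5000):
    prop = theta + rng.normal(0.0, 0.05)
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop
    samples.append(theta)

print("posterior mean of theta ~", np.mean(samples[1000:]))
```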


Journal of Computational Physics | 2015

Spectral neighbor analysis method for automated generation of quantum-accurate interatomic potentials

Aidan P. Thompson; Laura Painton Swiler; Christian Robert Trott; Stephen M. Foiles; Garritt J. Tucker

We present a new interatomic potential for solids and liquids called Spectral Neighbor Analysis Potential (SNAP). The SNAP potential has a very general form and uses machine-learning techniques to reproduce the energies, forces, and stress tensors of a large set of small configurations of atoms, which are obtained using high-accuracy quantum electronic structure (QM) calculations. The local environment of each atom is characterized by a set of bispectrum components of the local neighbor density projected onto a basis of hyperspherical harmonics in four dimensions. The bispectrum components are the same bond-orientational order parameters employed by the GAP potential [1]. The SNAP potential, unlike GAP, assumes a linear relationship between atom energy and bispectrum components. The linear SNAP coefficients are determined using weighted least-squares linear regression against the full QM training set. This allows the SNAP potential to be fit in a robust, automated manner to large QM data sets using many bispectrum components. The calculation of the bispectrum components and the SNAP potential are implemented in the LAMMPS parallel molecular dynamics code. We demonstrate that a previously unnoticed symmetry property can be exploited to reduce the computational cost of the force calculations by more than one order of magnitude. We present results for a SNAP potential for tantalum, showing that it accurately reproduces a range of commonly calculated properties of both the crystalline solid and the liquid phases. In addition, unlike simpler existing potentials, SNAP correctly predicts the energy barrier for screw dislocation migration in BCC tantalum.
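
A minimal sketch of the fitting step only: configuration energies assumed linear in per-atom descriptors, with coefficients recovered by weighted least squares. Real SNAP descriptors are bispectrum components computed inside LAMMPS; random numbers stand in for them here:

```python
import numpy as np

rng = np.random.default_rng(4)

n_configs, n_desc = 200, 10
# Descriptor totals per configuration (sum over atoms of bispectrum components).
B = rng.normal(size=(n_configs, n_desc))
true_coeffs = rng.normal(size=n_desc)
energies = B @ true_coeffs + rng.normal(scale=0.01, size=n_configs)  # "QM" data

# Weighted least squares: weight each configuration by an assumed importance.
w = rng.uniform(0.5, 2.0, size=n_configs)
sqrt_w = np.sqrt(w)[:, None]
coeffs, *_ = np.linalg.lstsq(sqrt_w * B, np.sqrt(w) * energies, rcond=None)

print("max coefficient error:", np.max(np.abs(coeffs - true_coeffs)))
```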


Structure and Infrastructure Engineering | 2006

The promise and peril of uncertainty quantification using response surface approximations

Anthony A. Giunta; John McFarland; Laura Painton Swiler; Michael S. Eldred

Conventional sampling-based uncertainty quantification (UQ) methods involve generating large numbers of random samples on input variables and calculating output statistics by evaluating the computational model for each set of samples. For real world applications, this method can be computationally prohibitive due to the cost of the model and the time required for each simulation run. Using response surface approximations may allow for the output statistics to be estimated more accurately when only a limited number of simulation runs are available. This paper describes an initial investigation into response surface based UQ using both kriging and multivariate adaptive regression spline surface approximation methods. In addition, the impact of two different data sampling methods, Latin hypercube sampling and orthogonal array sampling, is also examined. The data obtained from this study indicate that caution should be exercised when implementing response surface based methods for UQ using very low sample sizes. However, this study also shows that there are clear cases where response surface based UQ provides a gain in accuracy versus conventional sampling-based UQ methods.
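
A minimal sketch of response-surface-based UQ as described above: a kriging (Gaussian process) surrogate trained on a small Latin hypercube design and then sampled cheaply to estimate output statistics. The test function and sample sizes are illustrative; scipy and scikit-learn supply the LHS design and the surrogate:

```python
import numpy as np
from scipy.stats import qmc
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def model(X):                                    # stand-in for the expensive code
    return np.exp(-X[:, 0]) * np.sin(4.0 * X[:, 1])

# Small Latin hypercube design on [0, 1]^2: the only "expensive" runs.
X_train = qmc.LatinHypercube(d=2, seed=5).random(n=20)
y_train = model(X_train)

gp = GaussianProcessRegressor(kernel=RBF(0.3), normalize_y=True).fit(X_train, y_train)

# Cheap Monte Carlo on the surrogate to estimate output statistics.
rng = np.random.default_rng(5)
X_mc = rng.uniform(size=(100_000, 2))
y_surrogate = gp.predict(X_mc)

print("surrogate mean/std:", y_surrogate.mean(), y_surrogate.std())
print("reference mean/std:", model(X_mc).mean(), model(X_mc).std())
```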

Collaboration


Laura Painton Swiler's closest collaborators and their affiliations.

Top Co-Authors

Michael S. Eldred, Sandia National Laboratories
Brian M. Adams, Sandia National Laboratories
Vicente J. Romero, Sandia National Laboratories
Anthony A. Giunta, Sandia National Laboratories
Maoyi Huang, Pacific Northwest National Laboratory
Jaideep Ray, Sandia National Laboratories
Angel Urbina, Sandia National Laboratories
Zhangshuan Hou, Pacific Northwest National Laboratory
John McFarland, Southwest Research Institute
Cynthia A. Phillips, Sandia National Laboratories