

Publications


Featured research published by Richard V. Field.


Journal of Vibration and Acoustics | 2000

Estimating the probability distribution of von Mises stress for structures undergoing random excitation

Daniel J. Segalman; Garth M. Reese; Richard V. Field; Clay Fulcher

The von Mises stress is often used as the metric for evaluating design margins, particularly for structures made of ductile materials. For deterministic loads, both static and dynamic, the calculation of von Mises stress is straightforward, as is the resulting calculation of reliability. For loads modeled as random processes, the task is different; the response to such loads is itself a random process and its properties must be determined in terms of those of both the loads and the system. This has been done in the past by Monte Carlo sampling of numerical realizations that reproduce the second order statistics of the problem. Here, the authors present a method that provides analytic expressions for the probability distributions of von Mises stress which can be evaluated efficiently and with good precision numerically. Further, this new approach has the important advantage of providing the asymptotic properties of the probability distribution.
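As context for the Monte Carlo baseline the abstract mentions, here is a minimal sampling sketch for the plane-stress case; the covariance values are purely illustrative and not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Plane-stress components modeled as zero-mean jointly Gaussian variables
# with an illustrative (hypothetical) covariance matrix.
cov = np.array([[1.0, 0.3, 0.0],
                [0.3, 1.0, 0.0],
                [0.0, 0.0, 0.5]])
sx, sy, txy = rng.multivariate_normal(np.zeros(3), cov, size=100_000).T

# von Mises stress for plane stress
sv = np.sqrt(sx**2 - sx * sy + sy**2 + 3 * txy**2)

# Empirical 99th percentile, the kind of tail quantity a design margin needs
p99 = np.quantile(sv, 0.99)
```

The paper's contribution is precisely that such tail quantities can instead be obtained from analytic expressions, avoiding the sampling cost and noise of this brute-force estimate.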


Journal of Vibration and Control | 1996

Methods to Compute Probabilistic Measures of Robustness for Structural Systems

Richard V. Field; Petros G. Voulgaris; Lawrence A. Bergman

Model uncertainty, if ignored, can seriously degrade the performance of an otherwise well-designed control system. If the level of this uncertainty is extreme, the system may even be driven to instability. In the context of structural control, performance degradation and instability imply excessive vibration or even structural failure. Robust control has typically been applied to the issue of model uncertainty through worst-case analyses. These traditional methods include the use of the structured singular value (μ-analysis), as applied to the small gain condition, to provide estimates of controller robustness. However, this emphasis on the worst-case scenario has not allowed a probabilistic understanding of robust control. For this reason, a treatment of controller robustness as a probability measure is presented, which yields much more intuitive insight into controller robustness. In this context, the joint probability distribution is of dimension equal to the number of uncertain parameters, and the failure hypersurface is defined by the onset of instability of the closed-loop system in the eigenspace. A first-order reliability measure (FORM) of the system is computed and used to estimate controller robustness. It is demonstrated via an example that this FORM method can provide accurate results on the probability of failure despite the potential complexity of the closed-loop system. In addition to the FORM method, a probabilistic measure of robustness is developed based on the fundamentals of μ-analysis. It is shown that the μ-analysis-based method is inferior to the FORM method and has only qualitative value when assessing control system robustness in a probabilistic framework.
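A toy illustration of the FORM idea, for the special case of a linear limit state in standard normal space, where the reliability index is exact; the coefficients are hypothetical and unrelated to the structural example in the paper:

```python
import math
import numpy as np

# Failure when g(u) <= 0 with linear limit state g(u) = b - a . u,
# u a standard normal vector (hypothetical coefficients).
a = np.array([1.0, 2.0, 2.0])
b = 6.0

# FORM: for a linear g the reliability index beta is the distance from
# the origin to the failure hyperplane, and P_f = Phi(-beta) exactly.
beta = b / np.linalg.norm(a)                      # = 2.0 here
pf_form = 0.5 * math.erfc(beta / math.sqrt(2.0))  # Phi(-beta)

# Crude Monte Carlo check of the same probability
rng = np.random.default_rng(1)
u = rng.standard_normal((200_000, 3))
pf_mc = np.mean(u @ a >= b)
```

For the nonlinear stability boundaries treated in the paper, FORM instead linearizes the failure hypersurface at the most probable failure point, so the result is an approximation rather than exact as above.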


The Shock and Vibration Digest | 2000

A tutorial on design analysis for random vibration

Garth M. Reese; Richard V. Field; Daniel J. Segalman

The von Mises stress is often used as the metric for evaluating design margins, particularly for structures made of ductile materials. While computing the von Mises stress distribution in a structural system due to a deterministic load condition may be straightforward, difficulties arise when considering random vibration environments. As a result, alternate methods are used in practice. One such method involves resolving the random vibration environment to an equivalent static load. This technique, however, is only appropriate for a very small class of problems and can easily be used incorrectly. Monte Carlo sampling of numerical realizations that reproduce the second order statistics of the input is another method used to address this problem. This technique proves computationally inefficient and provides no insight as to the character of the distribution of von Mises stress. This tutorial describes a new methodology to investigate the design reliability of structural systems in a random vibration environment. The method provides analytic expressions for root mean square (RMS) von Mises stress and for the probability distributions of von Mises stress which can be evaluated efficiently and with good numerical precision. Further, this new approach has the important advantage of providing the asymptotic properties of the probability distribution. A brief overview of the theoretical development of the methodology is presented, followed by detailed instructions on how to implement the technique on engineering applications. As an example, the method is applied to a complex finite element model of a Global Positioning Satellite (GPS) system. This tutorial presents an efficient and accurate methodology for correctly applying the von Mises stress criterion to complex computational models. The von Mises criterion is the traditional method for determination of structural reliability issues in industry.


American Control Conference | 1997

Reliability-based covariance control design

Richard V. Field; Lawrence A. Bergman

An extension to classical covariance control methods, introduced by Skelton et al. (1985, 1989, 1995), is proposed specifically for application to the control of civil engineering structures subjected to random dynamic excitations. The covariance structure of the system is developed directly from specification of its reliability via the assumption of independent (Poisson) outcrossings of its stationary response process from a polyhedral safe region. This leads to a set of state covariance controllers, each of which guarantees that the closed-loop system will possess the specified level of reliability. An example civil engineering structure is considered.
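The Poisson outcrossing assumption can be illustrated with Rice's mean upcrossing rate for a stationary Gaussian process; this is the standard textbook relation, not code from the paper, and all parameter values are hypothetical:

```python
import math

def poisson_reliability(sigma_x, sigma_xdot, b, T):
    """Survival probability over [0, T] for a zero-mean stationary Gaussian
    process with response std sigma_x and velocity std sigma_xdot, under the
    assumption that upcrossings of level b arrive as a Poisson process with
    Rice's mean upcrossing rate."""
    nu = (sigma_xdot / (2.0 * math.pi * sigma_x)) * math.exp(-b**2 / (2.0 * sigma_x**2))
    return math.exp(-nu * T)

# Illustrative numbers: unit-variance response, barrier at 3 sigma
R = poisson_reliability(sigma_x=1.0, sigma_xdot=2.0 * math.pi, b=3.0, T=100.0)
```

The design procedure in the paper runs this logic in reverse: a target reliability is specified first, and the admissible response covariance (hence the controller) is derived from it.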


Archive | 2011

Computational thermal, chemical, fluid, and solid mechanics for geosystems management.

Scott M Davison; Nicholas Alger; Daniel Zack Turner; Samuel R. Subia; Brian Carnes; Mario J. Martinez; Patrick K. Notz; Katherine A. Klise; Charles Michael Stone; Richard V. Field; Pania Newell; Carlos F. Jove-Colon; John R. Red-Horse; Joseph E. Bishop; Thomas A. Dewers; Polly L. Hopkins; Mikhail Mesh; James E. Bean; Harry K. Moffat; Hongkyu Yoon

This document summarizes research performed under the SNL LDRD entitled "Computational Mechanics for Geosystems Management to Support the Energy and Natural Resources Mission." The main accomplishment was development of a foundational SNL capability for computational thermal, chemical, fluid, and solid mechanics analysis of geosystems. The code was developed within the SNL Sierra software system. This report summarizes the capabilities of the simulation code and the supporting research and development conducted under this LDRD. The main goal of this project was the development of a foundational capability for coupled thermal, hydrological, mechanical, chemical (THMC) simulation of heterogeneous geosystems utilizing massively parallel processing. To solve these complex issues, this project integrated research in numerical mathematics and algorithms for chemically reactive multiphase systems with computer science research in adaptive coupled solution control and framework architecture. This report summarizes and demonstrates the capabilities that were developed together with the supporting research underlying the models. Key accomplishments are: (1) General capability for modeling nonisothermal, multiphase, multicomponent flow in heterogeneous porous geologic materials; (2) General capability to model multiphase reactive transport of species in heterogeneous porous media; (3) Constitutive models for describing real, general geomaterials under multiphase conditions utilizing laboratory data; (4) General capability to couple nonisothermal reactive flow with geomechanics (THMC); (5) Phase behavior thermodynamics for the CO2-H2O-NaCl system, with a general implementation that enables modeling of other fluid mixtures and adaptive look-up tables that extend this thermodynamic capability to other simulators; (6) Capability for statistical modeling of heterogeneity in geologic materials; and (7) A simulator that utilizes unstructured grids on parallel processing computers.


Advances in Social Networks Analysis and Mining | 2016

On data collection, graph construction, and sampling in Twitter

Jeremy D. Wendt; Randy Wells; Richard V. Field; Sucheta Soundarajan

We present a detailed study on data collection, graph construction, and sampling in Twitter. We observe that sampling on semantic graphs (i.e., graphs with multiple edge types) presents fundamentally distinct challenges from sampling on traditional graphs. The purpose of our work is to present new challenges and initial solutions for sampling semantic graphs. Novel elements of our work include the following: (1) We provide a thorough discussion of problems encountered with naïve breadth-first search on semantic graphs. We argue that common sampling methods such as breadth-first search face specific challenges on semantic graphs that are not encountered on graphs with homogeneous edge types. (2) We present two competing methods for creating semantic graphs from collected data, corresponding to the interactions between sampling of different edge types. (3) We discuss new metrics specific to graphs with multiple edge types, and discuss the effect of the sampling method on these metrics. (4) We discuss issues and potential solutions pertaining to sampling semantic graphs.
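A minimal sketch of why edge types matter when breadth-first sampling a semantic graph; the toy graph and edge-type names below are invented for illustration:

```python
from collections import deque

# Toy semantic graph: adjacency lists whose edges carry a type label.
graph = {
    "u1": [("u2", "mention"), ("t1", "hashtag")],
    "u2": [("u1", "mention"), ("u3", "retweet")],
    "u3": [("t1", "hashtag")],
    "t1": [],
}

def bfs_sample(graph, seed, allowed_types, budget):
    """Breadth-first sample that follows only edges of the allowed types.
    On a semantic graph, the choice of allowed_types changes what is
    reachable, which is one way edge-type interactions bias the sample."""
    seen, order = {seed}, [seed]
    q = deque([seed])
    while q and len(order) < budget:
        node = q.popleft()
        for nbr, etype in graph.get(node, []):
            if len(order) >= budget:
                break
            if etype in allowed_types and nbr not in seen:
                seen.add(nbr)
                order.append(nbr)
                q.append(nbr)
    return order

mentions_only = bfs_sample(graph, "u1", {"mention"}, budget=10)
all_types = bfs_sample(graph, "u1", {"mention", "retweet", "hashtag"}, budget=10)
```

With only "mention" edges allowed the sample never leaves the mention subgraph, while allowing all three types reaches every node; the paper studies exactly this kind of sensitivity at Twitter scale.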


International Journal for Uncertainty Quantification | 2013

Statistical surrogate models for prediction of high-consequence climate change

Paul G. Constantine; Richard V. Field; Mark B. Boslough

In safety engineering, performance metrics are defined using probabilistic risk assessments focused on the low-probability, high-consequence tail of the distribution of possible events, as opposed to best estimates based on central tendencies. We frame the climate change problem and its associated risks in a similar manner. To properly explore the tails of the distribution requires extensive sampling, which is not possible with existing coupled atmospheric models due to the high computational cost of each simulation. We therefore propose the use of specialized statistical surrogate models (SSMs) for the purpose of exploring the probability law of various climate variables of interest. An SSM differs from a deterministic surrogate model in that it represents each climate variable of interest as a space/time random field. The SSM can be calibrated to available spatial and temporal data from existing climate databases, e.g., the Program for Climate Model Diagnosis and Intercomparison (PCMDI), or to a collection of outputs from a General Circulation Model (GCM), e.g., the Community Earth System Model (CESM) and its predecessors. Because of its reduced size and complexity, the realization of a large number of independent model outputs from an SSM becomes computationally straightforward, so that quantifying the risk associated with low-probability, high-consequence climate events becomes feasible. A Bayesian framework is developed to provide quantitative measures of confidence, via Bayesian credible intervals, in the use of the proposed approach to assess these risks.
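A highly simplified sketch of the SSM idea: once a climate variable is represented as a Gaussian random field with a fitted covariance, independent realizations are cheap and tail probabilities of a field functional can be estimated directly. The squared-exponential covariance and all hyperparameters here are illustrative, not the paper's calibrated model:

```python
import numpy as np

# Grid of times (or spatial points) at which the surrogate field lives.
t = np.linspace(0.0, 1.0, 50)

# Squared-exponential covariance with illustrative hyperparameters.
ell, sig = 0.2, 1.0
K = sig**2 * np.exp(-0.5 * (t[:, None] - t[None, :])**2 / ell**2)
L = np.linalg.cholesky(K + 1e-8 * np.eye(len(t)))  # jitter for stability

# Tens of thousands of realizations are trivial for the surrogate, which
# is exactly what tail (low-probability, high-consequence) estimation needs.
rng = np.random.default_rng(2)
samples = L @ rng.standard_normal((len(t), 50_000))
p_exceed = np.mean(samples.max(axis=0) > 2.5)  # P(peak exceeds 2.5 sigma)
```

Each column of `samples` plays the role of one "model run"; producing the same number of GCM runs would be computationally infeasible, which is the motivation for the surrogate.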


Monte Carlo Methods and Applications | 2013

An algorithm for on-the-fly generation of samples of non-stationary Gaussian processes based on a sampling theorem

Richard V. Field; Mircea Grigoriu; Clark R. Dohrmann

A Monte Carlo algorithm is developed for generating samples of real-valued non-stationary Gaussian processes. The method is based on a generalized version of Shannon's sampling theorem for bandlimited deterministic signals, as well as an efficient algorithm for generating conditional Gaussian variables. One feature of the method that is attractive for engineering applications involving stochastic loads is the ability of the algorithm to be implemented “on-the-fly,” meaning that, given the value of the sample of the process at the current time step, it provides the value for the sample of the process at the next time step. Theoretical arguments are supported by numerical examples demonstrating the implementation, efficiency, and accuracy of the proposed Monte Carlo simulation algorithm.
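The paper's algorithm is considerably more general, but the "on-the-fly" conditional-Gaussian idea can be sketched with the simplest case where one-step conditioning is exact: a zero-mean Ornstein-Uhlenbeck (exponential-covariance stationary Gaussian) process, used here purely as a stand-in:

```python
import numpy as np

def ou_path(theta, sigma, dt, n_steps, rng):
    """Sample an Ornstein-Uhlenbeck path one step at a time: the conditional
    law of X(t + dt) given X(t) is Gaussian, so each new value requires only
    the previous one, i.e., the sample is generated on-the-fly."""
    rho = np.exp(-theta * dt)   # lag-dt correlation
    var = sigma**2              # stationary variance
    x = np.empty(n_steps)
    x[0] = rng.normal(0.0, np.sqrt(var))
    for k in range(1, n_steps):
        mean = rho * x[k - 1]              # conditional mean
        cvar = var * (1.0 - rho**2)        # conditional variance
        x[k] = rng.normal(mean, np.sqrt(cvar))
    return x

rng = np.random.default_rng(3)
path = ou_path(theta=1.0, sigma=1.0, dt=0.01, n_steps=20_000, rng=rng)
```

For non-stationary processes with general covariance, the conditional mean and variance depend on more than the previous value; handling that efficiently via the sampling theorem is the paper's contribution.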


Advances in Social Networks Analysis and Mining | 2017

Temporal Anomaly Detection in Social Media

Jacek Skryzalin; Richard V. Field; Andrew N Fisher; Travis L. Bauer

In this work, we approach topic tracking and meme trending in social media with a temporal focus; rather than analyzing topics, we aim to identify time periods whose content differs significantly from normal. We detail two approaches. The first is an information-theoretic analysis of the distributions of terms emitted during each time period. In the second, we cluster the documents from each time period and analyze the tightness of each clustering. We also discuss a method of combining the scores created by each technique, and we provide ample empirical analysis of our methodology on various Twitter datasets.
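A minimal sketch of the first, information-theoretic approach: score each time period by the Jensen-Shannon divergence between its term distribution and the overall distribution, and flag the period with the largest score. The toy "tweets" are invented, and the paper's actual scoring may differ in detail:

```python
import math
from collections import Counter

def js_divergence(p, q):
    """Jensen-Shannon divergence (base 2) between two term distributions
    given as {term: probability} dicts; bounded in [0, 1]."""
    terms = set(p) | set(q)
    m = {t: 0.5 * (p.get(t, 0.0) + q.get(t, 0.0)) for t in terms}
    def kl(a, b):
        return sum(a[t] * math.log2(a[t] / b[t])
                   for t in terms if a.get(t, 0.0) > 0)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def term_dist(tokens):
    counts = Counter(tokens)
    n = sum(counts.values())
    return {t: k / n for t, k in counts.items()}

# Hypothetical time periods of tweets; the third period is off topic.
periods = [
    "game tonight team win game".split(),
    "team game score win play".split(),
    "breaking storm warning storm flood".split(),
]
overall = term_dist([tok for p in periods for tok in p])
scores = [js_divergence(term_dist(p), overall) for p in periods]
anomalous = max(range(len(periods)), key=lambda i: scores[i])
```

The period whose vocabulary departs most from the corpus-wide distribution receives the highest score, which is the sense in which its content "differs significantly from normal."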


Archive | 2014

Parameterized Reduced Order Models Constructed Using Hyper Dual Numbers

Matthew Robert Brake; Jeffrey A. Fike; Sean D. Topping; Ryan Schultz; Richard V. Field; Nolan McPeek-Bechtold; Remi Philippe Michel Dingreville

In order to assess the predicted performance of a manufactured system, analysts must typically consider random variations (both geometric and material) in the development of a finite element model, instead of a single deterministic model of an idealized geometry. The incorporation of random variations, however, could potentially require the development of thousands of nearly identical solid geometries that must be meshed and separately analyzed, which would require an impractical number of man-hours to complete. This paper proposes a new approach to uncertainty quantification by developing parameterized reduced order models. These parameterizations are based upon Taylor series expansions of the system’s matrices about the ideal geometry, and a component mode synthesis representation for each linear substructure is used to form an efficient basis with which to study the system. The numerical derivatives required for the Taylor series expansions are obtained efficiently using hyper dual numbers, which enable the derivatives to be calculated precisely to within machine precision. The theory is applied to a stepped beam system in order to demonstrate proof of concept. The accuracy and efficiency of the method, as well as the level at which the parameterization is introduced, are discussed. Hyper dual numbers can be used to construct parameterized models both efficiently and accurately and constitute an appropriate methodology to account for perturbations in a structural system.
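The hyper dual arithmetic that delivers machine-precision derivatives can be sketched in a few lines; this is a generic textbook implementation, not the authors' code:

```python
class HyperDual:
    """Minimal hyper dual number a + b*e1 + c*e2 + d*e1*e2 with
    e1**2 = e2**2 = 0 but e1*e2 != 0.  Evaluating f(x + e1 + e2)
    yields f(x) in .a, f'(x) in .b and .c, and f''(x) in .d,
    with no truncation error (unlike finite differences)."""
    def __init__(self, a, b=0.0, c=0.0, d=0.0):
        self.a, self.b, self.c, self.d = a, b, c, d

    def __add__(self, o):
        if not isinstance(o, HyperDual):
            o = HyperDual(o)
        return HyperDual(self.a + o.a, self.b + o.b, self.c + o.c, self.d + o.d)

    __radd__ = __add__

    def __mul__(self, o):
        if not isinstance(o, HyperDual):
            o = HyperDual(o)
        # Product rule encoded in the nilpotent algebra: terms with
        # e1**2 or e2**2 vanish, the e1*e2 part collects cross terms.
        return HyperDual(
            self.a * o.a,
            self.a * o.b + self.b * o.a,
            self.a * o.c + self.c * o.a,
            self.a * o.d + self.b * o.c + self.c * o.b + self.d * o.a)

    __rmul__ = __mul__

def second_derivative(f, x):
    """Exact f''(x) from a single hyper dual evaluation of f."""
    return f(HyperDual(x, 1.0, 1.0, 0.0)).d

# f(x) = x**3, so f''(2) = 12 exactly
d2 = second_derivative(lambda x: x * x * x, 2.0)
```

Because the derivatives carried in the `e1`, `e2`, and `e1*e2` slots are exact, the Taylor-series matrix expansions in the paper inherit machine-precision sensitivities rather than finite-difference step-size error.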

Collaboration


Top co-authors of Richard V. Field, all at Sandia National Laboratories:

John M. Emery
Joseph E. Bishop
Thomas L. Paez
Daniel J. Segalman
Garth M. Reese
Jay Carroll
John Red-Horse
Clay Fulcher
James W. Foulk